4 min read
sorting algorithm rants

In computer science, there are things we call sorting algorithms. they help us organize data in a particular order, whether ascending or descending. the thing is, they've even become a common form of technical interview question at big tech companies recently. however, many developers don't even know why interviewers use this kind of assessment.

I intend to show some of the sorting algorithms i've used and learnt over time. I might write something some day with examples of when to reach for each one.

1. Bubble Sort

okay, bubble sort is very common. it's literally the classic beginner-friendly sorting algorithm. it repeatedly steps through the list, compares adjacent elements, and swaps them if they're in the wrong order. it's very easy to understand, but i can't recommend using it for large datasets.

Time Complexity: O(n^2) in the worst and average cases (it compares every element with every other element).
Space Complexity: O(1). it sorts the array in place, so it doesn’t use any extra memory.

well, bubble sort is cool to learn, but in most real-world cases, you’d want to avoid it because it’s just too slow for larger data.

let me show an example of the bubble sort algorithm in play:

def bubble_sort(arr):
  n = len(arr)
  for i in range(n):
    # after each pass, the largest remaining element "bubbles" to the end
    for j in range(0, n-i-1):
      # swap adjacent elements if they're out of order (ascending)
      if arr[j] > arr[j+1]:
        arr[j], arr[j+1] = arr[j+1], arr[j]

what i did to achieve bubble sort on an array arr is to:

  • Create an outer loop from i=0 to i=n-1, where n is the array's length.
  • Create an inner loop from j=0 up to (but not including) n-i-1.
  • Swap arr[j] with arr[j+1] if arr[j] is greater than arr[j+1], so the array ends up in ascending order.
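
just to sanity-check it, here's how you might call it on a small made-up list:

nums = [5, 1, 4, 2, 8]
bubble_sort(nums)  # sorts the list in place
print(nums)  # [1, 2, 4, 5, 8]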

2. Quick Sort

about quick sort, it is the opposite of bubble sort in terms of efficiency. it is actually one of the fastest sorting algorithms i have used. it works by picking a “pivot” and then partitioning the array into two parts: elements less than or equal to the pivot go on one side, and elements greater than the pivot go on the other. it then recursively sorts those parts.

Time Complexity: O(n log n) on average, but O(n^2) in the worst case (when the pivot is poorly chosen).
Space Complexity: O(log n) due to recursive calls, though it sorts in place.

anyways, quick sort's average case is quite fast, which is why it's the go-to for so many sorting tasks. on my own end, i have learnt to appreciate the clever use of recursion and partitioning in the quick sort method o!

let me show you an implementation of the quick sort algorithm:

def quick_sort(arr):
  if len(arr) <= 1:
    return arr
  pivot = arr[0]  # did this to set pivot to first element of array
  # partition the rest of the array (excluding the pivot itself)
  left_arr = [el for el in arr[1:] if el <= pivot]
  right_arr = [el for el in arr[1:] if el > pivot]
  return quick_sort(left_arr) + [pivot] + quick_sort(right_arr)

a brief summary of what i did there:

  • Check if the array length is less than or equal to 1, and if so return the array as-is.
  • Pick the first element of the array as the pivot.
  • Build left_arr from the remaining values that are less than or equal to the pivot.
  • Build right_arr from the remaining values that are greater than the pivot.
  • Call quick_sort recursively on both subarrays.
  • Concatenate the sorted left subarray, the pivot (wrapped in a one-element list), and the sorted right subarray, and return the result.
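
since the worst case comes from a bad pivot choice, one common tweak is to pick the pivot at random. here's a minimal sketch of that idea (quick_sort_random is just my own name for this variant, and it uses a three-way partition so duplicates of the pivot don't cause extra recursion):

import random

def quick_sort_random(arr):
  if len(arr) <= 1:
    return arr
  pivot = random.choice(arr)  # a random pivot makes the O(n^2) worst case very unlikely
  left_arr = [el for el in arr if el < pivot]
  mid_arr = [el for el in arr if el == pivot]
  right_arr = [el for el in arr if el > pivot]
  return quick_sort_random(left_arr) + mid_arr + quick_sort_random(right_arr)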

3. Merge Sort

merge sort is another efficient divide-and-conquer algorithm. it works by dividing the array into smaller subarrays, sorting those, and then merging them back together.

Time Complexity: O(n log n) in all cases (worst, average, and best).
Space Complexity: O(n), because it requires additional space to hold the divided parts of the array.

even though merge sort guarantees efficiency, the extra space requirement can be a drawback in memory-constrained environments.

let me show you an implementation of a merge sort:

def merge(a1, a2):
  # merge two already-sorted lists into one sorted list
  res = []
  i = j = 0
  # take the smaller front element from either list until one runs out
  while i < len(a1) and j < len(a2):
    if a1[i] < a2[j]:
      res.append(a1[i])
      i += 1
    else:
      res.append(a2[j])
      j += 1
  # copy over whatever is left in a1
  while i < len(a1):
    res.append(a1[i])
    i += 1

  # copy over whatever is left in a2
  while j < len(a2):
    res.append(a2[j])
    j += 1

  return res

def merge_sort(arr):
  # an array with fewer than two elements is already sorted
  if len(arr) < 2:
    return arr

  # split the array into two halves
  mid = len(arr) // 2
  left_arr = arr[:mid]
  right_arr = arr[mid:]
  # sort each half recursively, then merge the sorted halves
  left_res = merge_sort(left_arr)
  right_res = merge_sort(right_arr)
  return merge(left_res, right_res)

that said, what i did in the algorithm on an array arr is to:

  • check if the length of arr is less than two, and if so return arr; an array that short is already sorted.
  • split the array into two halves called left_arr and right_arr.
  • recursively split left_arr and right_arr until their lengths are less than two.
  • merge all the sub-arrays formed in steps 2 and 3 back together with merge.
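
and again, a quick usage check on a made-up list:

nums = [9, 3, 7, 1, 4]
print(merge_sort(nums))  # [1, 3, 4, 7, 9] (returns a new list; nums itself is untouched)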

4. Insertion Sort

as for insertion sort, it builds the sorted list one item at a time. it picks an element and inserts it into its correct position in the already-sorted part of the array. it's a good fit for small datasets and nearly sorted arrays.

Time Complexity: O(n^2) in the worst case, but O(n) if the array is nearly sorted.
Space Complexity: O(1), since it sorts in place.

i like insertion sort ONLY because it's intuitive and works well in cases where the data is almost sorted (and in practice, a surprising amount of real-world data arrives nearly sorted anyway).
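
to round things off, here's a minimal sketch of insertion sort, in the same in-place, ascending style as the bubble sort above:

def insertion_sort(arr):
  # grow a sorted region at the front of the array, one element at a time
  for i in range(1, len(arr)):
    key = arr[i]
    j = i - 1
    # shift elements of the sorted region that are greater than key one slot right
    while j >= 0 and arr[j] > key:
      arr[j+1] = arr[j]
      j -= 1
    arr[j+1] = key  # drop key into its correct position

notice the inner while loop barely runs when the array is already nearly sorted, which is exactly why the best case is O(n).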