Sorting algorithms
By Afshine Amidi and Shervine Amidi
General concepts
In this part, arrays of elements are visually represented as histograms. The height of each bar represents the value of the associated element in the array.
Sorting algorithm
A sorting algorithm takes an unsorted array A of n elements as input and returns a sorted array B as output, where B is a permutation of A such that B[0] ≤ B[1] ≤ ... ≤ B[n−1].
Stability
A sorting algorithm is said to be stable if the order of tied elements is guaranteed to remain the same after sorting the array.
Examples of stable sorting algorithms include merge sort and insertion sort.
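As a quick illustration of stability, Python's built-in sorted (a stable merge-sort variant) preserves the original order of tied elements:

```python
# (value, label) pairs, sorted by value only: the labels let us track ties
pairs = [(2, "a"), (1, "b"), (2, "c")]

# sorted() is stable, so (2, "a") stays before (2, "c") in the output
result = sorted(pairs, key=lambda p: p[0])
# → [(1, "b"), (2, "a"), (2, "c")]
```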
The next sections will go through each sorting algorithm in more detail.
Basic sort
Bubble sort
Bubble sort is a stable sorting algorithm that has a time complexity of O(n²) and a space complexity of O(1).
Intuition Compare consecutive elements and swap them if they are not in the correct order.
Algorithm
Compute step: Starting from the beginning of the array, compare the element at position i with the element at position i + 1.
Case A[i] ≤ A[i+1]: They are already in the correct order. There is nothing to do.
Case A[i] > A[i+1]: They are not in the correct order. Swap them.
Repeat this process until reaching the end of the array.
Repeat step: Repeat the compute step until no swap can be done.
We note that:
At the end of the k-th pass, the last k elements of the array are guaranteed to be in their final positions.
The algorithm finishes in at most n passes. In particular:
If the input array is already sorted, then the algorithm finishes in 1 pass.
If the input array is reverse sorted, then the algorithm finishes in n passes.
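The steps above can be sketched in Python as follows (the function name and the early-exit flag are implementation choices):

```python
def bubble_sort(arr):
    """Sort arr in place by repeatedly swapping adjacent out-of-order elements."""
    n = len(arr)
    swapped = True
    while swapped:  # repeat step: stop once a full pass makes no swap
        swapped = False
        for i in range(n - 1):  # compute step: compare consecutive elements
            if arr[i] > arr[i + 1]:
                arr[i], arr[i + 1] = arr[i + 1], arr[i]
                swapped = True
        n -= 1  # the largest remaining element is now in its final position
    return arr
```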
Insertion sort
Insertion sort is a stable sorting algorithm that has a time complexity of O(n²) and a space complexity of O(1).
Intuition Incrementally build a sorted subarray by successively inserting the next unsorted element at its correct position.
Algorithm
Initialization: The first element of the unsorted array can be interpreted as a subarray of one element that is already sorted.
Compute step: Starting from position i = 1, we want to insert element A[i] into the current sorted subarray of size i. In order to do that, we compare it with its preceding element:
As long as the element is smaller than its predecessor, we iteratively swap the two.
This process ends with either the predecessor being smaller than or equal to the element, or the element being at position 0 of the array.
At the end of this step, the sorted subarray is of size i + 1.
Repeat step: Repeat the compute step until the sorted subarray reaches a size of n.
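A minimal Python sketch of this procedure (the function name is illustrative):

```python
def insertion_sort(arr):
    """Sort arr in place by inserting each element into the sorted prefix."""
    for i in range(1, len(arr)):
        j = i
        # swap the element backwards while it is smaller than its predecessor
        while j > 0 and arr[j] < arr[j - 1]:
            arr[j], arr[j - 1] = arr[j - 1], arr[j]
            j -= 1
    return arr
```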
Selection sort
Selection sort is an unstable sorting algorithm that has a time complexity of O(n²) and a space complexity of O(1).
Intuition Incrementally build a sorted subarray by successively inserting the minimum value among the remaining elements.
Algorithm
Initialization:
Find the minimum value of the unsorted array.
Swap it with the element at the beginning of the array. The first element can now be interpreted as a sorted subarray with one element.
Compute step: Starting from i = 1, we want to insert a new element into the sorted subarray of size i.
Find the minimum value m among the remaining elements at positions i, ..., n − 1 of the array. By construction, m is greater than or equal to all elements of the current sorted subarray.
Swap m with the element at position i of the array.
At the end of this step, the sorted subarray is of size i + 1.
Repeat step: Repeat the compute step until the sorted subarray reaches a size of n.
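A possible Python sketch of these steps:

```python
def selection_sort(arr):
    """Sort arr in place by repeatedly selecting the minimum remaining element."""
    n = len(arr)
    for i in range(n):
        # find the index of the minimum among the remaining elements
        m = min(range(i, n), key=lambda j: arr[j])
        # swap it to the front of the unsorted region
        arr[i], arr[m] = arr[m], arr[i]
    return arr
```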
Insertion and selection sort are very similar in that they build the sorted array from scratch and add elements one at a time.
Cycle sort
Cycle sort is an unstable sorting algorithm that has a time complexity of O(n²) and a space complexity of O(1).
Intuition Determine the index of the final position of each element in the sorted array.
Algorithm
Compute step: Starting from i = 0, we want to find the final position of element v = A[i] along with those of the other elements impacted by this move.
To do so, we count the number c of elements that are smaller than v:
The final position of v is at index c, since we know that there are c elements smaller than v in the final sorted array.
Case c = i: This is a self-cycle, meaning that the element is already at its correct position. There is nothing to do.
Case c ≠ i: This is the start of a cycle of moves. Place v at position c (or to the right of any duplicates, if applicable) and keep the replaced value in a temporary variable.
Keep moving elements using this logic until getting back to position i.
Repeat step: Repeat the compute step until reaching the end of the array. We can see cycles being formed from the way elements are moved.
This algorithm sorts the array with a minimal number of writes, since it only moves elements to their final positions.
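The cycle-following logic above can be sketched in Python as follows (variable names are illustrative):

```python
def cycle_sort(arr):
    """Sort arr in place, writing each element directly to its final position."""
    n = len(arr)
    for start in range(n - 1):
        v = arr[start]
        # final position = number of elements smaller than v
        pos = start + sum(1 for j in range(start + 1, n) if arr[j] < v)
        if pos == start:
            continue  # self-cycle: the element is already in place
        while arr[pos] == v:  # place v to the right of any duplicates
            pos += 1
        arr[pos], v = v, arr[pos]  # write v, keep the replaced value
        while pos != start:  # follow the cycle until we return to start
            pos = start + sum(1 for j in range(start + 1, n) if arr[j] < v)
            while arr[pos] == v:
                pos += 1
            arr[pos], v = v, arr[pos]
    return arr
```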
Efficient sort
Merge sort
Merge sort is a stable sorting algorithm that has a time complexity of O(n log(n)) and a space complexity of O(n).
Intuition Build a sorted array by merging two already-sorted arrays.
Algorithm
Divide step: Divide the array into as many subarrays as there are elements. Each resulting subarray has only one element and can be considered as sorted.
Conquer step: For each pair of sorted subarrays, build a sorted array by merging them using the two-pointer technique.
Repeat the process until all subarrays are merged into one final sorted array.
This algorithm is usually implemented recursively.
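A recursive Python sketch of the divide and conquer steps:

```python
def merge_sort(arr):
    """Return a sorted copy of arr by recursively merging sorted halves."""
    if len(arr) <= 1:  # a subarray of one element is already sorted
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # conquer step: merge the two sorted halves with the two-pointer technique
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # '<=' keeps the sort stable
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```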
Heap sort
Heap sort is an unstable sorting algorithm that has a time complexity of O(n log(n)) and a space complexity of O(1).
Intuition Incrementally build a sorted subarray by successively retrieving the maximum of remaining elements using a max-heap.
Algorithm
Initialization: Build a max-heap from the unsorted array in O(n) time. This is done by recursively swapping each parent with its child of highest value.
We can use the same array to represent the max-heap.
Compute step: Starting from i = n − 1, incrementally build a sorted subarray of size n − 1 − i that is placed at the end of the array.
Pop an element from the max-heap and place it at position i of the array.
By construction, the popped element is greater than or equal to the elements of the heap and smaller than or equal to the previously popped elements.
Heapify the resulting max-heap of size i in O(log(i)) time.
Repeat step: Repeat the compute step until the sorted subarray reaches a size of n.
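A possible in-place Python sketch, using the same array to represent the max-heap:

```python
def heap_sort(arr):
    """Sort arr in place by building a max-heap and popping its maximum."""
    n = len(arr)

    def sift_down(root, size):
        # swap the root with its largest child until the heap property holds
        while True:
            largest = root
            for child in (2 * root + 1, 2 * root + 2):
                if child < size and arr[child] > arr[largest]:
                    largest = child
            if largest == root:
                return
            arr[root], arr[largest] = arr[largest], arr[root]
            root = largest

    # initialization: build a max-heap in O(n) time
    for i in range(n // 2 - 1, -1, -1):
        sift_down(i, n)
    # compute step: pop the maximum to the end, then heapify the remainder
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]
        sift_down(0, end)
    return arr
```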
Quick sort
Quick sort is an unstable sorting algorithm that has a time complexity of O(n²) and a space complexity of O(n) in the worst case.
Intuition Recursively choose an element of the array to be the pivot, and put:
Smaller elements to the left of the pivot.
Bigger elements to the right of the pivot.
Algorithm
Compute step:
Pivot isolation step: Choose an element of the array to be the pivot and swap it with the last element of the array.
Partition step: Starting from indices i = 0 and j = 0, we make sure that elements that are smaller than or equal to the pivot p are sent to the left of the array:
Case A[j] ≤ p: Swap the element at index j with the element at index i. Increment i by 1.
Case A[j] > p: There is nothing to do.
This process is successively repeated for all indices j until the second-to-last position of the array.
Pivot inclusion step: The final position i of the left pointer is such that every element to its left is smaller than or equal to the pivot. We swap the pivot with the element at position i.
At the end of this step, the pivot is at its correct and final position.
Recursion step: The compute step is run recursively on the resulting subarrays on the left and right of the pivot. They are then merged back together to form the final sorted array.
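The compute and recursion steps can be sketched in Python as follows, here using a random pivot (one of several valid choices):

```python
import random

def quick_sort(arr, lo=0, hi=None):
    """Sort arr[lo..hi] in place by recursively partitioning around a pivot."""
    if hi is None:
        hi = len(arr) - 1
    if lo >= hi:
        return arr
    # pivot isolation step: pick a pivot and swap it with the last element
    p = random.randint(lo, hi)
    arr[p], arr[hi] = arr[hi], arr[p]
    pivot = arr[hi]
    # partition step: send elements <= pivot to the left of index i
    i = lo
    for j in range(lo, hi):
        if arr[j] <= pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    # pivot inclusion step: place the pivot at its final position i
    arr[i], arr[hi] = arr[hi], arr[i]
    # recursion step: sort the subarrays on each side of the pivot
    quick_sort(arr, lo, i - 1)
    quick_sort(arr, i + 1, hi)
    return arr
```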
We note that the runtime of the algorithm is sensitive to the choice of the pivot and the patterns found in the input array. In general, a time complexity of O(n log(n)) and a space complexity of O(log(n)) can be obtained when the pivot is a good approximation of the median of the array.
The two most common methods to choose the pivot are:
Picking an element at random.
Taking the median of a small sample of elements, such as the first, middle, and last ones (median-of-three).
Special sort
Counting sort
Counting sort is a stable sorting algorithm that is efficient for integer arrays with values within a small range [0, k]. It has a time complexity of O(n + k) and a space complexity of O(n + k).
Intuition Determine the final position of each element by counting the number of times its associated value appears in the array.
Algorithm
Number of occurrences: This step takes O(n + k) time and O(k) space.
Initialization: Initialize a count array C of length k + 1.
Count step: Scan array A and increment the counter of each encountered value by 1.
Each count C[v] represents the number of times the value v appears in array A.
Cumulative step: Compute the cumulative sum of array C and shift each resulting element to the right. This operation is done in-place and takes O(k) time and O(1) space.
For each value v present in A, the associated element C[v] in the resulting array indicates its starting index in the final sorted array B.
Construction of the sorted array: This step takes O(n) time and O(n) space.
Initialization: Initialize an array B of length n that will contain the final sorted array.
Main step: For each value v of the unsorted array A:
Write v to index C[v] of B.
Increment C[v] by 1 so that any later duplicates can be handled.
At the end of this process, we obtain the final sorted array B.
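These steps can be sketched in Python, using the same array names as above:

```python
def counting_sort(arr, k):
    """Sort an array of integers in the range [0, k] by counting occurrences."""
    # count step: counts[v] = number of times v appears in arr
    counts = [0] * (k + 1)
    for v in arr:
        counts[v] += 1
    # cumulative step, shifted right: counts[v] = starting index of value v
    total = 0
    for v in range(k + 1):
        counts[v], total = total, total + counts[v]
    # construction of the sorted array
    out = [0] * len(arr)
    for v in arr:
        out[counts[v]] = v
        counts[v] += 1  # handle later duplicates of v
    return out
```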
Radix sort
Radix sort is a stable sorting algorithm that is well suited for integer arrays where elements are written with a limited number of digits d, each digit being in the range [0, b − 1] for some base b. This algorithm has a time complexity of O(d(n + b)) and a space complexity of O(n + b).
Intuition Successively sort elements based on their digits using counting sort.
Algorithm
Compute step: Perform counting sort based on the rightmost digit. This step takes O(n + b) time and space.
Repeat step: Repeat the compute step on the remaining digits until reaching the leftmost digit.
At the end of this process, the array is sorted.
The trick of this algorithm lies in the stability property of counting sort: the relative ordering obtained on a given (less significant) digit is preserved, which helps break ties for later (more significant) digits.
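A possible Python sketch for non-negative integers, with the base b as a parameter:

```python
def radix_sort(arr, base=10):
    """Sort non-negative integers digit by digit with a stable counting sort."""
    if not arr:
        return arr
    exp = 1  # current digit weight: 1, base, base**2, ...
    while max(arr) // exp > 0:
        # stable counting sort on the digit (v // exp) % base
        counts = [0] * base
        for v in arr:
            counts[(v // exp) % base] += 1
        # turn counts into starting indices
        total = 0
        for d in range(base):
            counts[d], total = total, total + counts[d]
        out = [0] * len(arr)
        for v in arr:  # scanning in order keeps each pass stable
            d = (v // exp) % base
            out[counts[d]] = v
            counts[d] += 1
        arr = out
        exp *= base
    return arr
```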