Sorting Algorithm Cheat Sheet

For coding interviews or computer science classes

A quick reference of the big O costs and core properties of each sorting algorithm.

Selection Sort
worst O(n²)
best O(n²)
average O(n²)
space O(1)
The first two items in [1, 2, 7, 8, 4, 3, 6] are sorted. 3 is the smallest item in the unsorted portion ([7, 8, 4, 3, 6]). Swapping 3 with 7, we have [1, 2, 3, 8, 4, 7, 6] and the first three items are sorted.

Selection sort works by repeatedly "selecting" the next-smallest element from the unsorted array and moving it to the front.
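
Here's a minimal sketch of that idea in Python (the function name and in-place style are just for illustration, not the full reference implementation):

    def selection_sort(the_list):
        # Everything to the left of index i is already sorted
        for i in range(len(the_list)):
            # Find the smallest item in the unsorted portion
            smallest_index = i
            for j in range(i + 1, len(the_list)):
                if the_list[j] < the_list[smallest_index]:
                    smallest_index = j
            # Swap it into place at the end of the sorted portion
            the_list[i], the_list[smallest_index] = the_list[smallest_index], the_list[i]
        return the_list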

Full selection sort reference

Insertion Sort
worst O(n²)
best O(n)
average O(n²)
space O(1)
The list [2, 4, 7, 8, 1, 3, 6] is partially sorted. (1) is compared with its left neighbor (8) and switches places to create [2, 4, 7, 1, 8, 3, 6]. Then (1) is compared with (7).

Insertion sort works by inserting elements from an unsorted array into a sorted subsection of the array, one item at a time.
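
A rough Python sketch of the same idea (again, an illustrative version rather than the full reference implementation):

    def insertion_sort(the_list):
        # Items to the left of index i are already sorted
        for i in range(1, len(the_list)):
            item = the_list[i]
            j = i - 1
            # Shift larger sorted items one slot right to open a gap
            while j >= 0 and the_list[j] > item:
                the_list[j + 1] = the_list[j]
                j -= 1
            # Drop the item into the gap
            the_list[j + 1] = item
        return the_list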

Full insertion sort reference

Merge Sort
worst O(n lg n)
best O(n lg n)
average O(n lg n)
space O(n)
Two small sorted lists ([2, 4, 7, 8] and [1, 3, 6, 9]) are merged into a large sorted list ([1, 2, 3, 4, 6, 7, 8, 9]).

Merge sort works by splitting the input in half, recursively sorting each half, and then merging the sorted halves back together.
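
A rough Python sketch. It uses slicing for clarity, which is where the O(n) extra space shows up:

    def merge_sort(the_list):
        # Base case: zero or one item is already sorted
        if len(the_list) <= 1:
            return the_list

        # Recursively sort each half
        mid = len(the_list) // 2
        left = merge_sort(the_list[:mid])
        right = merge_sort(the_list[mid:])

        # Merge the two sorted halves back together
        merged = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:  # <= keeps the sort stable
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged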

Full merge sort reference

Quicksort
worst O(n²)
best O(n lg n)
average O(n lg n)
space O(lg n)
An unsorted list [2, 7, 3, 9, 1, 6, 8, 4] is rearranged around a pivot element, 4. The resulting list is [2, 1, 3, 4, 7, 6, 8, 9]. [2, 1, 3] are less than the pivot and appear to the left; [7, 6, 8, 9] are greater than the pivot and appear to the right.

Quicksort works by recursively dividing the input into two smaller arrays around a pivot item: one half has items smaller than the pivot, the other has larger items.
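
A quick Python sketch of the recursion. Real implementations usually partition in place and pick the pivot more carefully (randomly, for example), but the structure is the same:

    def quicksort(the_list):
        # Base case: zero or one item is already sorted
        if len(the_list) <= 1:
            return the_list

        # Partition the remaining items around a pivot (here: the last item)
        pivot = the_list[-1]
        smaller = [item for item in the_list[:-1] if item <= pivot]
        larger = [item for item in the_list[:-1] if item > pivot]

        # Recursively sort each side and stitch the pieces together
        return quicksort(smaller) + [pivot] + quicksort(larger)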

Full quicksort reference

Heapsort
worst O(n lg n)
best O(n lg n)
average O(n lg n)
space O(1)
An unsorted input [2, 4, 7, 8, 1, 3, 6] is transformed into a binary max-heap (8; 4, 7; 2, 1, 3, 6) which is used to create the sorted output [1, 2, 3, 4, 6, 7, 8].

Heapsort is similar to selection sort—we're repeatedly choosing the largest item and moving it to the end of our array. But we use a heap to get the largest item more quickly.
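
Here's a minimal in-place Python sketch using a hand-rolled max-heap (one common way to write it, not the full reference implementation):

    def heapsort(the_list):
        def sift_down(heap_size, i):
            # Push the item at index i down until the max-heap property holds
            while True:
                largest = i
                left, right = 2 * i + 1, 2 * i + 2
                if left < heap_size and the_list[left] > the_list[largest]:
                    largest = left
                if right < heap_size and the_list[right] > the_list[largest]:
                    largest = right
                if largest == i:
                    return
                the_list[i], the_list[largest] = the_list[largest], the_list[i]
                i = largest

        # Build a max-heap in place
        for i in range(len(the_list) // 2 - 1, -1, -1):
            sift_down(len(the_list), i)

        # Repeatedly swap the largest item (the root) to the end and shrink the heap
        for end in range(len(the_list) - 1, 0, -1):
            the_list[0], the_list[end] = the_list[end], the_list[0]
            sift_down(end, 0)
        return the_list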

Full heapsort reference

Counting Sort
worst O(n + k)
best O(n + k)
average O(n + k)
space O(n + k)
(k is the range of possible values in the input)
The input [1, 3, 7, 8, 1, 1, 3] has three (1)'s, two (3)'s, and a (7) and (8). Encoding that in a list of counts, we have [0, 3, 0, 2, 0, 0, 0, 1, 1, 0].

Counting sort works by iterating through the input, counting the number of times each item occurs, and using those counts to compute each item's index in the final, sorted array.
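
A minimal Python sketch, assuming the items are non-negative integers and we know the largest value:

    def counting_sort(the_list, max_value):
        # Count how many times each value appears
        counts = [0] * (max_value + 1)
        for item in the_list:
            counts[item] += 1

        # Walk the counts in order to rebuild the sorted list
        sorted_list = []
        for value, count in enumerate(counts):
            sorted_list.extend([value] * count)
        return sorted_list

For the example above, counting_sort([1, 3, 7, 8, 1, 1, 3], 8) returns [1, 1, 1, 3, 3, 7, 8].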

Full counting sort reference

Radix Sort
worst O(n)
best O(n)
average O(n)
space O(n)
Sorting [23, 31, 81, 27, 56, 37, 51] on the ones place, we have [31, 81, 51, 23, 56, 27, 37].

Radix sort works by sorting the input numbers one digit at a time.
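
A minimal Python sketch of least-significant-digit radix sort, assuming non-negative integers. The buckets preserve the order items arrive in, which gives us the stable per-digit sort the algorithm depends on:

    def radix_sort(the_list, base=10):
        if not the_list:
            return the_list

        max_value = max(the_list)
        place = 1
        while place <= max_value:
            # Stable bucket sort on the current digit
            buckets = [[] for _ in range(base)]
            for item in the_list:
                buckets[(item // place) % base].append(item)
            the_list = [item for bucket in buckets for item in bucket]
            place *= base
        return the_list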

Full radix sort reference

Which Sorting Algorithm Should I Use?

It depends. Each algorithm comes with its own set of pros and cons.

  • Quicksort is a good default choice. It tends to be fast in practice, and with some small tweaks its dreaded O(n²) worst-case time complexity becomes very unlikely. A tried and true favorite.
  • Heapsort is a good choice if you can't tolerate a worst-case time complexity of O(n²) or need low space costs. The Linux kernel uses heapsort instead of quicksort for both of those reasons.
  • Merge sort is a good choice if you want a stable sorting algorithm. Also, merge sort can easily be extended to handle data sets that can't fit in RAM, where the bottleneck cost is reading and writing the input on disk, not comparing and swapping individual items.
  • Radix sort looks fast, with its O(n) worst-case time complexity. But, if you're using it to sort binary numbers, then there's a hidden constant factor that's usually 32 or 64 (depending on how many bits your numbers are). That's often way bigger than lg(n), meaning radix sort tends to be slow in practice.
  • Counting sort is a good choice in scenarios where there's only a small number of distinct values to be sorted. This is pretty rare in practice, and counting sort doesn't get much use.

Each sorting algorithm has tradeoffs. You can't have it all.

So you have to know what's important in the problem you're working on. How large is your input? How many distinct values are in your input? How much space overhead is acceptable? Can you afford worst-case runtime?

Once you know what's important, you can pick the sorting algorithm that does it best. Being able to compare different algorithms and weigh their pros and cons is the mark of a strong computer programmer and a definite plus when interviewing.
