Big-O Cheat Sheet for Some Data Structures and Algorithms

Know Thy Complexities! This page covers the space and time Big-O complexities of common algorithms used in Computer Science. When preparing for technical interviews in the past, I found myself spending hours crawling the internet putting together the best, average, and worst case complexities for common data structures and algorithms.

[Big-O complexity chart: number of operations versus number of elements, ranking growth rates from excellent to horrible: O(1), O(log n), O(n), O(n log n), O(n^2), O(2^n), O(n!).]
Common Data Structure Operations
Data Structure | Access (avg) | Search (avg) | Insertion (avg) | Deletion (avg) | Access (worst) | Search (worst) | Insertion (worst) | Deletion (worst) | Space (worst)
---|---|---|---|---|---|---|---|---|---
Array | Θ(1) | Θ(n) | Θ(n) | Θ(n) | O(1) | O(n) | O(n) | O(n) | O(n) |
Stack | Θ(n) | Θ(n) | Θ(1) | Θ(1) | O(n) | O(n) | O(1) | O(1) | O(n) |
Queue | Θ(n) | Θ(n) | Θ(1) | Θ(1) | O(n) | O(n) | O(1) | O(1) | O(n) |
Singly-Linked List | Θ(n) | Θ(n) | Θ(1) | Θ(1) | O(n) | O(n) | O(1) | O(1) | O(n) |
Doubly-Linked List | Θ(n) | Θ(n) | Θ(1) | Θ(1) | O(n) | O(n) | O(1) | O(1) | O(n) |
Skip List | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | O(n) | O(n) | O(n) | O(n) | O(n log(n)) |
Hash Table | N/A | Θ(1) | Θ(1) | Θ(1) | N/A | O(n) | O(n) | O(n) | O(n) |
Binary Search Tree | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | O(n) | O(n) | O(n) | O(n) | O(n) |
Cartesian Tree | N/A | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | N/A | O(n) | O(n) | O(n) | O(n) |
B-Tree | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
Red-Black Tree | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
Splay Tree | N/A | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | N/A | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
AVL Tree | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
KD Tree | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | Θ(log(n)) | O(n) | O(n) | O(n) | O(n) | O(n) |
Array Sorting Algorithms
Algorithm | Best | Average | Worst | Space (worst)
---|---|---|---|---
Quicksort | Ω(n log(n)) | Θ(n log(n)) | O(n^2) | O(log(n)) |
Mergesort | Ω(n log(n)) | Θ(n log(n)) | O(n log(n)) | O(n) |
Timsort | Ω(n) | Θ(n log(n)) | O(n log(n)) | O(n) |
Heapsort | Ω(n log(n)) | Θ(n log(n)) | O(n log(n)) | O(1) |
Bubble Sort | Ω(n) | Θ(n^2) | O(n^2) | O(1) |
Insertion Sort | Ω(n) | Θ(n^2) | O(n^2) | O(1) |
Selection Sort | Ω(n^2) | Θ(n^2) | O(n^2) | O(1) |
Tree Sort | Ω(n log(n)) | Θ(n log(n)) | O(n^2) | O(n) |
Shell Sort | Ω(n log(n)) | Θ(n(log(n))^2) | O(n(log(n))^2) | O(1) |
Bucket Sort | Ω(n+k) | Θ(n+k) | O(n^2) | O(n) |
Radix Sort | Ω(nk) | Θ(nk) | O(nk) | O(n+k) |
Counting Sort | Ω(n+k) | Θ(n+k) | O(n+k) | O(k) |
Cubesort | Ω(n) | Θ(n log(n)) | O(n log(n)) | O(n) |
We can determine complexity based on the types of statements a program uses. The following examples are in Java, but they can be easily followed if you have basic programming experience. They use Big O notation; we will explain later why Big O notation is the one most commonly used:
Constant time: O(1)
The following operations take constant time:
- Assigning a value to some variable
- Assigning a value to element i of an array
- Determining if a binary number is even or odd.
- Retrieving element i from an array
- Retrieving a value from a hash table(dictionary) with a key
They take constant time because they are 'simple' statements; in this case we say the statement's time is O(1).
Constant time does not depend on the input size. Declaring a variable, pushing an element onto a stack, or inserting an element into an unsorted linked list all take the same amount of time no matter how much data the program handles, as the sketch below illustrates.
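A minimal, self-contained Java sketch of these constant-time operations (the variable, array, and map names are made up for illustration):

```java
import java.util.HashMap;
import java.util.Map;

public class ConstantTimeExamples {
    public static void main(String[] args) {
        int[] numbers = {4, 8, 15, 16, 23, 42};

        int x = 10;                      // assigning a value to a variable: O(1)
        numbers[2] = 99;                 // assigning a value to element i of an array: O(1)
        boolean isEven = (x % 2 == 0);   // determining if a number is even or odd: O(1)
        int third = numbers[3];          // retrieving element i from an array: O(1)

        Map<String, Integer> ages = new HashMap<>();
        ages.put("alice", 30);           // inserting into a hash table: O(1) on average
        int age = ages.get("alice");     // retrieving a value by key: O(1) on average

        System.out.println(x + " " + isEven + " " + third + " " + age);
    }
}
```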
Linear time: O(n)
The next loop executes N times. If we assume the statement inside the loop is O(1), then the total time for the loop is N * O(1), which equals O(N), also known as linear time:
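A minimal Java sketch of such a loop (summing an array is just a placeholder for any O(1) work done per iteration):

```java
// The loop body executes n times (n = items.length), and each iteration does
// O(1) work, so the total running time is n * O(1) = O(n).
static long sum(int[] items) {
    long total = 0;
    for (int i = 0; i < items.length; i++) {
        total += items[i];   // constant-time work per iteration
    }
    return total;
}
```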
Plotted against the input size, the running time increases linearly in relation to the number of elements n.
More examples of linear time are:
- Finding an item in an unsorted collection or an unbalanced tree (worst case)
- Sorting an array via bubble sort
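For the first item in the list above, a plain linear search illustrates the worst case (a generic sketch; the method name indexOf is just a placeholder):

```java
// Returns the index of target in an unsorted array, or -1 if it is absent.
// Worst case (target at the end or missing): n comparisons, i.e. O(n).
static int indexOf(int[] items, int target) {
    for (int i = 0; i < items.length; i++) {
        if (items[i] == target) {
            return i;
        }
    }
    return -1;
}
```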
Quadratic time: O(n^2)
In this example the outer loop executes N times, and for each of those iterations the inner loop also executes N times. Therefore, the statement in the nested loop executes a total of N * N times. Here the complexity is O(N * N), which equals O(N^2). This should be avoided where possible, because the running time grows quadratically with the input size.
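A minimal Java sketch of the nested loop just described (printAllPairs is a made-up name, and printing the pair stands in for any O(1) statement):

```java
// The inner statement runs n * n times, so the method is O(n^2).
static void printAllPairs(int n) {
    for (int i = 0; i < n; i++) {              // outer loop: n iterations
        for (int j = 0; j < n; j++) {          // inner loop: n iterations per outer iteration
            System.out.println(i + ", " + j);  // executes n * n times in total
        }
    }
}
```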
Some extra examples of quadratic time are:
- Performing a linear search over an n × n matrix
- The worst case of quicksort, which is highly improbable in practice, as we will see in the Algorithms section of this website
- Insertion sort
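For the last item, a standard insertion sort sketch shows where the quadratic behavior comes from (a generic textbook version, assumed here for illustration):

```java
// In-place insertion sort: each element is compared against the ones before it.
// Best case (already sorted): O(n). Average and worst case: O(n^2).
static void insertionSort(int[] a) {
    for (int i = 1; i < a.length; i++) {
        int key = a[i];
        int j = i - 1;
        // Shift larger elements one position to the right to make room for key.
        while (j >= 0 && a[j] > key) {
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = key;
    }
}
```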
Algorithms that scale in quadratic time are best avoided. Assuming on the order of a billion simple operations per second, once the input size reaches n = 100,000 elements a run can take about 10 seconds to complete. For an input size of n = 1,000,000 it can take roughly 16 minutes, and for n = 10,000,000 it could take about 1.1 days...you get the idea.
Logarithmic time: O(log n)
Logarithmic running time grows very slowly as N grows. An easy way to check whether a loop is O(log n) is to see if the counting variable (in this case i) doubles instead of incrementing by 1. In the following example i doesn't increase by 1 (i++); it doubles on each pass, so the loop body runs about log(n) times:
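A minimal Java sketch of such a doubling loop (the method name countDoublings is made up for illustration):

```java
// i doubles each iteration, so the loop body runs about log2(n) times: O(log n).
static int countDoublings(int n) {
    int steps = 0;
    for (int i = 1; i < n; i *= 2) {   // i doubles instead of incrementing by 1
        steps++;
    }
    return steps;
}
```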
Some common examples of logarithmic time are:
- Binary search
- Insert or delete an element into a heap
Don't feel intimidated by logarithms. Just remember that a logarithm is the inverse of exponentiation: logarithms appear whenever things are repeatedly halved or doubled.
Logarithmic algorithms have excellent performance even on very large data sets.
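Binary search is the classic example: every comparison halves the remaining range, so searching a sorted array of n elements takes at most about log2(n) steps. A standard iterative sketch (assuming the input array is already sorted in ascending order):

```java
// Iterative binary search on a sorted array: O(log n) comparisons.
static int binarySearch(int[] sorted, int target) {
    int lo = 0, hi = sorted.length - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   // avoids overflow for large indices
        if (sorted[mid] == target) {
            return mid;
        } else if (sorted[mid] < target) {
            lo = mid + 1;               // discard the lower half
        } else {
            hi = mid - 1;               // discard the upper half
        }
    }
    return -1;                          // not found
}
```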
Linearithmic time: O(n log n)
Linearithmic algorithms are capable of good performance with very large data sets. Some examples of linearithmic algorithms are:
- Heapsort
- Merge sort
- Quicksort
We'll see custom implementations of merge sort and quicksort in the Algorithms section, but for now the following example helps illustrate the point:
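A generic top-down merge sort sketch (a standard textbook version with made-up names, not necessarily the article's own listing): the array is split in half about log2(n) times, and each level of splitting does O(n) merge work, giving O(n log n) overall.

```java
import java.util.Arrays;

public class MergeSortExample {

    // Sorts the array in O(n log n): about log2(n) levels of splitting,
    // with O(n) merge work per level.
    static void mergeSort(int[] a) {
        if (a.length < 2) {
            return;                       // base case: 0 or 1 element is already sorted
        }
        int mid = a.length / 2;
        int[] left = Arrays.copyOfRange(a, 0, mid);
        int[] right = Arrays.copyOfRange(a, mid, a.length);
        mergeSort(left);
        mergeSort(right);
        merge(a, left, right);
    }

    // Merges two sorted halves back into a in linear time.
    static void merge(int[] a, int[] left, int[] right) {
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length) {
            a[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        }
        while (i < left.length)  a[k++] = left[i++];
        while (j < right.length) a[k++] = right[j++];
    }

    public static void main(String[] args) {
        int[] data = {5, 1, 4, 2, 3};
        mergeSort(data);
        System.out.println(Arrays.toString(data));   // prints [1, 2, 3, 4, 5]
    }
}
```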
Conclusion
As you might have noticed, Big O notation is commonly used to describe the worst case. When you loop through an array to find out whether it contains item X, the worst case is that the item is at the end or not present at all, which makes you iterate through all n items: O(n). The best case is that the item is at the very beginning, so the search finishes in constant time, but this is uncommon and becomes less likely as the list grows. In the next section we'll look deeper into why Big O focuses on worst-case analysis.
Big O Notation Examples
A comparison of the first four complexities should make it clear why, for large data sets, we should avoid quadratic time and strive for logarithmic or linearithmic time:
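As a rough, back-of-the-envelope comparison (assuming one basic operation per step and logarithms in base 2), the approximate number of operations for each complexity class is:

n | O(1) | O(log n) | O(n) | O(n log n) | O(n^2)
---|---|---|---|---|---
10 | 1 | ~3 | 10 | ~33 | 100
1,000 | 1 | ~10 | 1,000 | ~10,000 | 1,000,000
1,000,000 | 1 | ~20 | 1,000,000 | ~20,000,000 | 10^12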