Time complexity: O(2^n) with n the number of items
Space complexity: O(n)
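A minimal brute-force sketch, assuming these bounds refer to the 0/1 knapsack (consistent with the O(n * c) DP lines below); names are illustrative:

// Brute force 0/1 knapsack: try both including and excluding each item.
// Two branches per item over n items => O(2^n) time; recursion depth n => O(n) space.
static int knapsack(int[] weights, int[] values, int c, int i) {
    if (i == weights.length || c == 0) return 0;
    int exclude = knapsack(weights, values, c, i + 1);
    if (weights[i] > c) return exclude;
    int include = values[i] + knapsack(weights, values, c - weights[i], i + 1);
    return Math.max(include, exclude);
}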
Time and space complexity: O(n * c) with n the number of items and c the capacity
Time and space complexity: O(n * c) with n the number of items and c the capacity
Space complexity could even be improved to O(2 * c) = O(c), as we only need to keep the last two rows (using row % 2):
int[][] dp = new int[2][c + 1];
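A fuller sketch of this two-row scheme (method name and signature are illustrative):

// Bottom-up 0/1 knapsack keeping only two rows: O(n * c) time, O(c) space.
static int knapsack(int[] weights, int[] values, int c) {
    int[][] dp = new int[2][c + 1];
    for (int i = 1; i <= weights.length; i++) {
        int row = i % 2, prev = (i - 1) % 2;
        for (int cap = 0; cap <= c; cap++) {
            dp[row][cap] = dp[prev][cap]; // exclude item i-1
            if (weights[i - 1] <= cap) {
                dp[row][cap] = Math.max(dp[row][cap],
                        values[i - 1] + dp[prev][cap - weights[i - 1]]);
            }
        }
    }
    return dp[weights.length % 2][c];
}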
How much of a resource (time or memory) it takes per operation on average, measured over a sequence of operations
Access: O(1)
Search: O(n)
Insert: O(n)
Delete: O(n)
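If these figures describe a plain array, the linear insert/delete cost comes from shifting elements; an illustrative helper:

// Insert into an array with spare capacity at index i: shifts up to n elements => O(n).
static void insertAt(int[] a, int size, int i, int value) {
    System.arraycopy(a, i, a, i + 1, size - i); // shift the tail one slot right
    a[i] = value;
}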
All: O(log n)
Time: O(v + e) with v the number of vertices and e the number of edges
Space: O(v)
BFS: time O(v), space O(v)
DFS: time O(v), space O(h) with h the height of the tree
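A minimal sketch of both traversals over an adjacency list (vertex numbering 0..v-1 is an assumption):

import java.util.*;

// BFS over an adjacency list: O(v + e) time, O(v) space for the queue and visited set.
static void bfs(List<List<Integer>> adj, int start) {
    boolean[] visited = new boolean[adj.size()];
    Deque<Integer> queue = new ArrayDeque<>();
    queue.add(start);
    visited[start] = true;
    while (!queue.isEmpty()) {
        int u = queue.poll();
        for (int v : adj.get(u)) {
            if (!visited[v]) {
                visited[v] = true;
                queue.add(v);
            }
        }
    }
}

// DFS: same O(v + e) time; on a tree the stack depth is bounded by the height h.
static void dfs(List<List<Integer>> adj, boolean[] visited, int u) {
    visited[u] = true;
    for (int v : adj.get(u)) {
        if (!visited[v]) dfs(adj, visited, v);
    }
}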
Upper bound
Lower bound (the fastest an algorithm can be)
Theta(n) if both O(n) and Omega(n)
Insert: O(log n)
Get min (max): O(1)
Delete min: O(log n)
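In Java, PriorityQueue is a binary min-heap and gives exactly these bounds:

import java.util.PriorityQueue;

PriorityQueue<Integer> minHeap = new PriorityQueue<>();
minHeap.offer(5);          // insert: O(log n) sift-up
minHeap.offer(1);
int min = minHeap.peek();  // get min: O(1), the root of the heap
minHeap.poll();            // delete min: O(log n) sift-down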
If not balanced O(n)
If balanced O(log n)
Find the inorder successor and swap it with the node to delete (case of a node with two children)
Average: O(log n)
Worst: O(h) (up to O(n)) if the BST is not self-balancing, otherwise O(log n)
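A search sketch showing the O(h) walk (the Node class is illustrative):

// BST search walks a single root-to-leaf path => O(h) time,
// where h is O(log n) for a balanced tree and O(n) in the worst case.
class Node {
    int value;
    Node left, right;
}

static Node search(Node root, int target) {
    Node current = root;
    while (current != null && current.value != target) {
        current = current.value > target ? current.left : current.right;
    }
    return current; // null if absent
}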
Time: O(n²)
Space: O(1)
Stable
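A minimal bubble sort sketch:

// Bubble sort: repeatedly swap adjacent out-of-order elements.
// O(n²) time, O(1) extra space; stable because equal elements are never swapped.
static void bubbleSort(int[] a) {
    for (int i = 0; i < a.length - 1; i++) {
        for (int j = 0; j < a.length - 1 - i; j++) {
            if (a[j] > a[j + 1]) {
                int tmp = a[j];
                a[j] = a[j + 1];
                a[j + 1] = tmp;
            }
        }
    }
}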
Time: O(branches^depth) with branches the number of times each recursive call branches (read: branches to the power of depth)
Space: O(depth) to store the call stack
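For example, naive Fibonacci branches twice per call with depth n:

// Two branches per call, depth n => O(2^n) time; call stack depth n => O(n) space.
static int fib(int n) {
    if (n <= 1) return n;
    return fib(n - 1) + fib(n - 2);
}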
Time and space: O(n * l) with n the number of words and l the length of the longest word
Time: O(k) with k the size of the key
Space: O(1) iterative, O(k) recursive
Time: O(k) with k the size of the key
Space: O(1) iterative or O(k) recursive
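A minimal trie sketch over 'a'..'z' showing the O(k) walks (class shape is illustrative):

// Minimal trie. Insert and search both walk k nodes, with k the key length.
class Trie {
    private final Trie[] children = new Trie[26];
    private boolean isWord;

    void insert(String word) {            // O(k) time, O(1) extra space (iterative)
        Trie node = this;
        for (char ch : word.toCharArray()) {
            int i = ch - 'a';
            if (node.children[i] == null) node.children[i] = new Trie();
            node = node.children[i];
        }
        node.isWord = true;
    }

    boolean search(String word) {         // O(k) time
        Trie node = this;
        for (char ch : word.toCharArray()) {
            int i = ch - 'a';
            if (node.children[i] == null) return false;
            node = node.children[i];
        }
        return node.isWord;
    }
}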
Time complexity: O(n + k) with n the number of elements and k the range of values (the maximum element)
Space complexity: O(k)
Stable
Use case: known and small range of possible integers
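A sketch of the stable prefix-sum form, assuming non-negative integers bounded by k:

// Counting sort for integers in 0..k: O(n + k) time, O(k) space for the counts
// (plus the output array). Stable thanks to the backwards final pass.
static int[] countingSort(int[] a, int k) {
    int[] count = new int[k + 1];
    for (int value : a) count[value]++;
    for (int i = 1; i <= k; i++) count[i] += count[i - 1]; // prefix sums: final positions
    int[] out = new int[a.length];
    for (int i = a.length - 1; i >= 0; i--) {              // backwards => stable
        out[--count[a[i]]] = a[i];
    }
    return out;
}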
Access: O(n)
Insert: O(1)
Delete: O(1)
All: amortized O(1), worst O(n)
Time: Theta(n log n)
Space: O(1)
Unstable
Use case: space-constrained environments needing an O(n log n) time guarantee
Drawbacks: not stable and not cache-friendly
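A heapsort sketch (the sift-down helper is illustrative):

// Heapsort: build a max-heap in place, then repeatedly move the max to the end.
// Θ(n log n) time, O(1) extra space; unstable (long-distance swaps reorder equal keys).
static void heapSort(int[] a) {
    int n = a.length;
    for (int i = n / 2 - 1; i >= 0; i--) siftDown(a, i, n); // build heap: O(n)
    for (int end = n - 1; end > 0; end--) {
        int tmp = a[0]; a[0] = a[end]; a[end] = tmp;        // extract max
        siftDown(a, 0, end);                                // restore heap: O(log n)
    }
}

static void siftDown(int[] a, int i, int n) {
    while (2 * i + 1 < n) {
        int child = 2 * i + 1;
        if (child + 1 < n && a[child + 1] > a[child]) child++;
        if (a[i] >= a[child]) return;
        int tmp = a[i]; a[i] = a[child]; a[child] = tmp;
        i = child;
    }
}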
Time: O(n²)
Space: O(1)
Stable
Use case: partially sorted input
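A minimal insertion sort sketch:

// Insertion sort: grow a sorted prefix by inserting each element into place.
// O(n²) worst case but close to O(n) on nearly sorted input, hence the use case.
// O(1) space, stable.
static void insertionSort(int[] a) {
    for (int i = 1; i < a.length; i++) {
        int key = a[i];
        int j = i - 1;
        while (j >= 0 && a[j] > key) {
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = key;
    }
}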
Access: O(n)
Insert: O(1)
Delete: O(1)
Time: Theta(n log n)
Space: O(n)
Stable
Use case: guaranteed O(n log n) worst-case time and stable; also a good fit for linked lists
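A merge sort sketch on arrays (the linked-list variant merges nodes instead of copying):

// Merge sort: split in halves, sort each, merge. Θ(n log n) time, O(n) space
// for the temporary array; stable because the merge takes from the left half on ties.
static void mergeSort(int[] a, int lo, int hi) { // sorts a[lo..hi)
    if (hi - lo < 2) return;
    int mid = (lo + hi) / 2;
    mergeSort(a, lo, mid);
    mergeSort(a, mid, hi);
    int[] tmp = new int[hi - lo];
    int i = lo, j = mid, k = 0;
    while (i < mid && j < hi) tmp[k++] = a[i] <= a[j] ? a[i++] : a[j++];
    while (i < mid) tmp[k++] = a[i++];
    while (j < hi) tmp[k++] = a[j++];
    System.arraycopy(tmp, 0, a, lo, tmp.length);
}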
Time: best and average O(n log n); worst O(n²), e.g., with a naive first- or last-element pivot on an array already sorted in ascending or descending order
Space: O(log n) (in-place sorting algorithm)
Not stable
Use case: in practice, quicksort is often faster than merge sort due to better locality of reference (this advantage does not apply to linked lists, where merge sort is preferred)
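A quicksort sketch with a deliberately naive last-element pivot, which exhibits the sorted-input worst case mentioned above:

// Quicksort with Lomuto partition and last-element pivot: average O(n log n),
// but an already sorted input degrades it to O(n²) (every partition is lopsided).
// Recursion depth gives the O(log n) average space; in-place, not stable.
static void quickSort(int[] a, int lo, int hi) { // sorts a[lo..hi]
    if (lo >= hi) return;
    int pivot = a[hi], p = lo;
    for (int i = lo; i < hi; i++) {
        if (a[i] < pivot) {
            int tmp = a[i]; a[i] = a[p]; a[p] = tmp;
            p++;
        }
    }
    int tmp = a[p]; a[p] = a[hi]; a[hi] = tmp;
    quickSort(a, lo, p - 1);
    quickSort(a, p + 1, hi);
}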
Time complexity: O(n * k) with n the number of elements and k the maximum number of digits of a number
Space complexity: O(k)
Stable
Use case: when k < log(n), for example 1M elements in 0..1000, as k = 4 digits < log(1M) = 6
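An LSD radix sort sketch for non-negative integers (base 10 assumed):

// LSD radix sort: k stable counting-sort passes, one per digit => O(n * k) time.
static void radixSort(int[] a) {
    int max = 0;
    for (int value : a) max = Math.max(max, value);
    for (int exp = 1; max / exp > 0; exp *= 10) {
        int[] count = new int[10];
        for (int value : a) count[(value / exp) % 10]++;
        for (int i = 1; i < 10; i++) count[i] += count[i - 1];
        int[] out = new int[a.length];
        for (int i = a.length - 1; i >= 0; i--) { // backwards keeps each pass stable
            out[--count[(a[i] / exp) % 10]] = a[i];
        }
        System.arraycopy(out, 0, a, 0, a.length);
    }
}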
Space impact: each recursive call adds a frame to the call stack
Unless the language performs tail-call optimization (the JVM does not)
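A tail-recursive shape for illustration: the accumulator makes the recursive call the last action, so a language with tail-call optimization can reuse the stack frame.

// Tail-recursive sum of 1..n. With TCO this runs in O(1) stack; on the JVM it
// still consumes O(n) stack frames.
static int sum(int n, int acc) {
    if (n == 0) return acc;
    return sum(n - 1, acc + n);
}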
All: O(log n)
Time: Theta(n²)
Space: O(1)
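If these figures refer to selection sort (always Θ(n²), whatever the input), a sketch:

// Selection sort: scan for the minimum of the unsorted suffix, swap it into place.
// Always Θ(n²) comparisons, even on sorted input; O(1) extra space.
static void selectionSort(int[] a) {
    for (int i = 0; i < a.length - 1; i++) {
        int min = i;
        for (int j = i + 1; j < a.length; j++) {
            if (a[j] < a[min]) min = j;
        }
        int tmp = a[i]; a[i] = a[min]; a[min] = tmp;
    }
}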
- Linked list with a pointer to the head
Insert: O(1)
Delete: O(1)
- Array
Insert: O(n) worst case (resize), amortized O(1)
Delete: O(1)
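A minimal sketch of the linked-list variant (class name illustrative):

// Linked-list stack with a pointer to the head: push and pop touch only the head => O(1).
// (An array-backed stack gets amortized O(1) push: occasional O(n) resize, O(1) otherwise.)
class IntStack {
    private static class Node {
        int value;
        Node next;
        Node(int value, Node next) { this.value = value; this.next = next; }
    }
    private Node head;

    void push(int value) { head = new Node(value, head); } // O(1)

    int pop() {                                            // O(1)
        int value = head.value;
        head = head.next;
        return value;
    }
}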
O(n)
Time and space: O(v + e)