This document provides a summary of time and space complexities for various algorithms and data structures. It includes analysis of search algorithms like depth-first search (DFS), breadth-first search (BFS), and Dijkstra's algorithm. It also summarizes sorting algorithms like quicksort, mergesort, heapsort, and others. Common data structures like arrays, linked lists, hash tables, binary search trees, and graphs are analyzed. Asymptotic notation for describing algorithm growth rates such as Big-O, Omega, and Theta notation is also defined.
Big-O Algorithm Complexity Cheat Sheet
Complexities!
Searching

| Algorithm | Data Structure | Time (Average) | Time (Worst) | Space (Worst) |
|---|---|---|---|---|
| Depth-First Search (DFS) | Graph of V vertices and E edges | - | O(V + E) | O(V) |
| Breadth-First Search (BFS) | Graph of V vertices and E edges | - | O(V + E) | O(V) |
| Binary search | Sorted array of n elements | O(log n) | O(log n) | O(1) |
| Linear search (brute force) | Array | O(n) | O(n) | O(1) |
| Dijkstra's shortest path, min-heap as priority queue | Graph of V vertices and E edges | O((V + E) log V) | O((V + E) log V) | O(V) |
| Dijkstra's shortest path, unsorted array as priority queue | Graph of V vertices and E edges | O(V^2) | O(V^2) | O(V) |
| Bellman-Ford shortest path | Graph of V vertices and E edges | O(VE) | O(VE) | O(V) |
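As a concrete illustration of the O(log n) binary-search entry above, here is a minimal sketch in Python (the function name and test values are my own, not part of the original sheet). Each iteration halves the remaining search range, which is where the logarithmic bound comes from:

```python
# Binary search on a sorted list: O(log n) time, O(1) extra space.
def binary_search(items, target):
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2       # halve the range each iteration
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1           # discard the lower half
        else:
            hi = mid - 1           # discard the upper half
    return -1                      # target not present

print(binary_search([1, 3, 5, 7, 9, 11], 7))   # prints 3
print(binary_search([1, 3, 5, 7, 9, 11], 4))   # prints -1
```

Note that the O(1) space bound holds only for this iterative form; a recursive version would use O(log n) stack space.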
Sorting

| Algorithm | Data Structure | Time (Best) | Time (Average) | Time (Worst) | Worst-Case Auxiliary Space |
|---|---|---|---|---|---|
| Quicksort | Array | O(n log n) | O(n log n) | O(n^2) | O(log n) |
| Mergesort | Array | O(n log n) | O(n log n) | O(n log n) | O(n) |
| Heapsort | Array | O(n log n) | O(n log n) | O(n log n) | O(1) |
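Of the sorting algorithms the summary names, mergesort is the one whose O(n log n) bound is easiest to see: the input is halved O(log n) times, and each level is merged in O(n). A minimal sketch (function name and test data are my own):

```python
# Mergesort: O(n log n) time in every case, O(n) auxiliary space.
def merge_sort(items):
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])     # recursion depth is O(log n)
    right = merge_sort(items[mid:])
    # Merge two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])            # append whichever half remains
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # prints [1, 2, 5, 5, 6, 9]
```

The O(n) auxiliary space in the table comes from the `merged` buffer; quicksort avoids it by partitioning in place, at the cost of an O(n^2) worst case.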
Notation for asymptotic growth

| Notation | Bound | Growth |
|---|---|---|
| Θ (big-theta) | upper and lower, tight [1] | equal [2] |
| O (big-oh) | upper, tightness unknown | less than or equal [3] |
| o (small-oh) | upper, not tight | less than |
| Ω (big-omega) | lower, tightness unknown | greater than or equal |
| ω (small-omega) | lower, not tight | greater than |

[1] Big O is an upper bound, while Omega is a lower bound. Theta requires both a Big O and an Omega bound, which is why it is called a tight bound (it must be both the upper and the lower bound). For example, an algorithm taking Omega(n log n) takes at least n log n time but has no implied upper limit. An algorithm taking Theta(n log n) is far preferable, since it takes at least n log n (Omega(n log n)) and no more than n log n (Big O(n log n)).
[2] f(n) = Θ(g(n)) means f (the running time of the algorithm) grows exactly like g when n (the input size) gets larger. In other words, the growth rate of f(n) is asymptotically proportional to g(n).

[3] Same idea, but here the growth rate is no faster than g(n). Big-oh is the most useful notation because it represents worst-case behavior.

In short, if an algorithm is __ then its performance is __:

| Algorithm | Performance |
|---|---|
| o(n) | < n |
| O(n) | ≤ n |
| Θ(n) | = n |
| Ω(n) | ≥ n |
| ω(n) | > n |
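The ordering of the common growth rates behind these bounds can be sanity-checked numerically. A small sketch (the choice of n = 1000 is arbitrary):

```python
import math

# For a concrete input size, verify that
# log n < n < n log n < n^2.
n = 1000
rates = [math.log2(n), n, n * math.log2(n), n ** 2]
print(rates == sorted(rates))  # prints True: the rates are in increasing order
```

This is why an O(n log n) sort dominates an O(n^2) sort as n grows, even though the O(n^2) algorithm may win on tiny inputs where constant factors matter more.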