
Algorithms & Analysis - Quick Reference Guide

Complexity Hierarchy
O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ) < O(n!)

Asymptotic Notation
Big O (Upper Bound): f(n) ≤ c·g(n) for large n
Big Ω (Lower Bound): f(n) ≥ c·g(n) for large n
Big Θ (Tight Bound): c₁·g(n) ≤ f(n) ≤ c₂·g(n) for large n
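These definitions can be sanity-checked numerically for concrete witnesses. The sketch below (the helper name is mine, not standard) verifies that f(n) = 3n + 2 is O(n) with witnesses c = 4 and n₀ = 2:

```python
# Check the Big O witness condition f(n) <= c*g(n) for all n0 <= n < n_max.
# (A finite check is only evidence, not a proof, but it catches wrong witnesses.)
def is_big_o_witness(f, g, c, n0, n_max=10_000):
    return all(f(n) <= c * g(n) for n in range(n0, n_max))

# f(n) = 3n + 2 is O(n): take c = 4, n0 = 2, since 3n + 2 <= 4n whenever n >= 2.
assert is_big_o_witness(lambda n: 3 * n + 2, lambda n: n, c=4, n0=2)

# n^2 is NOT O(n): no constant works once n exceeds c.
assert not is_big_o_witness(lambda n: n * n, lambda n: n, c=100, n0=1)
```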

Algorithm Complexity Table

Algorithm Best Average Worst Space Notes


Searching
Linear Search O(1) O(n) O(n) O(1) Unsorted data
Binary Search O(1) O(log n) O(log n) O(1) Sorted data required
Sorting
Selection Sort O(n²) O(n²) O(n²) O(1) Min swaps, unstable
Bubble Sort O(n) O(n²) O(n²) O(1) Stable, adaptive
Insertion Sort O(n) O(n²) O(n²) O(1) Stable, online
Merge Sort O(n log n) O(n log n) O(n log n) O(n) Stable, predictable
Quick Sort O(n log n) O(n log n) O(n²) O(log n) In-place, unstable
Heap Sort O(n log n) O(n log n) O(n log n) O(1) In-place, unstable
Graph Traversal
DFS O(V+E) O(V+E) O(V+E) O(V) Uses stack/recursion
BFS O(V+E) O(V+E) O(V+E) O(V) Uses queue, shortest path
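A minimal sketch of both traversals over an adjacency-list dictionary, showing the stack-vs-queue distinction from the table (function and variable names are illustrative):

```python
from collections import deque

def dfs(adj, start):
    # Iterative DFS: explicit stack (LIFO), so it goes deep before going wide.
    seen, stack, order = set(), [start], []
    while stack:
        v = stack.pop()
        if v in seen:
            continue
        seen.add(v)
        order.append(v)
        stack.extend(reversed(adj.get(v, [])))  # reversed keeps left-to-right visit order
    return order

def bfs(adj, start):
    # BFS: queue (FIFO), visits vertices level by level; gives shortest
    # paths by edge count in unweighted graphs.
    seen, q, order = {start}, deque([start]), []
    while q:
        v = q.popleft()
        order.append(v)
        for w in adj.get(v, []):
            if w not in seen:
                seen.add(w)
                q.append(w)
    return order
```

Both run in O(V+E) time and O(V) extra space, matching the table.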

Master Theorem
For T(n) = aT(n/b) + f(n):

Case 1: f(n) = O(n^(log_b(a) - ε)) for some ε > 0
→ T(n) = Θ(n^log_b(a))

Case 2: f(n) = Θ(n^log_b(a))
→ T(n) = Θ(n^log_b(a) · log n)

Case 3: f(n) = Ω(n^(log_b(a) + ε)) for some ε > 0, and the regularity condition a·f(n/b) ≤ c·f(n) holds for some c < 1
→ T(n) = Θ(f(n))
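When the driving term is a plain polynomial f(n) = n^d, the three cases reduce to comparing d with the critical exponent log_b(a). The sketch below is a simplification of the full theorem under that assumption (the function name is mine):

```python
import math

def master_theorem(a, b, d):
    """Solve T(n) = a*T(n/b) + Θ(n^d), assuming a polynomial driving term.
    This covers only f(n) = n^d; the general theorem allows any f(n)."""
    crit = math.log(a, b)            # critical exponent log_b(a)
    if d < crit:                     # Case 1: the recursion tree dominates
        return f"Θ(n^{crit:g})"
    if d == crit:                    # Case 2: work is balanced across levels
        return f"Θ(n^{d:g} log n)"
    return f"Θ(n^{d:g})"             # Case 3: the root's work dominates

# Merge sort, T(n) = 2T(n/2) + n: a = b = 2, d = 1 = log_2(2) -> Case 2.
assert master_theorem(2, 2, 1) == "Θ(n^1 log n)"
```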

Common Recurrence Patterns


T(n) = T(n-1) + c → O(n) [Linear recursion]
T(n) = T(n/2) + c → O(log n) [Binary search]
T(n) = 2T(n/2) + c → O(n) [Simple divide & conquer]
T(n) = 2T(n/2) + n → O(n log n) [Merge sort]
T(n) = T(n-1) + n → O(n²) [Selection sort recursion]
T(n) = 2T(n-1) + c → O(2ⁿ) [Towers of Hanoi; naive Fibonacci grows similarly]
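As a concrete instance of the exponential pattern, the Towers of Hanoi move count satisfies T(n) = 2T(n-1) + 1, which solves to exactly 2ⁿ - 1:

```python
def hanoi_moves(n):
    # Move count recurrence: T(0) = 0, T(n) = 2T(n-1) + 1.
    if n == 0:
        return 0
    return 2 * hanoi_moves(n - 1) + 1

# Closed form: 2^n - 1 moves for n disks.
assert hanoi_moves(10) == 2**10 - 1
```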

Summation Formulas
Σ(i=1 to n) 1 = n
Σ(i=1 to n) i = n(n+1)/2 ≈ n²/2
Σ(i=1 to n) i² = n(n+1)(2n+1)/6 ≈ n³/3
Σ(i=1 to n) i³ = [n(n+1)/2]² ≈ n⁴/4
Σ(i=0 to n) 2ⁱ = 2^(n+1) - 1 = Θ(2ⁿ)
Σ(i=1 to n) 1/i ≈ ln(n)
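These identities are easy to sanity-check for a concrete n:

```python
import math

n = 100
assert sum(range(1, n + 1)) == n * (n + 1) // 2
assert sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
assert sum(i ** 3 for i in range(1, n + 1)) == (n * (n + 1) // 2) ** 2
assert sum(2 ** i for i in range(0, n + 1)) == 2 ** (n + 1) - 1
# Harmonic numbers: H_n - ln(n) converges to the Euler–Mascheroni
# constant γ ≈ 0.5772, so H_n ≈ ln(n) up to a small additive constant.
assert abs(sum(1 / i for i in range(1, n + 1)) - math.log(n) - 0.5772) < 0.01
```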

Logarithm Properties
log(1) = 0
log_a(a) = 1
log(xy) = log(x) + log(y)
log(x/y) = log(x) - log(y)
log(xʸ) = y·log(x)
log_a(x) = log_b(x) / log_b(a)
log₂(n) ≈ 3.32 · log₁₀(n)
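The properties can likewise be verified with the standard math module:

```python
import math

x, y = 8.0, 32.0
assert math.isclose(math.log(x * y), math.log(x) + math.log(y))
assert math.isclose(math.log(x / y), math.log(x) - math.log(y))
assert math.isclose(math.log(x ** 3), 3 * math.log(x))
# Change of base: log_2(x) = log_10(x) / log_10(2)
assert math.isclose(math.log2(x), math.log10(x) / math.log10(2))
# The 3.32 rule of thumb is just the change-of-base constant 1/log_10(2).
assert math.isclose(1 / math.log10(2), 3.3219, rel_tol=1e-4)
```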

Graph Representations
Adjacency Matrix
Space: O(V²)
Edge lookup: O(1)
Add vertex: O(V²)
Add edge: O(1)
Best for: Dense graphs, frequent edge queries

Adjacency List

Space: O(V + E)
Edge lookup: O(degree)
Add vertex: O(1)
Add edge: O(1)
Best for: Sparse graphs, traversal algorithms
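Both representations for a small hypothetical undirected graph, showing the space trade-off side by side:

```python
# Hypothetical 4-vertex undirected graph with edges (0,1), (0,2), (2,3).
edges = [(0, 1), (0, 2), (2, 3)]
V = 4

# Adjacency matrix: O(V^2) space regardless of edge count, O(1) edge lookup.
matrix = [[0] * V for _ in range(V)]
for u, v in edges:
    matrix[u][v] = matrix[v][u] = 1

# Adjacency list: O(V + E) space, O(degree) edge lookup.
adj = {u: [] for u in range(V)}
for u, v in edges:
    adj[u].append(v)
    adj[v].append(u)

assert matrix[0][2] == 1 and matrix[1][3] == 0  # constant-time edge queries
assert sorted(adj[0]) == [1, 2]                  # neighbors listed explicitly
```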

Algorithm Design Paradigms


Brute Force
Strategy: Try all possibilities
Complexity: Often exponential
Examples: TSP, subset sum, naive string matching
When to use: Small inputs, or when a guaranteed-optimal solution is required

Decrease and Conquer


Strategy: Reduce to smaller subproblem
Variants: By constant (insertion sort), by factor (binary search), variable (GCD)
Complexity: Often O(n) or O(log n)
Examples: Binary search, insertion sort, Euclidean algorithm
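The Euclidean algorithm is the classic variable-size-decrease example: each step replaces (a, b) with (b, a mod b), shrinking the problem by a data-dependent amount.

```python
def gcd(a, b):
    # Variable-size decrease: the pair (a, b) becomes (b, a % b) each step.
    while b:
        a, b = b, a % b
    return a

assert gcd(48, 36) == 12
```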

Divide and Conquer


Strategy: Split into multiple subproblems
Complexity: Often O(n log n)
Examples: Merge sort, quick sort, binary tree operations
Recurrence: T(n) = aT(n/b) + f(n)
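Merge sort is the textbook instance: a = 2 subproblems of size n/2 plus an O(n) merge, so T(n) = 2T(n/2) + n = Θ(n log n). A compact sketch:

```python
def merge_sort(xs):
    # Divide: two subproblems of size n/2; combine: linear-time merge.
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        # "<=" keeps equal keys in their original order, making the sort stable.
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

assert merge_sort([5, 2, 9, 1, 5, 6]) == [1, 2, 5, 5, 6, 9]
```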

Data Structure Operations

Structure Access Search Insert Delete Space


Array O(1) O(n) O(n) O(n) O(n)
Sorted Array O(1) O(log n) O(n) O(n) O(n)
Linked List O(n) O(n) O(1) O(1) O(n)
Stack O(n) O(n) O(1) O(1) O(n)
Queue O(n) O(n) O(1) O(1) O(n)
Hash Table N/A O(1)* O(1)* O(1)* O(n)
Binary Tree O(n) O(n) O(n) O(n) O(n)
BST (balanced) O(log n) O(log n) O(log n) O(log n) O(n)
Heap N/A O(n) O(log n) O(log n) O(n)

*Average case for hash tables; the worst case degrades to O(n) when many keys collide
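Python's standard heapq module matches the heap row of the table; this sketch illustrates the O(log n) insert and delete-min operations:

```python
import heapq

# heapq maintains a binary min-heap in a plain list.
h = []
for x in [5, 1, 4, 2]:
    heapq.heappush(h, x)          # O(log n) per insert
assert h[0] == 1                  # the minimum is always at the root: O(1) peek
assert [heapq.heappop(h) for _ in range(4)] == [1, 2, 4, 5]  # O(log n) per pop
```

Note there is no fast arbitrary search: finding a non-minimum element in a heap is O(n), as the table states.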

Problem-Solving Checklist
Algorithm Analysis
Identify input size parameter
Choose basic operation (innermost, most frequent)
Consider best/average/worst cases
Set up summation or recurrence
Solve using standard techniques
Express in asymptotic notation

Graph Problems
Understand graph type (directed/undirected, weighted/unweighted)
Choose representation (matrix vs list)
Select traversal method (DFS vs BFS)
Consider connectivity and path requirements
Analyze space and time complexity

Sorting Problems
Consider input characteristics (size, order, duplicates)
Determine stability requirements
Evaluate space constraints (in-place vs extra space)
Choose appropriate algorithm
Analyze complexity for given scenario

Common Pitfalls
Asymptotic notation: Confusing O, Ω, Θ meanings
Summation setup: Incorrect loop bounds or nesting
Recurrence solving: Wrong base case or substitution errors
Graph traversal: Using wrong data structure (stack vs queue)
Complexity analysis: Choosing wrong basic operation

Exam Strategy
1. Identify problem type: Analysis, design, tracing, comparison
2. Use systematic approach: Follow standard methods
3. Show all work: Partial credit for correct process
4. Verify with examples: Check answers on small inputs
5. State assumptions: Clarify ambiguous problems
6. Manage time: Don't get stuck on single question

Memory Aids
DFS = Depth First = Stack (LIFO) = Goes Deep
BFS = Breadth First = Queue (FIFO) = Level by Level

Stable sorts: Bubble, Insertion, Merge (BIM)


In-place sorts: Selection, Bubble, Insertion, Heap, Quick

O(log n): Binary operations (search, tree height)


O(n): Single pass through data
O(n log n): Optimal comparison sorting
O(n²): Nested loops, simple sorting
O(2ⁿ): Exponential, subset generation

Quick Complexity Estimates


n = 10³: log n ≈ 10, n log n ≈ 10⁴, n² ≈ 10⁶
n = 10⁶: log n ≈ 20, n log n ≈ 2×10⁷, n² ≈ 10¹²
n = 10⁹: log n ≈ 30, n log n ≈ 3×10¹⁰, n² ≈ 10¹⁸

Practical limits (1 second):


O(1), O(log n): Any reasonable n
O(n): n ≤ 10⁸
O(n log n): n ≤ 10⁷
O(n²): n ≤ 10⁴
O(n³): n ≤ 500
O(2ⁿ): n ≤ 25
O(n!): n ≤ 10
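A rough sketch of how such limits are derived, assuming about 10⁸ simple operations per second (that constant is an assumption for estimation only; real throughput varies widely by language and operation):

```python
import math

def feasible(ops, budget=1e8):
    # Assumed budget: ~10^8 simple operations in one second.
    return ops <= budget

n = 10_000
assert feasible(n * math.log2(n))   # O(n log n) at n = 10^4: ~1.3e5 ops, easy
assert feasible(n * n)              # O(n^2) at n = 10^4: 1e8 ops, borderline
assert not feasible(2 ** 40)        # exponential growth blows past the budget
```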
