Lecture: Introduction to Algorithms
What is an Algorithm?
• Definition: An algorithm is a step-by-step set of instructions to solve a problem or
perform a task.
• Example: A recipe to bake a cake, or instructions to solve a math problem.
Key Properties of Algorithms
1. Input: Zero or more inputs are externally supplied.
2. Output: At least one output is produced.
3. Definiteness: Each step is precisely defined.
4. Finiteness: The algorithm must end after a finite number of steps.
5. Effectiveness: Each step must be basic enough to be carried out.
Simple Example
Problem: Add two numbers.
Algorithm:
1. Start
2. Read number A
3. Read number B
4. Add A and B, store in SUM
5. Print SUM
6. End
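The steps above translate directly into Python (the function name here is illustrative):

```python
# Direct translation of the addition algorithm above.
def add_two_numbers(a, b):
    # Step 4: add A and B, store in SUM
    total = a + b
    # Step 5: print SUM
    print(total)
    return total

add_two_numbers(3, 4)  # prints 7
```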
Types of Algorithms
1. Brute Force – Try all possible solutions
2. Greedy – Make the best choice at each step
3. Divide and Conquer – Divide the problem, solve parts, combine
4. Dynamic Programming – Store results of subproblems to avoid recomputation
5. Backtracking – Try partial solutions and backtrack if needed
6. Recursive Algorithms – The function calls itself on smaller subproblems
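As a quick illustration of the recursive style (item 6), here is a sketch of factorial, where the function calls itself on a smaller input until it reaches a base case:

```python
def factorial(n):
    # Base case: stops the recursion (this is what guarantees finiteness).
    if n <= 1:
        return 1
    # Recursive case: the function calls itself on a smaller input.
    return n * factorial(n - 1)

print(factorial(5))  # 120
```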
Complexity of Algorithms
• Time Complexity: How the running time grows with input size (Big O notation)
• Space Complexity: How much memory is used
Common Time Complexities:
Complexity | Example Algorithm | Description
O(1) | Accessing an array element | Constant time
O(log n) | Binary Search | Logarithmic time
O(n) | Linear Search | Grows linearly
O(n log n) | Merge Sort | Efficient sorting
O(n²) | Bubble Sort | Nested loops
O(2ⁿ) | Recursive Fibonacci | Exponential time
Common Algorithms to Learn
• Searching: Linear Search, Binary Search
• Sorting: Bubble Sort, Selection Sort, Merge Sort, Quick Sort
• Recursion: Factorial, Fibonacci
• Graph Algorithms: DFS, BFS
• Dynamic Programming: Knapsack, Fibonacci with memoization
Lecture: Complexity of Algorithms
1. What is Algorithmic Complexity?
Algorithmic complexity refers to the resources an algorithm consumes, typically:
• Time: How long the algorithm takes to run
• Space: How much memory (RAM) it uses
This helps you:
• Compare different algorithms
• Choose the most efficient one for large inputs
• Understand how your code will scale
2. Time Complexity
What is Time Complexity?
It describes how the runtime of an algorithm increases with the size of the input, usually
represented by n.
We use Big O notation to describe the upper bound (worst-case) behavior.
Common Time Complexities
Complexity | Name | Example Algorithm | Description
O(1) | Constant | Accessing an array element | Same time regardless of input size
O(log n) | Logarithmic | Binary Search | Halves the problem at each step
O(n) | Linear | Linear Search | Time grows directly with input size
O(n log n) | Linearithmic | Merge Sort, Quick Sort | Efficient sorting
O(n²) | Quadratic | Bubble Sort | Nested loops over the input
O(2ⁿ) | Exponential | Recursive Fibonacci | Time roughly doubles with each additional input
O(n!) | Factorial | Generating all permutations | Extremely fast growth; impractical beyond tiny inputs
Examples
O(1) – Constant Time
def get_first_element(arr):
    return arr[0]
→ Always one step, regardless of array size.
O(n) – Linear Time
def find_max(arr):
    max_val = arr[0]
    for val in arr:
        if val > max_val:
            max_val = val
    return max_val
→ Must check each item once → grows with n.
O(n²) – Quadratic Time
def bubble_sort(arr):
    for i in range(len(arr)):
        for j in range(len(arr) - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
→ Each item compared with each other → n × n.
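O(log n) – Logarithmic Time. For completeness, a sketch of binary search on a sorted list: each comparison discards half of the remaining range, so the number of steps grows logarithmically with n.

```python
def binary_search(arr, target):
    # Assumes arr is sorted in ascending order.
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1   # discard the left half
        else:
            hi = mid - 1   # discard the right half
    return -1  # not found

binary_search([1, 3, 5, 7, 9], 7)  # returns 3
```

→ Each step halves the search range → at most log₂(n) + 1 comparisons.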
How to Analyze Time Complexity
1. Count the number of basic operations (e.g., additions, comparisons).
2. Focus on the most dominant term (ignore constants and low-order terms).
• For 3n + 2, complexity is O(n)
• For 5n² + 4n + 7, complexity is O(n²)
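A small sketch of this analysis (the function below is hypothetical, written only to show a dominant term): it does O(n) work in the first loop and O(n²) work in the nested loops, so the overall complexity is O(n²).

```python
def pair_sums_plus_total(arr):
    n = len(arr)
    total = 0
    for x in arr:          # n additions -> O(n)
        total += x
    pairs = 0
    for i in range(n):     # n * n iterations -> O(n^2)
        for j in range(n):
            if arr[i] + arr[j] == total:
                pairs += 1
    # Overall: O(n) + O(n^2) = O(n^2); the dominant term wins.
    return total, pairs
```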
3. Space Complexity
What is Space Complexity?
It measures how much additional memory an algorithm needs as the input size increases.
Example
def sum_array(arr):
    total = 0
    for num in arr:
        total += num
    return total
→ Uses constant space: one variable (total) → O(1)
When Space Matters
• Recursion: Each recursive call adds to the call stack
• Storing auxiliary structures: Hash tables, lists, matrices
• Dynamic Programming: Memoization uses extra space
Example – Fibonacci with stored subproblem results (bottom-up, O(n) space)
def fib(n):
    memo = [0, 1]
    for i in range(2, n + 1):
        memo.append(memo[i - 1] + memo[i - 2])
    return memo[n]
4. Best, Worst, and Average Case
• Best Case: Minimum time (e.g., searching and finding on the first try)
• Worst Case: Maximum time (used for Big O)
• Average Case: Expected time for random inputs
Binary Search Example:
Case | Time Complexity
Best | O(1)
Worst | O(log n)
Average | O(log n)
5. Tradeoffs
Sometimes you must trade one resource for another:
• A faster running time often costs more space (and vice versa)
• A more efficient algorithm may be harder to implement than a simple one
Algorithm | Time Complexity | Space Complexity
Merge Sort | O(n log n) | O(n)
Quick Sort | O(n log n) average, O(n²) worst | O(log n)
Bubble Sort | O(n²) | O(1)
Dynamic Fib | O(n) | O(n)
Recursive Fib | O(2ⁿ) | O(n)
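The last two table rows can be contrasted directly. A sketch (function names are illustrative): plain recursion recomputes the same subproblems exponentially often, while caching each result with functools.lru_cache brings the time down to O(n) at the cost of O(n) cache space.

```python
from functools import lru_cache

def fib_exponential(n):
    # O(2^n) time: the same subproblems are recomputed many times.
    if n < 2:
        return n
    return fib_exponential(n - 1) + fib_exponential(n - 2)

@lru_cache(maxsize=None)
def fib_linear(n):
    # O(n) time, O(n) cache space: each subproblem is computed once.
    if n < 2:
        return n
    return fib_linear(n - 1) + fib_linear(n - 2)
```

Try fib_exponential(35) versus fib_linear(35): the cached version returns instantly, while the plain recursive one takes noticeably longer.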
6. Practice Problems
Try analyzing the time/space complexity of these:
1. Count even numbers in a list
2. Multiply all elements in a 2D matrix
3. Check for duplicates in a list using a set
4. Compute factorial using recursion
5. Sort a list using insertion sort
7. Summary
Aspect | Description
Time Complexity | Measures how execution time scales with input size
Space Complexity | Measures how memory use scales with input size
Big O Notation | Describes the upper bound (worst-case) behavior
Dominant terms | Ignore constants and low-order terms when simplifying