Analysis of Algorithms
> An algorithm is an orderly step-by-step procedure, which has the following characteristics:
i. It accepts one or more input values
ii. It returns at least one output value
iii. It terminates after a finite number of steps
> An algorithm may also be viewed as a tool for solving a computational problem
Categories of Algorithms
Sorting Algorithms
Searching Algorithms
String Processing (Pattern matching, Compression, Cryptography)
Image Processing (Compression, Matching, Conversion)
Mathematical Algorithms (Random number generator, matrix operations)
Approaches to Analysis
Basically, three approaches can be adopted to analyze algorithm running time in terms of input size:
Empirical Approach:
Running time is measured experimentally for each value of the input size
Analytical Approach:
Running time is estimated using mathematical modeling (this measure is in terms of how many times a basic operation is carried out)
Visualization:
Performance is studied through animation for different data sets
Algorithm Design
There are many approaches to designing algorithms:
Divide-and-Conquer
Greedy
Dynamic Programming
Brute Force
Approximation
Each has certain advantages and limitations
Advantages and limitations of each algorithm design approach:
1. Divide-and-Conquer
• Advantages:
○ Breaks complex problems into simpler sub-problems.
○ Efficient for problems like sorting (e.g., Merge Sort) and searching (e.g., Binary Search).
• Limitations:
○ Overhead of recursive calls.
○ May require extra memory for intermediate results (e.g., Merge Sort).
2. Greedy
• Advantages:
○ Simple and fast.
○ Often optimal for specific problems like Huffman coding.
• Limitations:
○ Does not always provide the global optimum.
○ Works only if the problem has the "greedy-choice property."
3. Dynamic Programming
• Advantages:
○ Provides optimal solutions for problems with overlapping sub-problems (e.g., Fibonacci,
Knapsack).
○ Avoids redundant computations using memoization (see the Fibonacci sketch after this list).
• Limitations:
○ High memory usage.
○ Can be complex to implement.
4. Brute Force
• Advantages:
○ Simple to design and implement.
○ Works for small input sizes.
• Limitations:
○ Inefficient for large input sizes due to high time complexity.
5. Approximation
• Advantages:
○ Provides near-optimal solutions in reasonable time.
○ Useful for NP-hard problems like the Traveling Salesman Problem.
• Limitations:
○ May not provide the exact solution.
○ Approximation ratio varies depending on the algorithm.
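To make the dynamic-programming idea (item 3 above) concrete, here is a minimal memoization sketch for the Fibonacci example, using Python's functools.lru_cache to cache overlapping sub-problems:

from functools import lru_cache

@lru_cache(maxsize=None)              # cache results of overlapping sub-problems
def fib(n):
    if n < 2:
        return n                      # base cases: fib(0) = 0, fib(1) = 1
    return fib(n - 1) + fib(n - 2)    # each sub-problem is computed only once

# Usage: fib(50) returns 12586269025 almost instantly; the uncached
# recursive version would take exponential time.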
Analysis of Algorithms
Saturday, 7 December 2024 6:03 pm
Asymptotic Analysis
The asymptotic behavior of a function f(n) refers to the growth of f(n) as n gets large.
We typically ignore small values of n, since we are usually interested in estimating how slow the program will be on large inputs.
A good rule of thumb is that the slower the asymptotic growth rate, the better the algorithm, though this is not always true.
Big O-Notation
Sunday, 8 December 2024 10:23 am
Q/A's
Sunday, 8 December 2024 10:34 am
• Time Complexity measures how long an algorithm takes to run, while Space Complexity measures the amount of memory it uses.
○ Example: Merge Sort uses O(n) additional memory.
This makes O(n log n) algorithms such as Merge Sort and Quick Sort (average case) more efficient than O(n²) algorithms like Bubble Sort and Selection Sort. For example:
• If n = 1000:
○ O(n log n) ≈ 1000 × 10 = 10,000 operations
○ O(n²) = 1,000,000 operations
Python example of an O(n²) nested loop:
for i in range(n): # Runs n times
for j in range(n): # Runs n times for each i
print(i, j)
Q/A's 2
Sunday, 8 December 2024 2:31 pm
> The Analysis of Algorithms helps identify the most efficient solution among multiple correct
approaches to a problem.
> Analysis of an algorithm gives insight into how long the program runs and how much memory it uses:
– time complexity
– space complexity
Why is the analysis of algorithms useful?
It helps predict performance, compare efficiency, optimize resources, and ensure scalability for
solving problems effectively
What is asymptotic analysis, and why is it important?
Asymptotic analysis is a mathematical technique used for understanding the behavior of algorithms as their input size increases. It uses asymptotic notations to describe the growth rate or time complexity of an algorithm, which allows us to compare different algorithms.
The three standard asymptotic notations:
1. Big O (O)
2. Omega (Ω)
3. Theta (Θ)
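For reference, the usual textbook definitions of these notations (a sketch in standard form):

f(n) = O(g(n))      \iff \exists\, c > 0,\ n_0 :\ 0 \le f(n) \le c\, g(n) \text{ for all } n \ge n_0
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 :\ f(n) \ge c\, g(n) \ge 0 \text{ for all } n \ge n_0
f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \text{ and } f(n) = \Omega(g(n))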
Divide & Conquer
Sunday, 8 December 2024 1:30 pm
https://www.geeksforgeeks.org/insertion-sort-algorithm/
Divide and Conquer Algorithm involves breaking a larger problem into smaller sub-
problems, solving them independently, and then combining their solutions to solve
the original problem.
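A minimal Python sketch of the pattern, using the illustrative problem of finding the maximum element: split the range in half (divide), solve each half recursively (conquer), and combine the two answers:

def find_max(arr, lo, hi):
    if lo == hi:
        return arr[lo]                      # base case: a single element
    mid = (lo + hi) // 2                    # divide into two halves
    left_max = find_max(arr, lo, mid)       # conquer: solve the left half
    right_max = find_max(arr, mid + 1, hi)  # conquer: solve the right half
    return max(left_max, right_max)         # combine the two sub-solutions

# Usage: find_max([4, 9, 1, 7], 0, 3) returns 9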
Q/A's
Sunday, 8 December 2024 2:52 pm
Here are some short conceptual questions and answers related to the Divide and Conquer approach,
perfect for exams:
10. What kind of problems are best suited for Divide and Conquer?
Problems that can be divided into independent sub-problems, such as sorting, searching, and
optimization problems.
Insertion Sort
Sunday, 8 December 2024 3:11 pm
Insertion sort is a simple sorting algorithm that works by iteratively inserting each
element of an unsorted list into its correct position in a sorted portion of the list.
• Best case: O(n), when the list is already sorted (n is the number of elements in the list).
• Average case: O(n²), when the list is randomly ordered.
• Worst case: O(n²), when the list is in reverse order.
• Auxiliary Space: O(1), since insertion sort requires only constant additional space, making it a space-efficient sorting algorithm.
• Stable sorting algorithm.
• Efficient for small lists and nearly sorted lists.
• Space-efficient as it is an in-place algorithm.
• Adaptive: the number of swaps is directly proportional to the number of inversions. For example, no swapping happens for an already sorted array, and it takes only O(n) time.
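A minimal Python sketch of insertion sort as described above:

def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]                    # element to insert into the sorted prefix
        j = i - 1
        while j >= 0 and arr[j] > key:  # shift larger elements one position right
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key                # place the element in its correct position
    return arr

# Usage: insertion_sort([5, 2, 4, 1]) returns [1, 2, 4, 5]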
Q1. What are the Boundary Cases of the Insertion Sort algorithm?
Insertion sort takes the maximum time to sort if elements are sorted in reverse
order. And it takes minimum time (Order of n) when elements are already sorted.
Q/A's
Sunday, 8 December 2024 3:22 pm
Here are some short conceptual questions and answers related to Insertion Sort:
Merge Sort
Sunday, 8 December 2024 5:54 pm
https://www.geeksforgeeks.org/quizzes/top-mcqs-on-mergesort-algorithm-with-answers/
• Time Complexity:
• Best Case: O(n log n), when the array is already sorted or nearly sorted.
• Average Case: O(n log n), when the array is randomly ordered.
• Worst Case: O(n log n), when the array is sorted in reverse order.
• Auxiliary Space: O(n), additional space is required for the temporary array used during merging.
• Stability:
○ Merge Sort is a stable sorting algorithm because it maintains the relative order of equal
elements.
• T(n) represents the total time taken by the algorithm to sort an array of size n.
• 2T(n/2) represents the time taken by the algorithm to recursively sort the two halves of the array. Since each half has n/2 elements, we have two recursive calls with input size n/2.
• O(n) represents the time taken to merge the two sorted halves.
• The resulting recurrence T(n) = 2T(n/2) + O(n) solves to O(n log n).
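A minimal Python sketch matching this recurrence: two recursive calls on the halves, followed by an O(n) merge that uses the O(n) auxiliary space noted above:

def merge_sort(arr):
    if len(arr) <= 1:
        return arr                   # base case: already sorted
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])     # T(n/2): sort the left half
    right = merge_sort(arr[mid:])    # T(n/2): sort the right half
    # O(n) merge of the two sorted halves (stable: ties taken from the left first)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

# Usage: merge_sort([38, 27, 43, 3, 9]) returns [3, 9, 27, 38, 43]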
Q/A's
Sunday, 8 December 2024 6:07 pm
Non-Recursive Algorithm
Sunday, 8 December 2024 9:09 pm
A non-recursive algorithm (also called an iterative algorithm) is one that solves a problem without
using recursion. Instead of calling the function repeatedly, it uses loops (like for or while) to achieve
the same result.
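For example, factorial can be computed without recursion by replacing the recursive calls with a simple loop (an illustrative sketch):

def factorial_iterative(n):
    result = 1
    for i in range(2, n + 1):  # the loop replaces the chain of recursive calls
        result *= i
    return result

# Usage: factorial_iterative(5) returns 120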
Examples
Monday, 9 December 2024 10:24 am
Q/A's
Monday, 9 December 2024 11:37 am
Here are short conceptual Q/A's related to the analysis of non-recursive algorithms that might be
useful for exams:
5. What happens if the inner loop depends on the outer loop index?
If the inner loop depends on the outer loop index, the time complexity is obtained by summing the inner-loop iterations over all values of the outer index.
Example:
for i in range(n):
for j in range(i):
print("Work")
Time complexity:
1 + 2 + … + n = n(n + 1)/2 = O(n²)
7. Can a non-recursive algorithm have exponential time complexity?
Yes, if the number of iterations grows exponentially based on input size.
Example:
for i in range(2**n):
print("Work")
Time complexity is O(2ⁿ).
Short Questions
Sunday, 15 December 2024 7:40 am
Here’s a comprehensive list of short questions and answers on Design and Analysis of Algorithms:
---
Introduction to Algorithms
1. What is an algorithm?
An algorithm is a step-by-step procedure or formula for solving a problem.
---
Mathematical Foundations
---
13. What is the difference between stable and unstable sorting algorithms?
Stable sorting algorithms preserve the relative order of equal elements, while unstable sorting
algorithms may not.
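A quick illustration of stability using Python's built-in sorted(), which is a stable sort:

# Records with equal keys keep their original relative order.
records = [("apple", 2), ("banana", 1), ("cherry", 2), ("date", 1)]
by_count = sorted(records, key=lambda r: r[1])
# Result: [("banana", 1), ("date", 1), ("apple", 2), ("cherry", 2)]
# "apple" still comes before "cherry" among the items with key 2.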
---
17. How does the Merge Sort algorithm use divide and conquer?
Merge Sort divides the array into halves, recursively sorts each half, and merges the sorted halves back together.
---
Greedy Algorithms
---
Dynamic Programming
27. How does dynamic programming differ from divide and conquer?
Dynamic programming solves overlapping subproblems and stores solutions to avoid redundant
work, while divide and conquer solves independent subproblems.
30. How does the Longest Common Subsequence (LCS) algorithm work?
The LCS algorithm uses dynamic programming to find the longest subsequence common to two
sequences by filling a DP table based on matching characters.
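A small Python sketch of the table-filling idea, returning only the length of the LCS (the function name is illustrative):

def lcs_length(a, b):
    m, n = len(a), len(b)
    # dp[i][j] = length of the LCS of a[:i] and b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1             # matching characters extend the LCS
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])  # otherwise skip a character
    return dp[m][n]

# Usage: lcs_length("ABCBDAB", "BDCAB") returns 4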
---
Graph Algorithms
Backtracking
---
NP-Completeness
---
Approximation Algorithms
---
Miscellaneous Topics
---
This collection covers the fundamental topics in the Design and Analysis of Algorithms, with key
questions and answers for quick study.