10211CS202-Design and Analysis of Algorithms-Scheme of Evaluation
2. Time complexity: Time complexity measures the amount of time an algorithm takes to run as
a function of input size.
Space complexity: Space complexity measures the amount of memory or space an algorithm
uses as a function of input size.
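As an illustration (not part of the original scheme), a minimal Python sketch of analysing both measures for a simple loop:

```python
def total(values):
    """Sum a list of numbers.

    Time complexity:  O(n) -- the loop body runs once per element.
    Space complexity: O(1) -- only one accumulator, regardless of input size.
    """
    s = 0
    for v in values:  # executes n times for an input of size n
        s += v
    return s
```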
3. Divide: Break the problem into smaller subproblems of the same type.
Conquer: Solve the subproblems recursively. If subproblems are small enough, solve
them directly.
Combine: Merge the solutions of the subproblems to get the final result.
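The three steps above can be sketched with merge sort, the classic divide-and-conquer example (a minimal Python version added here for illustration):

```python
def merge_sort(a):
    # Divide: split the list into two halves.
    if len(a) <= 1:              # small enough -> solve directly
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])   # Conquer: sort each half recursively.
    right = merge_sort(a[mid:])
    # Combine: merge the two sorted halves into one sorted list.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```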
5. Dynamic Programming
Definition:
Dynamic Programming (DP) is a technique for solving complex problems by breaking them
down into simpler overlapping subproblems, solving each subproblem only once, and storing
their solutions – usually using memoization or tabulation.
Key Features:
Optimal substructure
Overlapping subproblems
Example Problems:
Fibonacci series, 0/1 Knapsack
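For illustration, a minimal memoized Fibonacci in Python, showing how each overlapping subproblem is solved only once and its answer stored:

```python
def fib(n, memo=None):
    """Fibonacci via memoization: each fib(k) is computed at most once."""
    if memo is None:
        memo = {}
    if n <= 1:                  # base cases: fib(0) = 0, fib(1) = 1
        return n
    if n not in memo:           # solve the subproblem only on first request
        memo[n] = fib(n - 1, memo) + fib(n - 2, memo)
    return memo[n]              # otherwise reuse the stored solution
```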
6. Bi-connected Component
Definition:
A bi-connected component (also called a 2-connected component) of an undirected graph is a
maximal set of edges such that any two vertices in the set remain connected after removing any
one vertex (no articulation point in the component).
Example:
Graph with vertices A, B, C, D and edges:
(A–B), (B–C), (C–A), (B–D)
Here B is an articulation point, so the graph splits into two bi-connected components: {A–B, B–C, C–A} and {B–D}.
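A brute-force way to locate articulation points (and hence where bi-connected components split) is to remove each vertex in turn and test connectivity; a small Python sketch added for illustration, run on the example graph above:

```python
def articulation_points(vertices, edges):
    """Brute force: v is an articulation point if deleting v
    disconnects the remaining vertices."""
    adj = {u: set() for u in vertices}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)

    def connected(verts):
        """Is the induced subgraph on `verts` connected?"""
        verts = set(verts)
        if not verts:
            return True
        seen, stack = set(), [next(iter(verts))]
        while stack:
            u = stack.pop()
            if u in seen:
                continue
            seen.add(u)
            stack.extend((adj[u] & verts) - seen)
        return seen == verts

    return {v for v in vertices if not connected(set(vertices) - {v})}
```

On the example graph, removing B isolates D, so B is the only articulation point.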
7: Define Backtracking
Definition:
Backtracking is a general algorithmic technique that searches through all possible candidate
solutions of a problem and backtracks from a candidate as soon as it is determined that it
cannot lead to a valid solution.
Key Idea: Build a solution incrementally, and backtrack when a constraint is violated.
Applications: N-Queens, Graph Coloring, Sudoku Solver, Hamiltonian Cycle, Subset Sum.
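As an illustration of the key idea, a minimal backtracking solution to Subset Sum (one of the applications listed above), added here in Python; it assumes the input numbers are non-negative:

```python
def subset_sum(nums, target, i=0, current=0):
    """Does some subset of nums sum to target?
    Builds a partial sum incrementally and backtracks on violation.
    Assumes all numbers in nums are non-negative."""
    if current == target:                    # valid complete solution found
        return True
    if i == len(nums) or current > target:   # constraint violated -> backtrack
        return False
    # Extend the partial solution by including nums[i], or by skipping it.
    return (subset_sum(nums, target, i + 1, current + nums[i]) or
            subset_sum(nums, target, i + 1, current))
```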
Part B
11(a): Explain the Asymptotic Notations Big-O (O), Big-Omega (Ω), and Theta (θ) with
Example
1. Big-O Notation (O) – Upper Bound
Describes the worst-case time complexity.
It gives the maximum amount of time an algorithm can take.
Definition:
A function f(n) is O(g(n)) if there exist constants c > 0 and n₀ ≥ 0 such that:
f(n) ≤ c * g(n) for all n ≥ n₀.
Example:
For f(n) = 3n + 5, it is O(n) because 3n + 5 ≤ 4n for all n ≥ 5.
2. Big-Omega Notation (Ω) – Lower Bound
Describes the best-case time complexity.
It provides the minimum time the algorithm takes.
Definition:
A function f(n) is Ω(g(n)) if there exist constants c > 0 and n₀ ≥ 0 such that:
f(n) ≥ c * g(n) for all n ≥ n₀.
Example:
For f(n) = 3n + 5, it is Ω(n) because 3n + 5 ≥ 3n for all n.
3. Theta Notation (θ) – Tight Bound
Describes a tight bound: f(n) grows at the same rate as g(n).
It gives both the upper and the lower bound simultaneously.
Definition:
A function f(n) is θ(g(n)) if there exist constants c₁, c₂ > 0 and n₀ ≥ 0 such that:
c₁ * g(n) ≤ f(n) ≤ c₂ * g(n) for all n ≥ n₀.
Example:
For f(n) = 3n + 5, it is θ(n) because it's bounded both above and below by linear
functions.
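The witness constants used in the three examples above can be checked numerically; a small Python sketch (added for illustration) verifies them for f(n) = 3n + 5 over a range of n:

```python
def f(n):
    return 3 * n + 5

# Big-O witness:    c = 4, n0 = 5  ->  f(n) <= 4n for all n >= 5
assert all(f(n) <= 4 * n for n in range(5, 1000))

# Big-Omega witness: c = 3, n0 = 0 ->  f(n) >= 3n for all n >= 0
assert all(f(n) >= 3 * n for n in range(0, 1000))

# Theta combines both: 3n <= f(n) <= 4n for n >= 5, so f(n) = Theta(n).
```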
11(b): Solve the Recurrence Relation
Given:
Using the Recursion Tree / Iteration Method:
Expand the recurrence by substituting the relation into itself repeatedly. After k expansions a general pattern emerges; choosing k so that the subproblem size reaches the base case, and substituting back, gives the closed-form solution.
Exhaustive Search (also called brute force) is a method of solving problems by checking
all possible candidates for the solution and selecting the best one.
Used for optimization and decision problems.
Example Applications: Traveling Salesman Problem (TSP), Knapsack Problem, etc.
Advantages: Guarantees finding the optimal solution.
Disadvantages: Highly inefficient for large inputs due to exponential time complexity.
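For illustration, a minimal exhaustive-search solution to TSP in Python: it tries every one of the (n−1)! tours starting from city 0, which shows both the optimality guarantee and the exponential cost mentioned above.

```python
from itertools import permutations

def tsp_brute_force(dist):
    """Exhaustive search: try every tour starting at city 0, keep the cheapest.
    dist is an n x n matrix of pairwise distances."""
    n = len(dist)
    best = float("inf")
    for perm in permutations(range(1, n)):        # all (n-1)! orderings
        tour = (0,) + perm + (0,)                 # return to the start city
        cost = sum(dist[tour[i]][tour[i + 1]] for i in range(n))
        best = min(best, cost)                    # keep the best candidate seen
    return best
```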
A Hamiltonian cycle is a cycle in a graph that visits each vertex exactly once and returns to the
starting vertex.
Algorithm (Backtracking):
Algorithm HamiltonianCycle(Graph G, Path[], pos):
1. if pos == V:
       return True if there is an edge from Path[pos-1] to Path[0], else False
2. for each vertex v in G:
       if v is adjacent to Path[pos-1] and v is not already in Path:
           Path[pos] = v
           if HamiltonianCycle(G, Path, pos + 1):
               return True
           Path[pos] = -1   // Backtrack
3. return False
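The pseudocode above can be sketched as runnable Python (added for illustration, using an adjacency matrix and fixing vertex 0 as the start):

```python
def hamiltonian_cycle(adj):
    """adj: n x n adjacency matrix (0/1).
    Returns a Hamiltonian cycle as a vertex list, or None if none exists."""
    n = len(adj)
    path = [0] + [-1] * (n - 1)        # fix the start vertex at 0

    def solve(pos):
        if pos == n:                   # all vertices placed:
            return adj[path[-1]][path[0]] == 1   # need an edge back to start
        for v in range(1, n):
            if adj[path[pos - 1]][v] and v not in path:
                path[pos] = v
                if solve(pos + 1):
                    return True
                path[pos] = -1         # backtrack
        return False

    return path + [0] if solve(1) else None
```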
N-Queens Backtracking Algorithm:
1. Start from row 0.
2. Try placing a queen in each column of the current row.
3. For each placement, check if it’s safe (no conflicts).
4. If safe, recursively place queen in the next row.
5. If no safe column found, backtrack to the previous row.
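The five steps above can be sketched as a short Python solver (added for illustration; it returns one valid placement as a list where `queens[r]` is the column of the queen in row r):

```python
def solve_n_queens(n):
    """Return one valid N-Queens placement, or None if none exists."""
    queens = []                       # queens[r] = column of queen in row r

    def safe(col):
        """Step 3: check column and diagonal conflicts with earlier rows."""
        r = len(queens)
        return all(c != col and abs(c - col) != r - i
                   for i, c in enumerate(queens))

    def place(row):
        if row == n:                  # all rows filled -> solution found
            return True
        for col in range(n):          # step 2: try each column of this row
            if safe(col):
                queens.append(col)    # step 4: recurse into the next row
                if place(row + 1):
                    return True
                queens.pop()          # step 5: backtrack to the previous row
        return False

    return queens if place(0) else None
```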
15(b): Explain in detail the Maximum Matching in Bipartite Graph with Example
Definition:
A matching in a bipartite graph is a set of edges such that no two edges share a common vertex.
A maximum matching is a matching that contains the largest possible number of edges.
Approach:
Use Hungarian Algorithm (for weighted matching) or Hopcroft–Karp Algorithm (for
unweighted).
In unweighted graphs, a matching is maximum if no augmenting path can be found.
Example:
Let the bipartite graph have:
Set U = {u1, u2, u3}
Set V = {v1, v2, v3}
Edges:
u1 – v1
u1 – v2
u2 – v2
u3 – v3
A maximum matching here is {u1–v1, u2–v2, u3–v3}, of size 3.
Applications:
Job assignment
Stable marriages
Network flows
Often solved using Ford–Fulkerson algorithm by converting the bipartite graph into a flow
network.
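For illustration, a minimal augmenting-path implementation (Kuhn's algorithm) in Python, run on the example graph above; each left vertex searches for a free right vertex, re-matching earlier choices along an augmenting path when necessary:

```python
def max_bipartite_matching(graph, n_left, n_right):
    """graph: dict mapping each left vertex (0..n_left-1) to a list of its
    right neighbours (0..n_right-1). Returns the size of a maximum matching."""
    match_right = [-1] * n_right      # match_right[v] = left vertex matched to v

    def augment(u, visited):
        """Try to find an augmenting path starting at left vertex u."""
        for v in graph.get(u, []):
            if v not in visited:
                visited.add(v)
                # v is free, or its current partner can be re-matched elsewhere
                if match_right[v] == -1 or augment(match_right[v], visited):
                    match_right[v] = u
                    return True
        return False

    # The matching is maximum once no augmenting path remains.
    return sum(augment(u, set()) for u in range(n_left))
```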