
TOPIC: Master Theorem

SUBJECT: Design And Analysis of Algorithms

Submitted By: Shubh Agarwal
Q-ID: 22030295
Submitted To: Mr. Sagar Chaudhary
Master Theorem for Analyzing Algorithm Efficiency
Master Theorem is a powerful tool for analyzing the
time complexity of recursive algorithms, particularly
those following the "Divide and Conquer" paradigm. It
provides a simple way to determine the asymptotic
growth rate of such algorithms.
What is Master Theorem?
The Master Theorem is a mathematical formula used to solve recurrence relations that arise in the analysis of divide-and-conquer algorithms. It provides a way to express the time complexity of a recursive algorithm in terms of its parameters. It helps determine the time complexity of recursive algorithms based on the relationships between the size of the problem, the number of subproblems, and the work done in combining the subproblem solutions.
Divide and Conquer Algorithms

1. Divide: Break down a problem into smaller subproblems of the same type.

2. Conquer: Solve the subproblems recursively until they are simple enough to solve directly.

3. Combine: Combine the solutions to the subproblems to produce a solution to the original problem.
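The three steps above can be sketched in code. This is a minimal illustrative example, not from the slides: finding the maximum of a list is chosen purely as a hypothetical problem to show the divide/conquer/combine shape.

```python
def dc_max(values):
    """Find the maximum of a non-empty list via divide and conquer."""
    # Base case: a single element is simple enough to solve directly.
    if len(values) == 1:
        return values[0]
    # Divide: break the problem into two subproblems of the same type.
    mid = len(values) // 2
    # Conquer: solve each subproblem recursively.
    left_max = dc_max(values[:mid])
    right_max = dc_max(values[mid:])
    # Combine: merge the subproblem solutions into the final answer.
    return max(left_max, right_max)
```

Here a = 2 (two subproblems), b = 2 (each half the size), and the combine step is constant time, which is exactly the shape the recurrence in the next slide describes.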
Recurrence Relations

The running time of a divide-and-conquer algorithm is described by a recurrence of the form T(n) = aT(n/b) + f(n), where:

1. T(n): the total time taken to solve a problem of size n.

2. aT(n/b): the time taken to solve the a subproblems, each of size n/b.

3. f(n): the time taken to divide the problem and combine the subproblem solutions.
Three Cases of Master Theorem
Case 1: Work Dominates

The cost of dividing the problem and combining subproblem solutions (f(n)) grows significantly faster than the cost of solving the subproblems (aT(n/b)). This means that the work done outside of the recursion dominates the overall runtime.

Case 2: Balanced Costs

The cost of dividing the problem and combining solutions (f(n)) grows at a similar rate to the cost of solving the subproblems (aT(n/b)). In this case, the costs are balanced, and the overall runtime is a combination of both.

Case 3: Recursion Dominates

The cost of solving the subproblems (aT(n/b)) grows significantly faster than the cost of dividing and combining (f(n)). Here, the recursive calls dominate the overall runtime.
Case 1: a < b^c (Work Dominates)
In this case (with f(n) = Θ(n^c)), the time complexity is dominated by the cost of dividing the problem and combining the solutions of the subproblems, represented by the function f(n). This can be visualized as a puzzle where assembling the final pieces takes significantly longer than dividing the puzzle into sections or solving individual parts.

T(n) = Θ(n^c)
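A hypothetical illustration of Case 1 (not from the slides): counting out-of-order pairs across the two halves with a deliberately brute-force combine step. The recurrence is T(n) = 2T(n/2) + Θ(n²), so a = 2, b = 2, c = 2; since a < b^c (2 < 4), the quadratic combine work dominates and T(n) = Θ(n²).

```python
def count_inversions_bruteforce(a):
    """Count pairs (i, j) with i < j and a[i] > a[j], via divide and conquer
    with a quadratic combine step: T(n) = 2T(n/2) + Theta(n^2)."""
    if len(a) <= 1:
        return 0
    mid = len(a) // 2
    left, right = a[:mid], a[mid:]
    # Conquer: count inversions inside each half recursively.
    total = count_inversions_bruteforce(left) + count_inversions_bruteforce(right)
    # Combine: brute-force check of every cross pair -- Theta(n^2) work
    # that dominates the overall running time (Case 1).
    for x in left:
        for y in right:
            if x > y:
                total += 1
    return total
```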
Case 2: a = b^c (Balanced Costs)

T(n) = Θ(n^c log n)

Here, the time it takes to solve the problem is roughly n^c log n. This happens when the effort to split the problem and put the solutions back together is similar to the effort to solve the smaller parts. Think of merge sort: you split the list in half, solve each half, then merge them back together. The merging takes about as much time as sorting the halves, resulting in a total time of n log n (because c = 1 in this case).
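The merge sort just described can be sketched as follows (a minimal version for illustration, not a slide from the original deck). The two recursive calls give a = 2 and b = 2, and the linear merge loop is the f(n) = Θ(n) term, so a = b^c and Case 2 yields Θ(n log n).

```python
def merge_sort(a):
    """Sort a list; recurrence T(n) = 2T(n/2) + Theta(n) -> Theta(n log n)."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])    # conquer: sort the left half
    right = merge_sort(a[mid:])   # conquer: sort the right half
    # Combine: merge two sorted halves in linear time -- the Theta(n) term.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```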
Case 3: a > b^c (Recursion Dominates)
In this case, the time complexity is dominated by the cost of the recursive calls, represented by aT(n/b). This occurs when the work done outside the recursion is relatively insignificant compared to the work within the recursive calls. An example is visiting every node of a balanced structure with T(n) = 2T(n/2) + Θ(1): here a = 2, b = 2, c = 0, so a > b^c and the recursive calls dominate, giving Θ(n^(log_2 2)) = Θ(n). (Binary search, by contrast, has T(n) = T(n/2) + Θ(1), so a = 1, b = 2, c = 0 and a = b^c; it actually falls under Case 2, giving Θ(log n).)

T(n) = Θ(n^(log_b a))
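The three cases can be collected into one small helper, shown here as a sketch (this function and its name are not from the slides, and the case numbering follows this deck's convention of Case 1 = work dominates, which differs from some textbooks). It classifies a recurrence T(n) = aT(n/b) + Θ(n^c):

```python
import math

def master_case(a, b, c):
    """Return which case of the Master Theorem applies to
    T(n) = a*T(n/b) + Theta(n^c), with the resulting bound.
    Case numbering follows this document: 1 = work dominates,
    2 = balanced, 3 = recursion dominates."""
    log_b_a = math.log(a, b)  # the critical exponent log_b(a)
    if math.isclose(log_b_a, c):
        # a = b^c: split/combine cost matches recursive cost.
        return "Case 2: Theta(n^c log n)"
    if log_b_a < c:
        # a < b^c: the f(n) = Theta(n^c) combine work dominates.
        return "Case 1: Theta(n^c)"
    # a > b^c: the recursive calls dominate.
    return "Case 3: Theta(n^(log_b a))"
```

For example, merge sort's recurrence gives `master_case(2, 2, 1)`, which lands in Case 2.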
Example of Master Theorem:-
• Consider merge sort, which sorts a list by splitting it in half, sorting each half recursively, and merging the sorted halves.
• The recurrence relation for merge sort is: T(n) = 2T(n/2) + Θ(n), where the Θ(n) term represents the linear time to merge the two sorted halves.

• Applying the Master Theorem, we have: a = 2, b = 2, c = 1. Since a = b^c, this falls under Case 2, giving a time complexity of Θ(n log n).
• Note that the Master Theorem only applies to recurrences of the form T(n) = aT(n/b) + f(n). A recurrence such as naive recursive Fibonacci, T(n) = T(n-1) + T(n-2) + Θ(1), does not fit this form, so the theorem cannot be applied to it.
• The Master Theorem provides an efficient and accurate way to determine the time complexity of divide-and-conquer algorithms like merge sort, without having to solve complex recurrence relations.

• This example showcases the power and versatility of the Master Theorem in algorithmic analysis.
Advantages of Master Theorem

1. Simplicity: The Master Theorem provides a simple way to analyze the time complexity of divide-and-conquer algorithms without having to solve recurrence relations.

2. Accuracy: It gives accurate and tight bounds on the time complexity of algorithms, allowing for precise analysis of performance.

3. Efficiency: By using the theorem, you can quickly determine the time complexity without needing to go through a lengthy and complex analysis.
Conclusion and Key Takeaways
The Master Theorem stands as a powerful tool for analyzing
the time complexity of divide-and-conquer algorithms. It
offers a simple, accurate, and efficient method to determine
the asymptotic growth rate of these algorithms. By
understanding the three cases of the Master Theorem, you
can swiftly assess the time complexity of various algorithms,
enabling informed decisions about their efficiency and
suitability for specific applications.
THANK YOU
