DAA Assignment 1
ANS.
A recurrence relation combines these factors and expresses how the total work grows as the size of the input increases.
Consider the binary search algorithm. It works by repeatedly dividing the problem size in half until it finds the target. If the problem size is n, the recurrence relation for binary search is:
T(n) = T(n/2) + O(1)
Where:
T(n/2) represents the time for the recursive call on the smaller problem of size n/2.
O(1) represents the constant time spent at each step (like comparing the middle element).
By solving this recurrence, we can see that the time complexity of binary search is O(log n), meaning it takes logarithmic time as the input size grows.
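To connect the recurrence to actual code, here is a minimal recursive binary search sketch in Python (the function name and list-based interface are illustrative, not part of the assignment). Each call does constant work and then recurses on half of the range, which is exactly the T(n) = T(n/2) + O(1) pattern:

```python
def binary_search(arr, target, lo=0, hi=None):
    """Return the index of target in the sorted list arr, or -1 if absent."""
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:                      # empty range: target not present
        return -1
    mid = (lo + hi) // 2             # O(1) work: pick and compare the middle element
    if arr[mid] == target:
        return mid
    elif arr[mid] < target:
        return binary_search(arr, target, mid + 1, hi)   # recurse on right half (~n/2 elements)
    else:
        return binary_search(arr, target, lo, mid - 1)   # recurse on left half (~n/2 elements)

# Searching a sorted list of 8 elements takes at most about log2(8) = 3 halving steps.
print(binary_search([1, 3, 5, 7, 9, 11, 13, 15], 11))    # prints 5
```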
To analyze recursive algorithms, we need to solve the recurrence relation to find the overall
time complexity. Some common methods for solving recurrences are:
1. Substitution Method: In this method, we make a guess about the solution and then prove it by induction (a brief worked sketch follows after this list).
2. Master Theorem: This method gives a straightforward way to solve certain types of recurrences by comparing their parameters.
3. Recursion Tree: A visual approach where we break down the recursive calls and calculate the total work done at each level of recursion.
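To illustrate the substitution method (this worked sketch is added as an illustration and is not part of the original answer), take the binary search recurrence T(n) = T(n/2) + c and guess that T(n) <= c * log_2 n + c. Assuming the guess holds for n/2, the inductive step is:

```latex
\begin{aligned}
T(n) &= T(n/2) + c \\
     &\le \bigl(c\log_2(n/2) + c\bigr) + c && \text{(inductive hypothesis)} \\
     &= c(\log_2 n - 1) + 2c \\
     &= c\log_2 n + c,
\end{aligned}
```

which matches the guess, so T(n) = O(log n), agreeing with the result obtained above.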
Let’s look at merge sort, another recursive algorithm. Merge sort divides the problem into two halves, recursively sorts each half, and then combines the results. The recurrence relation for merge sort is:
T(n) = 2T(n/2) + O(n)
Where:
2T(n/2) means the algorithm makes two recursive calls to sort the two halves.
O(n) represents the linear time required to merge the two halves.
Using methods like the Master Theorem or a recursion tree, we solve this recurrence and find that the time complexity of merge sort is O(n log n).
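A minimal merge sort sketch in Python (the function name is illustrative) makes the recurrence visible: two recursive calls on halves of the input, plus a linear-time merge.

```python
def merge_sort(arr):
    """Return a sorted copy of arr; the structure mirrors T(n) = 2T(n/2) + O(n)."""
    if len(arr) <= 1:                # base case: a list of 0 or 1 items is already sorted
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])     # first recursive call on a half of size ~n/2
    right = merge_sort(arr[mid:])    # second recursive call on the other half
    # Merge step: O(n) work to combine the two sorted halves
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7]))   # prints [1, 2, 5, 7, 9]
```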
Master Theorem
The Master Theorem is used to analyze the time complexity of divide-and-conquer recurrences and applies to recurrences of the form:
T(n) = aT(n/b) + O(n^d)
Where:
a = number of subproblems,
b = factor by which the problem size is reduced,
d = exponent of the cost of the work done outside the recursive calls (that work is O(n^d)).
1. Case 1: If a > b^d, then the time complexity is: T(n) = O(n^(log_b a))
2. Case 2: If a = b^d, then the time complexity is: T(n) = O(n^d * log n)
3. Case 3: If a < b^d, then the time complexity is: T(n) = O(n^d)
Given Recurrence
T(n) = 9T(n/3) + n^2
Here:
a = 9
b = 3
d = 2
log_b a = log_3 9 = 2
Now, we compare log_b a = 2 with d = 2. Since log_b a = d (equivalently, a = b^d), we are in Case 2 of the Master Theorem.
Conclusion
Thus, the asymptotic time complexity of the recurrence is O(n^2 * log n).
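As a quick sanity check (this helper is an added illustration, not part of the assignment), a few lines of Python can classify a recurrence of the form T(n) = a*T(n/b) + O(n^d) into the three cases above and confirm the result for a = 9, b = 3, d = 2:

```python
import math

def master_case(a, b, d):
    """Classify T(n) = a*T(n/b) + O(n^d) into one of the three Master Theorem cases."""
    log_b_a = math.log(a, b)
    if math.isclose(log_b_a, d):          # Case 2: a == b^d
        return f"Case 2: T(n) = O(n^{d} * log n)"
    if log_b_a > d:                       # Case 1: a > b^d
        return f"Case 1: T(n) = O(n^{log_b_a:g})"
    return f"Case 3: T(n) = O(n^{d})"     # Case 3: a < b^d

print(master_case(9, 3, 2))   # Case 2: T(n) = O(n^2 * log n)
```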
Strassen's Formula:
For 2×2 matrices, Strassen's algorithm computes seven products and combines them:
P1 = A11 × (B12 − B22)
P2 = (A11 + A12) × B22
P3 = (A21 + A22) × B11
P4 = A22 × (B21 − B11)
P5 = (A11 + A22) × (B11 + B22)
P6 = (A12 − A22) × (B21 + B22)
P7 = (A11 − A21) × (B11 + B12)
C11 = P5 + P4 − P2 + P6
C12 = P1 + P2
C21 = P3 + P4
C22 = P5 + P1 − P3 − P7
Where:
A11, A12, A21, A22 and B11, B12, B21, B22 are the entries of the input matrices A and B, and C11, C12, C21, C22 are the entries of the product C = A × B.
Let's multiply two 2×2 matrices using Strassen's Algorithm.
Given Matrices:
Let:
A = [ 1 2 ]
    [ 3 4 ]
B = [ 5 6 ]
    [ 7 8 ]
Step-by-step Calculation:
1. Compute P1:
P1 = A11 × (B12 − B22) = 1 × (6 − 8) = −2
2. Compute P2:
P2 = (A11 + A12) × B22 = (1 + 2) × 8 = 24
3. Compute P3:
P3 = (A21 + A22) × B11 = (3 + 4) × 5 = 35
4. Compute P4:
P4 = A22 × (B21 − B11) = 4 × (7 − 5) = 8
5. Compute P5:
P5 = (A11 + A22) × (B11 + B22) = (1 + 4) × (5 + 8) = 5 × 13 = 65
6. Compute P6:
P6 = (A12 − A22) × (B21 + B22) = (2 − 4) × (7 + 8) = −30
7. Compute P7:
P7 = (A11 − A21) × (B11 + B12) = (1 − 3) × (5 + 6) = −22
Combining the Products:
1. Compute C11:
C11 = P5 + P4 − P2 + P6 = 65 + 8 − 24 + (−30) = 19
2. Compute C12:
C12 = P1 + P2 = −2 + 24 = 22
3. Compute C21:
C21 = P3 + P4 = 35 + 8 = 43
4. Compute C22:
C22 = P5 + P1 − P3 − P7 = 65 + (−2) − 35 − (−22) = 50
Thus, the product matrix is:
C = [ 19 22 ]
    [ 43 50 ]
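A short Python sketch (the function name is illustrative) reproduces the same seven products and the final matrix, which is a handy way to verify the hand calculation above:

```python
def strassen_2x2(A, B):
    """Multiply two 2x2 matrices using Strassen's seven products."""
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B
    p1 = a11 * (b12 - b22)
    p2 = (a11 + a12) * b22
    p3 = (a21 + a22) * b11
    p4 = a22 * (b21 - b11)
    p5 = (a11 + a22) * (b11 + b22)
    p6 = (a12 - a22) * (b21 + b22)
    p7 = (a11 - a21) * (b11 + b12)
    # Combine the seven products into the four entries of C
    c11 = p5 + p4 - p2 + p6
    c12 = p1 + p2
    c21 = p3 + p4
    c22 = p5 + p1 - p3 - p7
    return [[c11, c12], [c21, c22]]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(strassen_2x2(A, B))   # prints [[19, 22], [43, 50]]
```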