7 - Recurrences, Back Substitution Method (04-05-2023)

The document discusses time complexity in recursive algorithms, focusing on solving recurrences to determine running times. It provides examples of common recurrences and their solutions, including methods such as substitution, iteration, recursion trees, and the Master method. Additionally, it analyzes the binary search algorithm and illustrates the application of these methods to various recurrence relations.

Time Complexity

Recursive algorithms
Recurrences and Running Time
• A recurrence is an equation or inequality that describes a function in terms of
its value on smaller inputs, e.g. T(n) = T(n-1) + n
• Recurrences arise when an algorithm contains recursive calls to itself

• What is the actual running time of the algorithm?


• To answer this, we need to solve the recurrence
– Find an explicit (closed-form) formula for T(n), or
– Bound T(n) by an expression that involves n
Example Recurrences
• T(n) = T(n-1) + n      Θ(n^2)
– Recursive algorithm that loops through the input to eliminate one item
• T(n) = T(n/2) + c      Θ(log n)
– Recursive algorithm that halves the input in one step
• T(n) = T(n/2) + n      Θ(n)
– Recursive algorithm that halves the input but must examine every item in the input
• T(n) = 2T(n/2) + 1     Θ(n)
– Recursive algorithm that splits the input into 2 halves
and does a constant amount of other work
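To make the first two patterns concrete, here is a minimal illustrative sketch (not part of the original slides; the function names are invented for this example) of Python functions whose running times follow the corresponding recurrences:

    def eliminate_one(items):              # T(n) = T(n-1) + n
        if not items:
            return 0
        return sum(items) + eliminate_one(items[:-1])   # n work, then recurse on n-1 items

    def halve(n):                          # T(n) = T(n/2) + c
        if n <= 1:
            return 0
        return 1 + halve(n // 2)           # constant work, then recurse on n/2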

Recursive Algorithms
BINARY-SEARCH
• for a sorted array A, determines whether x occurs in the subarray A[lo…hi]

Alg.: BINARY-SEARCH (A, lo, hi, x)
  if (lo > hi)
    return FALSE
  mid ← ⌊(lo+hi)/2⌋
  if x = A[mid]
    return TRUE
  if ( x < A[mid] )
    return BINARY-SEARCH (A, lo, mid-1, x)
  if ( x > A[mid] )
    return BINARY-SEARCH (A, mid+1, hi, x)

[Figure: a sorted array 2 3 5 7 9 10 11 12 with indices 1–8; lo and hi mark the ends of the current subarray and mid its middle element.]
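For reference, a runnable Python translation is sketched below (illustrative only, using 0-based indexing rather than the 1-based indexing of the slides):

    def binary_search(A, lo, hi, x):
        # Return True if x occurs in the sorted subarray A[lo..hi]
        if lo > hi:
            return False
        mid = (lo + hi) // 2
        if x == A[mid]:
            return True
        if x < A[mid]:
            return binary_search(A, lo, mid - 1, x)
        return binary_search(A, mid + 1, hi, x)

    # Example: binary_search([1, 2, 3, 4, 5, 7, 9, 11], 0, 7, 7)  ->  True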
Example
• A[1..8] = {1, 2, 3, 4, 5, 7, 9, 11}, lo = 1, hi = 8, x = 7
– mid = 4, A[4] = 4 < 7 ⇒ search the right half: lo = 5, hi = 8
– mid = 6, A[6] = 7 = x ⇒ Found!

Another Example
• A[1..8] = {1, 2, 3, 4, 5, 7, 9, 11}, lo = 1, hi = 8, x = 6
– mid = 4, A[4] = 4 < 6 ⇒ lo = 5, hi = 8
– mid = 6, A[6] = 7 > 6 ⇒ lo = 5, hi = 5
– mid = 5, A[5] = 5 < 6 ⇒ lo = 6, hi = 5
– lo > hi ⇒ NOT FOUND!

Analysis of BINARY-SEARCH
Alg.: BINARY-SEARCH (A, lo, hi, x)
  if (lo > hi)                                   constant time: c1
    return FALSE
  mid ← ⌊(lo+hi)/2⌋                              constant time: c2
  if x = A[mid]                                  constant time: c3
    return TRUE
  if ( x < A[mid] )
    return BINARY-SEARCH (A, lo, mid-1, x)       same problem of size n/2
  if ( x > A[mid] )
    return BINARY-SEARCH (A, mid+1, hi, x)       same problem of size n/2

• T(n) = c +T(n/2)
– T(n) – running time for an array of size n
Methods for Solving Recurrences

• Iteration method

• Substitution method

• Recursion tree method

• Master method

The Back Substitution Method
• Convert the recurrence into a summation and
try to bound it using known series
– Iterate the recurrence until the initial condition is
reached.
– Use back-substitution to express the recurrence in
terms of n and the initial (boundary) condition.

T(n) = c + T(n/2) for n > 1;   T(1) for n = 1

T(n) = c + T(n/2)                    T(n/2) = c + T(n/4)
     = c + c + T(n/4)                T(n/4) = c + T(n/8)
     = c + c + c + T(n/8)
     …
Assume n = 2^k, so k = log n:
T(n) = c + c + … + c + T(1)          (k terms of c)
     = c·log n + T(1)
     = O(log n)
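As a quick sanity check (illustrative only, not from the original slides), the recurrence can be unrolled numerically with c = 1 and T(1) = 1 and compared against log2(n) + 1:

    import math

    def T(n):                     # T(n) = c + T(n/2), with c = 1 and T(1) = 1
        return 1 if n == 1 else 1 + T(n // 2)

    for n in [2, 16, 1024, 2**20]:
        print(n, T(n), math.log2(n) + 1)   # the two values match for n = 2^k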

T(n) = n + 2T(n/2)    (assume n = 2^k)

T(n) = n + 2T(n/2)                   T(n/2) = n/2 + 2T(n/4)
     = n + 2(n/2 + 2T(n/4))
     = n + n + 4T(n/4)
     = n + n + 4(n/4 + 2T(n/8))
     = n + n + n + 8T(n/8)
     … = i·n + 2^i·T(n/2^i)
     = k·n + 2^k·T(1)
     = n·log n + n·T(1) = O(n log n)
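The same kind of numeric check (illustrative only) works here, assuming T(1) = 1:

    import math

    def T(n):                     # T(n) = n + 2T(n/2), with T(1) = 1
        return 1 if n == 1 else n + 2 * T(n // 2)

    for n in [2, 16, 1024]:
        print(n, T(n), n * math.log2(n) + n)   # T(n) = n·log2(n) + n for n = 2^k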
Practice
• T(n) = 1 + T(n-1)      O(n)
• T(n) = n + T(n-1)      O(n^2)
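For the second recurrence, back substitution gives T(n) = n + (n-1) + … + 1 + T(0) = n(n+1)/2 + T(0) = O(n^2). A tiny illustrative check (assuming T(0) = 0):

    def T(n):
        return 0 if n == 0 else n + T(n - 1)

    print(T(100), 100 * 101 // 2)   # both print 5050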

The substitution method

1. Guess a solution

2. Use induction to prove that the solution works

Substitution method
• Guess a solution
– T(n) = O(g(n))
– Induction goal: apply the definition of the asymptotic notation
• T(n) ≤ c·g(n), for some c > 0 and n ≥ n0   (strong induction)
– Induction hypothesis: T(k) ≤ c·g(k) for all k < n
• Prove the induction goal
– Use the induction hypothesis to find some values of the constants c and n0
for which the induction goal holds

Example: Binary Search
T(n) = c + T(n/2)
• Guess: T(n) = O(log n)
– Induction goal: T(n) ≤ d·log n, for some d and n ≥ n0
– Induction hypothesis: T(n/2) ≤ d·log(n/2)
• Proof of induction goal:
T(n) = T(n/2) + c ≤ d·log(n/2) + c
     = d·log n – d·log 2 + c ≤ d·log n
if: – d·log 2 + c ≤ 0, i.e., d ≥ c/log 2 (with log base 2, any d ≥ c works)
• Base case?
Example 2
T(n) = T(n-1) + n
• Guess: T(n) = O(n^2)
– Induction goal: T(n) ≤ c·n^2, for some c and n ≥ n0
– Induction hypothesis: T(k) ≤ c·k^2 for all k < n, in particular T(n-1) ≤ c·(n-1)^2

• Proof of induction goal:
T(n) = T(n-1) + n ≤ c·(n-1)^2 + n = c·(n^2 – 2n + 1) + n
     = c·n^2 – 2cn + c + n = c·n^2 – (2cn – c – n) ≤ c·n^2
if: 2cn – c – n ≥ 0  ⇒  c ≥ n/(2n-1)  ⇒  c ≥ 1/(2 – 1/n)
– For n ≥ 1, 2 – 1/n ≥ 1, so any c ≥ 1 will work
Example 3
T(n) = 2T(n/2) + n
• Guess: T(n) = O(n log n)
– Induction goal: T(n) ≤ c·n·log n, for some c and n ≥ n0
– Induction hypothesis: T(n/2) ≤ c·(n/2)·log(n/2)
• Proof of induction goal:
T(n) = 2T(n/2) + n ≤ 2c·(n/2)·log(n/2) + n
     = c·n·log n – c·n·log 2 + n ≤ c·n·log n
if: – c·n + n ≤ 0  ⇒  c ≥ 1   (taking log base 2, so log 2 = 1)
• Base case?
Practice
• T(n) = 3T(n/4) + c·n^2      O(n^2)

The recursion-tree method

Convert the recurrence into a tree:
– Each node represents the cost of a single subproblem somewhere in the recursion
– Sum the costs of the nodes within each level, then sum the per-level costs over all levels

Used to “guess” a solution for the recurrence, which can then be verified with the substitution method

Example 1
W(n) = 2W(n/2) + n^2

• Subproblem size at level i is: n/2^i
• Subproblem size hits 1 when 1 = n/2^i  ⇒  i = log n
• Cost per level: 2^i nodes, each costing (n/2^i)^2, i.e. n^2/2^i in total at level i
• Summing over the (log n + 1) levels gives a decreasing geometric series bounded by 2n^2
• W(n) = O(n^2)
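Written out, the level-by-level sum (a standard calculation, filled in here for completeness) is:

  W(n) = Σ_{i=0}^{log n – 1} 2^i·(n/2^i)^2 + 2^(log n)·W(1)
       = n^2 · Σ_{i=0}^{log n – 1} (1/2)^i + Θ(n)
       ≤ 2·n^2 + Θ(n) = O(n^2)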

E.g.: T(n) = 3T(n/4) + c·n^2

• Subproblem size at level i is: n/4^i
• Subproblem size hits 1 when 1 = n/4^i  ⇒  i = log4 n
• Cost of a node at level i = c·(n/4^i)^2
• Number of nodes at level i = 3^i  ⇒  last level has 3^(log4 n) = n^(log4 3) nodes
• Total cost:
  T(n) = Σ_{i=0}^{log4 n – 1} (3/16)^i·c·n^2 + Θ(n^(log4 3)) ≤ (16/13)·c·n^2 + o(n^2)
  ⇒ T(n) = O(n^2)
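An illustrative numeric check of this level-by-level sum (not part of the original slides), taking c = 1:

    # Sum the recursion-tree level costs of T(n) = 3T(n/4) + n^2
    def level_cost_sum(n):
        total, size, nodes = 0.0, float(n), 1
        while size >= 1:
            total += nodes * size ** 2    # 3^i nodes at level i, each costing (n/4^i)^2
            nodes *= 3
            size /= 4
        return total

    n = 4 ** 10
    print(level_cost_sum(n) / n ** 2)     # approaches 16/13 ≈ 1.23 from below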
Example 2 - Substitution
T(n) = 3T(n/4) + c·n^2
• Guess: T(n) = O(n^2)
– Induction goal: T(n) ≤ d·n^2, for some d and n ≥ n0
– Induction hypothesis: T(n/4) ≤ d·(n/4)^2
• Proof of induction goal:
T(n) = 3T(n/4) + c·n^2
     ≤ 3d·(n/4)^2 + c·n^2
     = (3/16)·d·n^2 + c·n^2
     ≤ d·n^2      if: d ≥ (16/13)·c

• Therefore: T(n) = O(n^2)

Example 3 (simpler proof)
W(n) = W(n/3) + W(2n/3) + n

• The longest path from the root to a leaf is:
  n → (2/3)·n → (2/3)^2·n → … → 1
• Subproblem size hits 1 when 1 = (2/3)^i·n  ⇒  i = log3/2 n
• Cost of the problem at each level ≤ n
• Total cost:
  W(n) ≤ n + n + … + n = n·(log3/2 n) = n·(lg n / lg(3/2)) = O(n lg n)
Example 3
W(n) = W(n/3) + W(2n/3) + n

• The longest path from the root to a leaf is:
  n → (2/3)·n → (2/3)^2·n → … → 1
• Subproblem size hits 1 when 1 = (2/3)^i·n  ⇒  i = log3/2 n
• Cost of the problem at each level ≤ n, and there are at most log3/2 n levels plus the leaves
• Total cost:
  W(n) ≤ Σ_{i=0}^{log3/2 n – 1} n + (number of leaves)·W(1)
       ≤ n·log3/2 n + O(n) = (n·lg n)/lg(3/2) + O(n) = O(n lg n)
  (The crude bound 2^(log3/2 n) = n^(log3/2 2) on the number of leaves would over-count,
  since the tree is not complete; the leaves in fact contribute only O(n). The guess
  O(n lg n) is then verified by substitution on the next slide.)

Example 3 - Substitution
W(n) = W(n/3) + W(2n/3) + O(n)
• Guess: W(n) = O(n lg n)
– Induction goal: W(n) ≤ d·n·lg n, for some d and n ≥ n0
– Induction hypothesis: W(k) ≤ d·k·lg k for any k < n (in particular for n/3 and 2n/3)
• Proof of induction goal:
Try it out as an exercise!!
• W(n) = O(n lg n)
Master’s method
• “Cookbook” for solving recurrences of the form:
  T(n) = a·T(n/b) + f(n)
  where a ≥ 1, b > 1, and f(n) > 0

Idea: compare f(n) with n^(log_b a)

• f(n) is asymptotically smaller or larger than n^(log_b a) by a polynomial factor n^ε, or
• f(n) is asymptotically equal to n^(log_b a)


Master’s method
• “Cookbook” for solving recurrences of the form:
  T(n) = a·T(n/b) + f(n)
  where a ≥ 1, b > 1, and f(n) > 0

Case 1: if f(n) = O(n^(log_b a – ε)) for some ε > 0, then: T(n) = Θ(n^(log_b a))

Case 2: if f(n) = Θ(n^(log_b a)), then: T(n) = Θ(n^(log_b a) · lg n)

Case 3: if f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and if
a·f(n/b) ≤ c·f(n) for some c < 1 and all sufficiently large n (the regularity condition),
then: T(n) = Θ(f(n))
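As an illustration only (not part of the original slides), the following sketch applies the three cases when the driving function is a pure polynomial f(n) = n^k; the function name and interface are invented for this example, and the regularity condition holds automatically for such f:

    import math

    def master_poly(a, b, k):
        # Asymptotic bound for T(n) = a*T(n/b) + n^k
        crit = math.log(a, b)                  # critical exponent log_b(a)
        if k < crit:
            return f"Theta(n^{crit:.3f})"      # Case 1: f grows polynomially slower
        if abs(k - crit) < 1e-12:
            return f"Theta(n^{k} * lg n)"      # Case 2: f matches n^(log_b a)
        return f"Theta(n^{k})"                 # Case 3: f dominates polynomially

    print(master_poly(2, 2, 1))   # T(n) = 2T(n/2) + n    ->  Theta(n^1 * lg n)
    print(master_poly(2, 2, 2))   # T(n) = 2T(n/2) + n^2  ->  Theta(n^2)
    print(master_poly(3, 4, 2))   # T(n) = 3T(n/4) + n^2  ->  Theta(n^2)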


Examples

T(n) = 2T(n/2) + n

a = 2, b = 2, log2 2 = 1

Compare n^(log2 2) = n with f(n) = n

⇒ f(n) = Θ(n)  ⇒  Case 2

⇒ T(n) = Θ(n lg n)
Examples
T(n) = 2T(n/2) + n^2
a = 2, b = 2, log2 2 = 1
Compare n with f(n) = n^2
⇒ f(n) = Ω(n^(1+ε))  ⇒  Case 3  ⇒  verify the regularity condition:

  a·f(n/b) ≤ c·f(n)  ⇔  2·(n^2/4) ≤ c·n^2  ⇒  c = 1/2 is a solution (c < 1)

⇒ T(n) = Θ(n^2)
Examples (cont.)

T(n) = 2T(n/2) + √n

a = 2, b = 2, log2 2 = 1

Compare n with f(n) = n^(1/2)

⇒ f(n) = O(n^(1–ε))  ⇒  Case 1

⇒ T(n) = Θ(n)
Examples
T(n) = 3T(n/4) + n·lg n
a = 3, b = 4, log4 3 ≈ 0.793
Compare n^0.793 with f(n) = n·lg n
⇒ f(n) = Ω(n^(log4 3 + ε))  ⇒  Case 3
Check the regularity condition:
  3·(n/4)·lg(n/4) ≤ (3/4)·n·lg n = c·f(n), with c = 3/4 < 1
⇒ T(n) = Θ(n·lg n)
Examples
T(n) = 2T(n/2) + n·lg n

a = 2, b = 2, log2 2 = 1

• Compare n with f(n) = n·lg n
– It seems like Case 3 should apply
• But f(n) must be polynomially larger than n^(log_b a), i.e., larger by a factor of n^ε for some ε > 0
• Here it is larger only by a factor of lg n, so the Master method does not apply
(the recurrence falls in the gap between Cases 2 and 3; other methods give T(n) = Θ(n·lg^2 n))
