Master Theorem Revision
Student’s Name
Institution
Course
Professor
Due Date
(a) Simplify the following expressions into a single logarithm (i.e. in the form log_a b):

(i) ln x / ln y

By the change-of-base formula, ln x / ln y = log_y x.

(ii) ln x + ln y

Since ln(x) = log_e x and ln(y) = log_e y,

ln x + ln y = log_e x + log_e y = log_e(xy) = ln(xy).

(iii) ln x − ln y

ln x − ln y = log_e x − log_e y = log_e(x/y) = ln(x/y).

(iv) 170 ln x

170 ln(x) = ln(x^170) = log_e(x^170).
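As a quick numerical sanity check (not part of the required simplifications), a few lines of Python confirm each identity at sample values:

```python
import math

# Spot-check the four logarithm identities at sample values x = 7, y = 3.
x, y = 7.0, 3.0

# (i) ln x / ln y = log_y x (change of base)
assert math.isclose(math.log(x) / math.log(y), math.log(x, y))

# (ii) ln x + ln y = ln(xy)
assert math.isclose(math.log(x) + math.log(y), math.log(x * y))

# (iii) ln x - ln y = ln(x/y)
assert math.isclose(math.log(x) - math.log(y), math.log(x / y))

# (iv) 170 ln x = ln(x^170)
assert math.isclose(170 * math.log(x), math.log(x ** 170))
```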
(b) ∑_{i=1}^{n} i = n(n+1)/2

Proof by induction on n.

Base case: for n = 1, ∑_{i=1}^{1} i = 1 = 1(1+1)/2, so the identity holds.

Inductive step: let the identity hold for some arbitrary positive integer a. That is,

∑_{i=1}^{a} i = a(a+1)/2.

Show that it also holds for a+1:

∑_{i=1}^{a+1} i = (a+1)(a+2)/2.

Simplifying starting from the left-hand side and applying the inductive hypothesis:

∑_{i=1}^{a+1} i = ∑_{i=1}^{a} i + (a+1) = a(a+1)/2 + (a+1) = (a+1)(a/2 + 1) = (a+1)(a+2)/2.

This completes the induction, so the identity holds for all positive integers n.
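The closed form can also be checked mechanically (a supplement to, not a substitute for, the induction proof):

```python
# Check the closed form sum(1..n) == n(n+1)/2 for a range of n.
def triangular(n):
    return n * (n + 1) // 2

for m in range(1, 200):
    assert sum(range(1, m + 1)) == triangular(m)
```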
(c) ∑_{k=0}^{n} ar^k = a(1 − r^{n+1})/(1 − r) if r ≠ 1, and a(n+1) if r = 1.

To prove the given identity for the geometric series, let's consider two cases: r ≠ 1 and r = 1.

Case 1: r ≠ 1. Write out the sum:

S_n = ∑_{k=0}^{n} ar^k = a + ar + ar² + … + ar^n.

Multiplying both sides by r shifts every term up one power of r:

rS_n = ar + ar² + … + ar^n + ar^{n+1}.

Subtracting the second equation from the first, every intermediate term cancels:

S_n − rS_n = a − ar^{n+1}, i.e. (1 − r)S_n = a(1 − r^{n+1}).

Since r ≠ 1, we may divide both sides by 1 − r:

S_n = a(1 − r^{n+1})/(1 − r).

This proves the identity for the case r ≠ 1.

Case 2: r = 1. When r = 1, every term of the series equals a, so the sum becomes

∑_{k=0}^{n} ar^k = ∑_{k=0}^{n} a = a + a + … + a (n+1 terms),

which simplifies to

∑_{k=0}^{n} ar^k = a(n+1).

This proves the identity for the case r = 1.

So the identity holds in both cases, and we've successfully proved the formula for the sum of a geometric series.
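Both cases of the closed form can be compared against a direct term-by-term sum; exact rational arithmetic avoids any floating-point doubt:

```python
from fractions import Fraction

def geometric_sum(a, r, n):
    """Closed-form sum of a*r^k for k = 0..n, split by the two cases."""
    if r == 1:
        return a * (n + 1)
    return a * (1 - r ** (n + 1)) / (1 - r)

# Compare against a direct sum for several ratios, including r = 1.
for r in [Fraction(1), Fraction(1, 2), Fraction(3), Fraction(-2)]:
    for n in range(0, 10):
        a = Fraction(5)
        direct = sum(a * r ** k for k in range(n + 1))
        assert geometric_sum(a, r, n) == direct
```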
5 Recurrence Relations
For each part, find the asymptotic order of growth of T; that is, find a function g such that
T(n) = Θ(g(n)). Show your reasoning and do not directly apply the Master Theorem;
doing so will yield 0 credit.
In all subparts, you may ignore any issues arising from whether a number is an integer.
(a) T(n) = 2T(n/3) + 5n

The recurrence relation is in the form

T(n) = aT(n/b) + f(n)

where:
a is the number of subproblems,
b is the factor by which the problem size is reduced,
f(n) is the cost of the work done outside the recursive calls.

For reference, the Master Theorem has the following form:

T(n) = Θ(n^{log_b a}) if f(n) = O(n^{log_b a − ε}) for some ε > 0,
T(n) = Θ(n^{log_b a} log n) if f(n) = Θ(n^{log_b a}),
T(n) = Θ(f(n)) if f(n) = Ω(n^{log_b a + ε}) for some ε > 0 (and a·f(n/b) ≤ c·f(n) for some c < 1).

For T(n) = 2T(n/3) + 5n:
a = 2 (two subproblems),
b = 3 (problem size reduced by a factor of 3),
f(n) = 5n.

Since directly applying the theorem is not allowed here, we unroll the recurrence instead. Level i of the recursion tree contains 2^i subproblems, each of size n/3^i and each contributing 5n/3^i work, so the total work at level i is

2^i · 5n/3^i = 5n(2/3)^i.

The tree has log_3 n levels, plus n^{log_3 2} = o(n) constant-cost leaves, so

T(n) = 5n ∑_{i=0}^{log_3 n} (2/3)^i + O(n^{log_3 2}).

The sum is a geometric series with ratio 2/3 < 1, so for every n it lies between 1 and 1/(1 − 2/3) = 3. Therefore 5n ≤ T(n) ≤ 15n + o(n), and the asymptotic order of growth for T(n) is Θ(n).

(This agrees with case 3 of the theorem above: n^{log_3 2} ≈ n^{0.63}, and f(n) = 5n = Ω(n^{log_3 2 + ε}) for small ε > 0.)
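The linear growth can be checked empirically by evaluating the recurrence directly (assuming a constant base-case cost of 1) and watching T(n)/n stay within constant bounds:

```python
from functools import lru_cache

# Empirical check that T(n) = 2T(n/3) + 5n grows linearly:
# T(n)/n should stay bounded between constants as n grows.
@lru_cache(maxsize=None)
def T(n):
    if n < 3:
        return 1  # assumed constant base-case cost
    return 2 * T(n // 3) + 5 * n

ratios = [T(3 ** k) / 3 ** k for k in range(5, 18)]
assert all(5 <= r <= 15 for r in ratios)
```

The ratios approach 15 from below, matching the bound 5n ≤ T(n) ≤ 15n + o(n) derived above.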
(c) An algorithm A takes Θ(n²) time to partition the input into 5 subproblems of size n/5 each and then recursively runs itself on 3 of those subproblems. Describe the recurrence relation for the run-time T(n) of A and find its asymptotic order of growth.

The algorithm A can be described with the following recurrence relation:

T(n) = 3T(n/5) + Θ(n²).

Here, the Θ(n²) term represents the time taken to partition the input into 5 subproblems of size n/5 each. The recursive part 3T(n/5) represents the algorithm running itself on three of those subproblems: each subproblem has size n/5, and there are three of them, so we multiply by 3.

To find the asymptotic order of growth without applying the Master Theorem directly, unroll the recurrence as in part (a). Level i of the recursion tree contains 3^i subproblems of size n/5^i, each contributing Θ((n/5^i)²) partition work, so the total work at level i is

3^i · Θ(n²/25^i) = Θ(n²(3/25)^i).

This is a geometric series with ratio 3/25 < 1, so the root term dominates: the total over all levels is at most the constant factor 1/(1 − 3/25) = 25/22 times the root's n² work. Therefore T(n) = Θ(n²).
For the recurrence T(n) = T(3n/5) + T(4n/5), substitution gives matching bounds:

Upper bound: assume T(k) ≤ ck^b for all k < n and substitute into the recurrence:

T(n) ≤ c(3n/5)^b + c(4n/5)^b = c(3^b/5^b + 4^b/5^b)n^b.

Let's choose b such that 3^b + 4^b = 5^b. Since 3² + 4² = 9 + 16 = 25 = 5², the choice is b = 2. Therefore we assume and conclude T(n) = O(n²).

Lower bound: let T(n) = Ω(n^a) for some constant a. Substituting this into the recurrence:

T(n) ≥ T(3n/5) + T(4n/5) ≥ c((3n/5)^a + (4n/5)^a) = c(3^a/5^a + 4^a/5^a)n^a.

Again choose a such that 3^a + 4^a = 5^a, giving a = 2. Therefore we assume and conclude T(n) = Ω(n²).

Combining the upper and lower bounds: since T(n) = O(n²) and T(n) = Ω(n²), we conclude that T(n) = Θ(n²).

Induction process:
Let T(k) = ck² for all k < n. Show T(n) = cn² holds:

T(n) = T(3n/5) + T(4n/5) ≤ c(3n/5)² + c(4n/5)² = c(9/25 + 16/25)n² = cn².
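The quadratic growth of T(n) = T(3n/5) + T(4n/5) can be illustrated empirically (with an assumed base case T(n) = 1 for small n): the log-log slope of T against n should be close to 2.

```python
import math

# T(n) = T(3n/5) + T(4n/5) with T(n) = 1 for n < 5 (assumed base case).
memo = {}

def T(n):
    if n < 5:
        return 1
    if n not in memo:
        memo[n] = T(3 * n // 5) + T(4 * n // 5)
    return memo[n]

# Estimated growth exponent between n = 10^3 and n = 10^5.
slope = math.log(T(10**5) / T(10**3)) / math.log(10**2)
assert 1.8 <= slope <= 2.1
```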
6 In Between Functions
In this problem, we will find a function f(n) that is asymptotically worse than polynomial
time but still better than exponential time. In other words, f has to satisfy two properties,
• For all constants k > 0, f(n) = Ω(n^k) (1)
• For all constants c > 0, f(n) = O(2^{cn}) (2)
(a) Try setting f(n) to a polynomial of degree d, where d is a very large constant. So
f(n) = a_0 + a_1 n + a_2 n² + · · · + a_d n^d. For which values of k (if any) does f fail to satisfy (1)?
Let f(n) = a_0 + a_1 n + a_2 n² + … + a_d n^d, where d is a very large constant.
We find the values of k for which f(n) = Ω(n^k) fails.
As n becomes large, the leading term a_d n^d dominates all other terms in the polynomial, so f(n) = Θ(n^d). Therefore f(n) = Ω(n^k) holds for every k ≤ d, but it fails for every k > d, since n^k then grows strictly faster than any constant multiple of n^d.
(b) Now try setting f(n) to a^n, for some constant a > 1 that's as small as possible while still satisfying (1) (e.g. 1.000001). For which values of c (if any) does f fail to satisfy (2)?
Hint: Try rewriting a^n as 2^{bn} first, where b is a constant dependent on a.
So far we have found that the functions which look like O(n^d) for constant d are too small and the functions that look like O(a^n) are too large even if a is a tiny constant.
Rewriting a^n as 2^{n log_2 a}: since a > 1, b = log_2 a is a positive constant.
Then f(n) = O(2^{cn}) holds for any c ≥ log_2 a. However, it fails whenever 0 < c < log_2 a, because 2^{(log_2 a − c)n} grows without bound, so a^n cannot be bounded by any constant multiple of 2^{cn}. Thus no matter how small a > 1 is chosen, some constants c > 0 violate (2).
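The rewrite a^n = 2^{n log_2 a} and the failure for small c can be illustrated numerically (sample values chosen for illustration only):

```python
import math

# Check the rewrite a^n = 2^(n * log2 a), and that f(n) = a^n eventually
# exceeds 2^(c n) whenever 0 < c < log2 a.
a = 1.000001
b = math.log2(a)                     # b = log2 a, a tiny positive constant

n = 1000
assert math.isclose(a ** n, 2 ** (b * n))

c = b / 2                            # any c strictly below log2 a fails (2)
n_large = int(10 / (b - c))          # big enough that 2^((b - c) n) > 2^10
assert a ** n_large > 2 ** (c * n_large)
```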
(c) Find a function D(n) such that setting f(n) = n^{D(n)} satisfies both (1) and (2). Give a proof that your answer satisfies both.
Hint: make sure D(n) is asymptotically smaller than n.
Let D(n) = log_2 n, so f(n) = n^{log_2 n} = 2^{(log_2 n)²}.
Proof of (1): for any constant k > 0, once n is large enough that log_2 n ≥ k, we have n^{log_2 n} ≥ n^k. Hence f(n) = Ω(n^k).
Proof of (2): for any constant c > 0, (log_2 n)² = o(n), so (log_2 n)² ≤ cn for all sufficiently large n. Hence f(n) = 2^{(log_2 n)²} ≤ 2^{cn}, i.e. f(n) = O(2^{cn}).
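The "in between" behaviour of f(n) = n^{log_2 n} can be seen numerically by comparing base-2 logarithms (to avoid overflow); the sample constants k and c below are illustrative only:

```python
import math

# log2 of f(n) = n^(log2 n) is (log2 n)^2.
def log2_f(n):
    return math.log2(n) ** 2

k, c = 10.0, 0.001                   # sample constants for (1) and (2)

# Beats the polynomial n^k at large n: compare log2 values.
n_big = 2 ** 20
assert log2_f(n_big) > k * math.log2(n_big)     # 400 > 200

# Is beaten by 2^(c n) at large n, even for tiny c.
n_huge = 2 ** 30
assert log2_f(n_huge) < c * n_huge              # 900 < ~1.07e6
```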