
Efficient Algorithms and Intractable Problems

Student’s Name
Institution
Course
Professor
Due Date

(a) Simplify the following expressions into a single logarithm (i.e. in the form log_a b):

(i) ln x / ln y simplifies to log_y x.

(ii) ln x + ln y
Since ln(x) = log_e x and ln(y) = log_e y,
ln x + ln y = ln(xy) = log_e(xy).

(iii) ln x − ln y
ln x − ln y = ln(x/y) = log_e(x/y).

(iv) 170 ln x
170 ln(x) = ln(x^170) = log_e(x^170).
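As a sanity check, the four simplifications can be verified numerically. The following is a minimal Python sketch; the values of x and y are arbitrary positive examples, not taken from the question.

```python
# Sanity check of the four logarithm simplifications above.
# x and y are arbitrary positive example values (not from the question).
import math

x, y = 7.3, 2.5

assert math.isclose(math.log(x) / math.log(y), math.log(x, y))   # (i)   ln x / ln y = log_y x
assert math.isclose(math.log(x) + math.log(y), math.log(x * y))  # (ii)  ln x + ln y = ln(xy)
assert math.isclose(math.log(x) - math.log(y), math.log(x / y))  # (iii) ln x - ln y = ln(x/y)
assert math.isclose(170 * math.log(x), math.log(x ** 170))       # (iv)  170 ln x = ln(x^170)
```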

b) Give a simple proof for each of the following identities:


a) x^(log_{1/x} y) = 1/y

Let a = log_{1/x} y.
Then (1/x)^a = y, which means x^(−a) = y, and therefore x^a = 1/y.
Substituting back into the original expression:
x^(log_{1/x} y) = x^a = 1/y.
Thus the identity holds.
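A quick numeric spot check of the identity in Python, with a few arbitrary example values of x and y (the identity requires x > 0, x ≠ 1 and y > 0):

```python
# Numeric spot check of x^(log_{1/x} y) = 1/y for a few arbitrary example values
# (requires x > 0, x != 1 and y > 0).
import math

for x, y in [(2.0, 3.0), (0.4, 7.5), (10.0, 0.25)]:
    lhs = x ** math.log(y, 1 / x)   # x raised to the power log base 1/x of y
    assert math.isclose(lhs, 1 / y)
```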

b) ∑_{i=1}^{n} i = n(n+1)/2

Base case: for n = 1 the left-hand side is 1 and the right-hand side is 1·2/2 = 1, so the identity holds.

Inductive step: assume the identity holds for some arbitrary positive integer a, that is,
∑_{i=1}^{a} i = a(a+1)/2.
Show that it also holds for a + 1:
∑_{i=1}^{a+1} i = (a+1)(a+2)/2.
Starting from the left-hand side,
∑_{i=1}^{a+1} i = ∑_{i=1}^{a} i + (a+1) = a(a+1)/2 + (a+1).
Combining the terms over a common denominator:
[a(a+1) + 2(a+1)] / 2 = (a+1)(a+2)/2.

Therefore, by induction, the identity holds for all positive integers n.
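The identity can also be spot-checked directly for small n; a minimal Python sketch:

```python
# Spot check of sum_{i=1}^{n} i = n(n+1)/2 for the first few hundred values of n.
for n in range(1, 301):
    assert sum(range(1, n + 1)) == n * (n + 1) // 2
```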

c) ∑_{k=0}^{n} a·r^k = a(1 − r^(n+1))/(1 − r) if r ≠ 1, and a(n + 1) if r = 1.

To prove the identity for the geometric series, consider the two cases r ≠ 1 and r = 1 separately.

Case 1 (r ≠ 1), by induction on n.

Base case: for n = 0 the left-hand side is a, and the right-hand side is a(1 − r)/(1 − r) = a, so the formula holds.

Inductive step: assume the formula holds for n = k, that is,
S_k = ∑_{i=0}^{k} a·r^i = a(1 − r^(k+1))/(1 − r).
Then S_{k+1} is obtained by adding the next term a·r^(k+1):
S_{k+1} = S_k + a·r^(k+1)
        = a(1 − r^(k+1))/(1 − r) + a·r^(k+1)
        = a[(1 − r^(k+1)) + r^(k+1)(1 − r)]/(1 − r)
        = a(1 − r^(k+2))/(1 − r).
This is exactly the formula with n = k + 1, so by induction

∑_{k=0}^{n} a·r^k = a(1 − r^(n+1))/(1 − r) for all n ≥ 0.

This proves the identity for the case r ≠ 1.

Case 2 (r = 1).

When r = 1, every term of the series equals a:
∑_{k=0}^{n} a·r^k = ∑_{k=0}^{n} a = a + a + ⋯ + a (n + 1 terms) = a(n + 1).
This proves the identity for the case r = 1.

So the identity holds in both cases, and we have proved the formula
for the sum of a geometric series.
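Both cases of the identity can be spot-checked numerically; a minimal Python sketch, with a and r chosen as arbitrary example values:

```python
# Spot check of the geometric-series identity in both cases (r != 1 and r == 1),
# with a and r chosen as arbitrary example values.
import math

def geometric_sum(a, r, n):
    """Closed form of sum_{k=0}^{n} a*r^k from the identity above."""
    if r == 1:
        return a * (n + 1)
    return a * (1 - r ** (n + 1)) / (1 - r)

for a in (1.0, 2.5):
    for r in (1, 0.5, 3, -2):
        for n in (0, 1, 5, 10):
            direct = sum(a * r ** k for k in range(n + 1))
            assert math.isclose(direct, geometric_sum(a, r, n))
```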

5 Recurrence Relations
For each part, find the asymptotic order of growth of T; that is, find a function g such that
T(n) = Θ(g(n)). Show your reasoning and do not directly apply the Master Theorem;
doing so will yield 0 credit.
In all subparts, you may ignore any issues arising from whether a number is an integer.
(a) T(n) = 2T(n/3) + 5n
To find the asymptotic order of growth of the recurrence relation T(n) = 2T(n/3) + 5n, the Master Theorem is used.

The recurrence relation is in the form:

T(n) = aT(n/b) + f(n)

where:
• a is the number of subproblems,
• b is the factor by which the problem size is reduced,
• f(n) is the cost of the work done outside the recursive calls.

The Master Theorem has the following form:

Case 1: T(n) = Θ(n^(log_b a))         if f(n) = O(n^(log_b a − ε)) for some ε > 0
Case 2: T(n) = Θ(n^(log_b a) · log n) if f(n) = Θ(n^(log_b a))
Case 3: T(n) = Θ(f(n))                if f(n) = Ω(n^(log_b a + ε)) for some ε > 0 and a·f(n/b) ≤ c·f(n) for some constant c < 1

For T(n) = 2T(n/3) + 5n:
• a = 2 (two subproblems),
• b = 3 (problem size reduced by a factor of 3),
• f(n) = 5n.

Comparing f(n) with n^(log_b a): log_3 2 ≈ 0.631, and 5n = Θ(n^1) with 1 > log_3 2, so

f(n) = 5n = Ω(n^(log_3 2 + ε)) for some ε > 0 (e.g. ε = 0.3).

The regularity condition also holds: a·f(n/b) = 2·5·(n/3) = (10/3)·n ≤ c·5n with c = 2/3 < 1. This is case 3 of the Master Theorem, so the solution of the recurrence is

T(n) = Θ(f(n)) = Θ(n).

So, the asymptotic order of growth for T(n) is Θ(n).
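This conclusion can be checked by evaluating the recurrence numerically. The sketch below assumes a base case of T(n) = 1 for n < 3 (the problem does not specify one); the printed ratio T(n)/n approaches the constant 15 = 5/(1 − 2/3), consistent with T(n) = Θ(n).

```python
# Numeric check that T(n) = 2T(n/3) + 5n grows linearly.
# Assumed base case (not given in the problem): T(n) = 1 for n < 3.
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    if n < 3:
        return 1
    return 2 * T(n // 3) + 5 * n

for n in (3 ** 6, 3 ** 9, 3 ** 12):
    print(n, T(n) / n)   # ratios approach 15 = 5 / (1 - 2/3), i.e. T(n) = Theta(n)
```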

(b) T(n) = 169T(n/170) + Θ(n)


The recurrence relation is in the form T(n) = aT(n/b) + f(n), where a = 169, b = 170, and f(n) = Θ(n).

Computing log_b a:

log_170 169 ≈ 0.9989

Comparing f(n) with n^(log_b a): f(n) = Θ(n) = Θ(n^1), and 1 > log_170 169, so

f(n) = Ω(n^(log_170 169 + ε)) for a small ε > 0 (e.g. ε = 0.001).

The regularity condition also holds: a·f(n/b) = 169·(n/170) = (169/170)·n ≤ c·n with c = 169/170 < 1. This is case 3 of the Master Theorem, so the solution of the recurrence is

T(n) = Θ(f(n)) = Θ(n).

The asymptotic order of growth for T(n) is Θ(n).
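A small check of the exponent comparison and of the case-3 regularity condition, taking f(n) = n as an illustrative representative of the Θ(n) term:

```python
# Check of the exponent log_170(169) and of the case-3 regularity condition,
# with f(n) = n as an illustrative representative of the Theta(n) term.
import math

a, b = 169, 170
exponent = math.log(a, b)      # log_170(169) ~= 0.9989, strictly below 1
assert exponent < 1            # so f(n) = n is polynomially larger than n^(log_b a)

c = a / b                      # regularity: a * f(n/b) = (169/170) * n = c * f(n)
assert c < 1                   # holds with c = 169/170 < 1
```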

(c) An algorithm A takes Θ(n^2) time to partition the input into 5 subproblems of size n/5 each and then recursively runs itself on 3 of those subproblems. Describe the recurrence relation for the run-time T(n) of A and find its asymptotic order of growth.
The algorithm A can be described with the following recurrence relation:

T(n) = 3T(n/5) + Θ(n^2)

Here, the Θ(n^2) term represents the time taken to partition the input into 5 subproblems of size n/5 each. The recursive part 3T(n/5) represents the algorithm running itself on three of those subproblems: each subproblem has size n/5, and there are three of them, so we multiply by 3.

Analyzing the asymptotic order of growth using the Master Theorem, the recurrence is in the form T(n) = aT(n/b) + f(n), where a = 3, b = 5, and f(n) = Θ(n^2).

Computing log_b a:

log_5 3 ≈ 0.683

Comparing f(n) with n^(log_b a): since 2 > log_5 3,

f(n) = n^2 = Ω(n^(log_5 3 + ε)) for some ε > 0 (e.g. ε = 1).

The regularity condition holds: a·f(n/b) = 3·(n/5)^2 = (3/25)·n^2 ≤ c·n^2 with c = 3/25 < 1. This is case 3 of the Master Theorem, so the solution of the recurrence is

T(n) = Θ(f(n)) = Θ(n^2).

The asymptotic order of growth for the run time of algorithm A is Θ(n^2).
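A rough numeric check, taking f(n) = n^2 as a representative of the Θ(n^2) partition cost and assuming a base case of T(n) = 1 for n < 5 (not specified in the problem); the ratio T(n)/n^2 settles near 1/(1 − 3/25) ≈ 1.14, consistent with Θ(n^2).

```python
# Numeric check that T(n) = 3T(n/5) + n^2 is quadratic.
# f(n) = n^2 stands in for the Theta(n^2) partition cost; assumed base case T(n) = 1 for n < 5.
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    if n < 5:
        return 1
    return 3 * T(n // 5) + n * n

for n in (5 ** 4, 5 ** 6, 5 ** 8):
    print(n, T(n) / n ** 2)   # ratios settle near 1 / (1 - 3/25) ~= 1.136
```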

(d) T(n) = 6T(n/6) + Θ(n)


The recurrence relation is in the form T(n) = aT(n/b) + f(n), where a = 6, b = 6, and f(n) = Θ(n).

Comparing f(n) with n^(log_b a): log_6 6 = 1, so

f(n) = Θ(n) = Θ(n^(log_6 6)).

Since f(n) = Θ(n^(log_b a)), this is case 2 of the Master Theorem. Therefore, the solution of the recurrence is

T(n) = Θ(n^(log_b a) · log n) = Θ(n log n).

The asymptotic order of growth for T(n) is Θ(n log n).
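As a rough numeric check (with f(n) = n standing in for the Θ(n) term and T(n) = 1 for n < 6 as an assumed base case), the ratio T(n)/(n·log_6 n) at n = 6^k tends to 1:

```python
# Numeric check that T(n) = 6T(n/6) + n is Theta(n log n).
# f(n) = n stands in for the Theta(n) term; assumed base case T(n) = 1 for n < 6.
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    if n < 6:
        return 1
    return 6 * T(n // 6) + n

for k in (4, 8, 12):
    n = 6 ** k
    print(n, T(n) / (n * math.log(n, 6)))   # ratio (k+1)/k, tending to 1
```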
(e) T(n) = T(3n/5) + T(4n/5) (We have T(1) = 1)
Hint: first, compute a reasonable upper and lower bound for T(n). Then, try to guess a T(n) of the form a·n^b and then use induction to argue that it is correct.
Following the hint, guess a solution of the form T(n) = c·n^b and look for the exponent b that makes the recurrence balance.

Substituting the guess into the recurrence:

T(n) = T(3n/5) + T(4n/5) = c·(3n/5)^b + c·(4n/5)^b = c·n^b·((3/5)^b + (4/5)^b).

For the guess to be consistent we need (3/5)^b + (4/5)^b = 1, i.e. 3^b + 4^b = 5^b. Since (3/5)^b + (4/5)^b is strictly decreasing in b, there is exactly one such exponent, and it is b = 2, because 3^2 + 4^2 = 9 + 16 = 25 = 5^2. A smaller exponent makes the right-hand side too large (e.g. b = 1 gives 3/5 + 4/5 = 7/5 > 1) and a larger exponent makes it too small, so the growth rate is pinned at n^2.

Induction process:
Assume T(k) = k^2 for all k < n and show that T(n) = n^2 holds:

T(n) = T(3n/5) + T(4n/5) = (3n/5)^2 + (4n/5)^2 = (9/25 + 16/25)·n^2 = n^2.

The base case T(1) = 1 = 1^2 is satisfied.

Therefore, by induction, T(n) = n^2 (ignoring integrality issues, as the problem allows), and the asymptotic order of growth is T(n) = Θ(n^2).
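The pivotal step is that b = 2 is the unique exponent with (3/5)^b + (4/5)^b = 1; a short numeric scan makes this concrete:

```python
# (3/5)^b + (4/5)^b is strictly decreasing in b and equals 1 exactly at b = 2,
# i.e. 3^2 + 4^2 = 5^2, which is what fixes the growth rate at n^2.
for b in (1, 1.5, 2, 2.5, 3):
    print(b, (3 / 5) ** b + (4 / 5) ** b)
# b = 1 -> 1.4,  b = 2 -> 1.0 exactly,  b = 3 -> 0.728
```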

6 In Between Functions
In this problem, we will find a function f(n) that is asymptotically worse than polynomial
time but still better than exponential time. In other words, f has to satisfy two properties,
• For all constants k > 0, f(n) = Ω(n^k) (1)
• For all constants c > 0, f(n) = O(2^(cn)) (2)
(a) Try setting f(n) to a polynomial of degree d, where d is a very large constant. So
f(n) = a_0 + a_1·n + a_2·n^2 + ⋯ + a_d·n^d. For which values of k (if any) does f fail to satisfy (1)?
Let f(n) = a_0 + a_1·n + a_2·n^2 + ⋯ + a_d·n^d, where d is a very large constant and a_d > 0.
We want to find the values of k for which f(n) = Ω(n^k) fails.
As n becomes large, the term a_d·n^d dominates all other terms in the polynomial, so f(n) = Θ(n^d).
Therefore f(n) = Ω(n^k) holds for every k ≤ d, but it fails for every k > d.
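A small illustration with concrete numbers, using d = 3 as a stand-in for the "very large" constant d and arbitrary coefficients:

```python
# Illustration with d = 3 (a small stand-in for the "very large" d) and arbitrary coefficients:
# f(n)/n^4 tends to 0, so f(n) = Omega(n^4) fails (k = 4 > d), while f(n)/n^2 grows (k = 2 <= d).
def f(n, coeffs=(1, 2, 3, 4)):          # f(n) = 1 + 2n + 3n^2 + 4n^3, so d = 3
    return sum(a * n ** i for i, a in enumerate(coeffs))

for n in (10, 10 ** 3, 10 ** 6):
    print(n, f(n) / n ** 4, f(n) / n ** 2)
```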

(b) Now try setting f(n) to a^n, for some constant a > 1 that's as small as possible while still satisfying (1) (e.g. 1.000001). For which values of c (if any) does f fail to satisfy (2)?
Hint: Try rewriting a^n as 2^(bn) first, where b is a constant dependent on a.
So far we have found that the functions which look like O(n^d) for constant d are too small and the functions that look like O(a^n) are too large even if a is a tiny constant.

Let f(n) = a^n for some constant a > 1.
We want to find the values of c for which f(n) = O(2^(cn)) fails.
Rewriting a^n as 2^(bn) with b = log_2 a, the comparison is between the exponents bn and cn:
2^(bn) = O(2^(cn)) holds exactly when c ≥ b = log_2 a, and fails for every c < log_2 a.
Since a > 1 makes log_2 a > 0, there are always constants c with 0 < c < log_2 a, so f fails to satisfy (2) for all such c.
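A tiny numeric check of the rewriting and of the resulting threshold, using the example base a = 1.000001 from the problem:

```python
# Check of the rewriting a^n = 2^(n * log2(a)) for the example base a = 1.000001,
# and of the resulting threshold for c.
import math

a = 1.000001
b = math.log2(a)                    # a^n == 2^(b*n)
assert math.isclose(a ** 1000, 2 ** (b * 1000))
print(b)                            # ~1.44e-06: f(n) = O(2^(cn)) fails for any c below this value
```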

(c) Find a function D(n) such that setting f(n) = n^(D(n)) satisfies both (1) and (2). Give a proof that your answer satisfies both.
Hint: make sure D(n) is asymptotically smaller than n.
Let D(n) = log_2 n. Then

f(n) = n^(D(n)) = n^(log_2 n) = 2^((log_2 n)^2).

Proof:
1. Satisfying (1):
For any constant k > 0, f(n)/n^k = n^(log_2 n − k), and the exponent log_2 n − k tends to infinity as n grows, so the ratio tends to infinity. Hence f(n) = Ω(n^k) for all constants k > 0.
2. Satisfying (2):
For any constant c > 0, f(n) = 2^((log_2 n)^2), and (log_2 n)^2 grows more slowly than cn, so for all sufficiently large n we have (log_2 n)^2 ≤ cn and therefore f(n) ≤ 2^(cn). Hence f(n) = O(2^(cn)) for all constants c > 0.
Thus f(n) = n^(D(n)) with D(n) = log_2 n satisfies both conditions.
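A rough numeric illustration of both properties for f(n) = n^(log_2 n); the comparisons are done on base-2 logarithms of the quantities to avoid overflow, and k = 100, c = 0.001 are arbitrary example constants:

```python
# Numeric illustration that f(n) = n^(log2 n) = 2^((log2 n)^2) eventually beats any
# fixed polynomial n^k and is eventually beaten by 2^(cn) for any fixed c > 0.
# Exponents (base-2 logs) are compared to avoid overflow; k = 100 and c = 0.001
# are arbitrary example constants.
import math

k, c = 100, 0.001
for n in (2 ** 10, 2 ** 20, 2 ** 200):
    lg = math.log2(n)
    print(n, lg * lg > k * lg, lg * lg < c * n)
# the pair of comparisons moves from (False, False) to (True, True) as n grows,
# matching f(n) = Omega(n^k) and f(n) = O(2^(cn))
```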