
Advanced Algorithms

Lecture 1

Asymptotic Notation
Merge Sort
Solving Recurrences
Review: Asymptotic Notation

Upper Bound Notation:


f(n) is O(g(n)) if there exist positive constants c and n0
such that f(n) ≤ c g(n) for all n ≥ n0
Formally, O(g(n)) = { f(n): ∃ positive constants c and n0
such that f(n) ≤ c g(n) ∀ n ≥ n0 }

Big O fact:
A polynomial of degree k is O(n^k)
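For example (an added illustration of the definition): f(n) = 3n^2 + 5n + 7 is O(n^2), since for all n ≥ 1

    3n^2 + 5n + 7 ≤ 3n^2 + 5n^2 + 7n^2 = 15n^2,

so the definition is satisfied with c = 15 and n0 = 1.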
Review: Asymptotic Notation

Asymptotic lower bound:


f(n) is Ω(g(n)) if ∃ positive constants c and n0 such that
0 ≤ c g(n) ≤ f(n) ∀ n ≥ n0

Asymptotic tight bound:

f(n) is Θ(g(n)) if ∃ positive constants c1, c2, and n0 such
that c1 g(n) ≤ f(n) ≤ c2 g(n) ∀ n ≥ n0
f(n) = Θ(g(n)) if and only if
f(n) = O(g(n)) AND f(n) = Ω(g(n))
Other Asymptotic Notations

A function f(n) is o(g(n)) if for every positive constant c
there exists an n0 such that
f(n) < c g(n) ∀ n ≥ n0

A function f(n) is ω(g(n)) if for every positive constant c
there exists an n0 such that
c g(n) < f(n) ∀ n ≥ n0

Intuitively,

o() is like <     ω() is like >     Θ() is like =
O() is like ≤     Ω() is like ≥
Merge Sort
Merge Sort

MergeSort(A, left, right) {
    if (left < right) {
        mid = floor((left + right) / 2);
        MergeSort(A, left, mid);
        MergeSort(A, mid+1, right);
        Merge(A, left, mid, right);
    }
}

// Merge() takes two sorted subarrays of A and
// merges them into a single sorted subarray of A
// (how long should this take?)
Merge Sort: Example

Show MergeSort() running on the array

A = {10, 5, 7, 6, 1, 4, 8, 3, 2, 9};
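A minimal, runnable C sketch of the algorithm above, usable to check this example. The Merge() body and the main() driver are additions, not from the slides; they assume the usual temporary-buffer approach, which is one common way to merge in Θ(n) time.

    #include <stdio.h>
    #include <stdlib.h>

    /* Merge the two sorted subarrays A[left..mid] and A[mid+1..right]
       into a single sorted subarray of A.  Runs in Theta(n) time for
       n = right - left + 1, using a temporary buffer. */
    void Merge(int A[], int left, int mid, int right) {
        int n = right - left + 1;
        int *tmp = malloc(n * sizeof(int));
        int i = left, j = mid + 1, k = 0;
        while (i <= mid && j <= right)
            tmp[k++] = (A[i] <= A[j]) ? A[i++] : A[j++];
        while (i <= mid)   tmp[k++] = A[i++];
        while (j <= right) tmp[k++] = A[j++];
        for (k = 0; k < n; k++)
            A[left + k] = tmp[k];
        free(tmp);
    }

    void MergeSort(int A[], int left, int right) {
        if (left < right) {
            int mid = (left + right) / 2;   /* floor((left + right) / 2) */
            MergeSort(A, left, mid);
            MergeSort(A, mid + 1, right);
            Merge(A, left, mid, right);
        }
    }

    int main(void) {
        int A[] = {10, 5, 7, 6, 1, 4, 8, 3, 2, 9};
        int n = sizeof A / sizeof A[0];
        MergeSort(A, 0, n - 1);
        for (int i = 0; i < n; i++)
            printf("%d ", A[i]);            /* prints 1 2 3 4 5 6 7 8 9 10 */
        printf("\n");
        return 0;
    }

The first call splits A into {10, 5, 7, 6, 1} and {4, 8, 3, 2, 9}; the halves are sorted recursively to {1, 5, 6, 7, 10} and {2, 3, 4, 8, 9}, and the final Merge() produces {1, 2, 3, 4, 5, 6, 7, 8, 9, 10}.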
Analysis of Merge Sort

Statement                                   Effort

MergeSort(A, left, right) {                 T(n)
    if (left < right) {                     Θ(1)
        mid = floor((left + right) / 2);    Θ(1)
        MergeSort(A, left, mid);            T(n/2)
        MergeSort(A, mid+1, right);         T(n/2)
        Merge(A, left, mid, right);         Θ(n)
    }
}

So T(n) = Θ(1) when n = 1, and
   T(n) = 2T(n/2) + Θ(n) when n > 1

So what (more succinctly) is T(n)?

Recurrences
Recurrences

The expression

    T(n) = { c              if n = 1
           { 2T(n/2) + cn   if n > 1

is a recurrence.
Recurrence: an equation that describes a function in
terms of its value on smaller inputs
Recurrence Examples

    s(n) = { 0              if n = 0
           { c + s(n-1)     if n > 0

    s(n) = { 0              if n = 0
           { n + s(n-1)     if n > 0

    T(n) = { c              if n = 1
           { 2T(n/2) + c    if n > 1

    T(n) = { c              if n = 1
           { aT(n/b) + cn   if n > 1
Solving Recurrences

Substitution method

Iteration method

Master method
Solving Recurrences

The substitution method (CLR 4.1)

A.k.a. the "making a good guess" method
Guess the form of the answer, then use induction to
find the constants and show that the solution works
Examples:

T(n) = 2T(n/2) + Θ(n)          →  T(n) = Θ(n lg n)

T(n) = 2T(⌊n/2⌋) + n           →  T(n) = Θ(n lg n)

T(n) = 2T(⌊n/2⌋ + 17) + n      →  Θ(n lg n)
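An added sketch of how the induction step goes for the second example (the constant c is chosen here just for illustration): guess T(n) ≤ c n lg n and assume the bound holds for ⌊n/2⌋. Then

    T(n) = 2T(⌊n/2⌋) + n
         ≤ 2c ⌊n/2⌋ lg(⌊n/2⌋) + n
         ≤ c n lg(n/2) + n
         = c n lg n - c n + n
         ≤ c n lg n                  for any c ≥ 1,

so T(n) = O(n lg n); a similar substitution gives the matching lower bound, hence Θ(n lg n).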
Solving Recurrences

Another option is what the book calls the iteration method:

Expand the recurrence
Work some algebra to express as a summation
Evaluate the summation

We will show several examples.

    s(n) = { 0              if n = 0
           { c + s(n-1)     if n > 0

s(n) = c + s(n-1)
     = c + c + s(n-2)    = 2c + s(n-2)
     = 2c + c + s(n-3)   = 3c + s(n-3)
     = ...
     = ck + s(n-k)

So far, for n ≥ k, we have
    s(n) = ck + s(n-k)
What if k = n?
    s(n) = cn + s(0) = cn
Thus in general
    s(n) = cn
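For instance (an added check of the closed form): s(3) = c + s(2) = 2c + s(1) = 3c + s(0) = 3c, which matches s(n) = cn.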
    s(n) = { 0              if n = 0
           { n + s(n-1)     if n > 0

s(n) = n + s(n-1)
     = n + n-1 + s(n-2)
     = n + n-1 + n-2 + s(n-3)
     = n + n-1 + n-2 + n-3 + s(n-4)
     = ...
     = n + n-1 + n-2 + ... + n-(k-1) + s(n-k)
     = Σ_{i=n-k+1}^{n} i  +  s(n-k)

So far, for n ≥ k, we have
    s(n) = Σ_{i=n-k+1}^{n} i  +  s(n-k)
What if k = n?
    s(n) = Σ_{i=1}^{n} i  +  s(0)  =  Σ_{i=1}^{n} i  +  0  =  n(n+1)/2
Thus in general
    s(n) = n(n+1)/2
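For instance (an added check): s(4) = 4 + 3 + 2 + 1 + s(0) = 10 = 4·5/2, matching the closed form.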
    T(n) = { c              if n = 1
           { aT(n/b) + cn   if n > 1

T(n) = aT(n/b) + cn
     = a(aT(n/b/b) + cn/b) + cn
     = a^2 T(n/b^2) + cna/b + cn
     = a^2 T(n/b^2) + cn(a/b + 1)
     = a^2 (aT(n/b^2/b) + cn/b^2) + cn(a/b + 1)
     = a^3 T(n/b^3) + cn(a^2/b^2) + cn(a/b + 1)
     = a^3 T(n/b^3) + cn(a^2/b^2 + a/b + 1)
     = ...
     = a^k T(n/b^k) + cn(a^(k-1)/b^(k-1) + a^(k-2)/b^(k-2) + ... + a^2/b^2 + a/b + 1)

So we have
    T(n) = a^k T(n/b^k) + cn(a^(k-1)/b^(k-1) + ... + a^2/b^2 + a/b + 1)
For k = log_b n we have n = b^k, so
    T(n) = a^k T(1) + cn(a^(k-1)/b^(k-1) + ... + a^2/b^2 + a/b + 1)
         = a^k c + cn(a^(k-1)/b^(k-1) + ... + a^2/b^2 + a/b + 1)
         = c a^k + cn(a^(k-1)/b^(k-1) + ... + a^2/b^2 + a/b + 1)
         = cn a^k/b^k + cn(a^(k-1)/b^(k-1) + ... + a^2/b^2 + a/b + 1)    (since c a^k = cn a^k/b^k when n = b^k)
         = cn(a^k/b^k + ... + a^2/b^2 + a/b + 1)
So with k = log_b n:
    T(n) = cn(a^k/b^k + ... + a^2/b^2 + a/b + 1)

What if a = b?
    T(n) = cn(k + 1)
         = cn(log_b n + 1)
         = Θ(n log n)
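Note that merge sort is exactly this case: a = b = 2 in T(n) = 2T(n/2) + Θ(n), which is why merge sort runs in Θ(n lg n).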
What if a < b?
Recall that x^k + x^(k-1) + ... + x + 1 = (x^(k+1) - 1)/(x - 1)
So:
    a^k/b^k + a^(k-1)/b^(k-1) + ... + a/b + 1
        = ((a/b)^(k+1) - 1) / ((a/b) - 1)
        = (1 - (a/b)^(k+1)) / (1 - (a/b))
        ≤ 1 / (1 - (a/b))
        = Θ(1)                       (a/b < 1, so the sum is bounded by a constant)
Thus
    T(n) = cn · Θ(1) = Θ(n)
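For example (an added illustration): T(n) = T(n/2) + cn has a = 1 < b = 2, so this case gives T(n) = Θ(n).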
What if a > b?
Using the same identity:
    a^k/b^k + a^(k-1)/b^(k-1) + ... + a/b + 1
        = ((a/b)^(k+1) - 1) / ((a/b) - 1)
        = Θ((a/b)^k) = Θ(a^k / b^k)      (a/b > 1, so the largest term dominates)
So
    T(n) = cn · Θ(a^k / b^k)
         = cn · Θ(a^(log_b n) / b^(log_b n)) = cn · Θ(a^(log_b n) / n)

Recall the logarithm fact: a^(log_b n) = n^(log_b a)
    T(n) = cn · Θ(n^(log_b a) / n)
         = Θ(cn · n^(log_b a) / n)
         = Θ(n^(log_b a))
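For example (an added illustration): T(n) = 4T(n/2) + cn has a = 4 > b = 2, so this case gives T(n) = Θ(n^(log_2 4)) = Θ(n^2).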
So, for T(n) = aT(n/b) + cn:

    T(n) = { Θ(n)              if a < b
           { Θ(n log_b n)      if a = b
           { Θ(n^(log_b a))    if a > b
The Master Theorem

Given: a divide and conquer algorithm

An algorithm that divides the problem of size n into a
subproblems, each of size n/b
Let the cost of each stage (i.e., the work to divide the
problem + combine solved subproblems) be described
by the function f(n)

Then, the Master Theorem gives us a cookbook for the
algorithm's running time:
The Master Theorem


If T(n) = aT(n/b) + f(n), then

    T(n) = { Θ(n^(log_b a))          if f(n) = O(n^(log_b a - ε)) for some ε > 0
           {
           { Θ(n^(log_b a) · lg n)   if f(n) = Θ(n^(log_b a))
           {
           { Θ(f(n))                 if f(n) = Ω(n^(log_b a + ε)) for some ε > 0,
           {                            AND a·f(n/b) ≤ c·f(n) for some c < 1
           {                            and all sufficiently large n
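Two added examples of the other cases (not in the original slides, but standard applications of the theorem):

T(n) = T(2n/3) + 1:      a = 1, b = 3/2, so n^(log_b a) = n^0 = 1; f(n) = 1 = Θ(1), so case 2 gives T(n) = Θ(lg n).
T(n) = 3T(n/4) + n lg n: n^(log_4 3) ≈ n^0.79; f(n) = n lg n = Ω(n^(0.79 + ε)), and a·f(n/b) = 3(n/4)lg(n/4) ≤ (3/4) n lg n, so case 3 gives T(n) = Θ(n lg n).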
Using The Master Method

T(n) = 9T(n/3) + n
a = 9, b = 3, f(n) = n
n^(log_b a) = n^(log_3 9) = Θ(n^2)

Since f(n) = O(n^(log_3 9 - ε)), where ε = 1, case 1 applies:

    T(n) = Θ(n^(log_b a))  when  f(n) = O(n^(log_b a - ε))

Thus the solution is T(n) = Θ(n^2)
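As an added sanity check (not part of the slides), the recurrence can be evaluated numerically and the ratio T(n)/n^2 watched as n grows; the C helper T() below is written just for this check and assumes T(1) = 1:

    #include <stdio.h>

    /* Evaluate T(n) = 9*T(n/3) + n with T(1) = 1, directly from the recurrence. */
    static double T(long n) {
        if (n <= 1)
            return 1.0;
        return 9.0 * T(n / 3) + (double)n;
    }

    int main(void) {
        /* For powers of 3, T(n)/n^2 settles near a constant (about 1.5),
           consistent with T(n) = Theta(n^2). */
        for (long n = 3; n <= 6561; n *= 3)   /* 3^1 .. 3^8 */
            printf("n = %5ld   T(n)/n^2 = %.4f\n", n, T(n) / ((double)n * (double)n));
        return 0;
    }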
