COT5407 Class03


Growth Rates

Growth rates of functions:
- Linear: n
- Quadratic: n^2
- Cubic: n^3

In a log-log chart, the slope of the line corresponds to the growth rate of the function.

[Figure: log-log plot of T(n) versus n for the linear, quadratic, and cubic functions, with n from 1E+0 to 1E+10 and T(n) up to 1E+30.]
Analysis of Algorithms 1
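The slope property of the log-log chart can be checked numerically; a small sketch (the helper `loglog_slope` is illustrative, not from the slides):

```python
import math

def loglog_slope(f, n1=10**3, n2=10**6):
    """Estimate the slope of f on a log-log plot between n = n1 and n = n2."""
    return (math.log(f(n2)) - math.log(f(n1))) / (math.log(n2) - math.log(n1))

# For a pure power n^d, the log-log slope is exactly the degree d.
print(loglog_slope(lambda n: n))       # 1.0 (linear)
print(loglog_slope(lambda n: n**2))    # 2.0 (quadratic)
print(loglog_slope(lambda n: n**3))    # 3.0 (cubic)
```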
Constant Factors

The growth rate is not affected by
- constant factors or
- lower-order terms

Examples:
- 10^2 n + 10^5 is a linear function
- 10^5 n^2 + 10^8 n is a quadratic function

[Figure: log-log plot of T(n) versus n showing that 10^2 n + 10^5 runs parallel to n, and 10^5 n^2 + 10^8 n runs parallel to n^2.]
Big-Oh Notation

Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there are positive constants c and n0 such that f(n) ≤ c•g(n) for n ≥ n0.

Example: 2n + 10 is O(n)
- 2n + 10 ≤ cn
- (c - 2)n ≥ 10
- n ≥ 10/(c - 2)
- Pick c = 3 and n0 = 10

[Figure: log-log plot of 3n, 2n + 10, and n for n from 1 to 1,000.]
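The witness constants c = 3 and n0 = 10 can be spot-checked numerically (a sketch; the helper `holds` is illustrative and sampling a finite range is evidence, not a proof, which requires the algebra above):

```python
def holds(f, g, c, n0, upto=10**6, step=997):
    """Spot-check that f(n) <= c*g(n) for sampled n with n0 <= n < upto."""
    return all(f(n) <= c * g(n) for n in range(n0, upto, step))

# 2n + 10 <= 3n for all n >= 10, but not from n = 1
print(holds(lambda n: 2*n + 10, lambda n: n, c=3, n0=10))  # True
print(holds(lambda n: 2*n + 10, lambda n: n, c=3, n0=1))   # False
```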
Big-Oh Example

Example: the function n^2 is not O(n)
- n^2 ≤ cn
- n ≤ c
- The above inequality cannot be satisfied, since c must be a constant

[Figure: log-log plot of n^2, 100n, 10n, and n for n from 1 to 1,000.]
More Big-Oh Examples

- 7n - 2 is O(n)
  Need c > 0 and n0 ≥ 1 such that 7n - 2 ≤ c•n for n ≥ n0.
  This is true for c = 7 and n0 = 1.
- 3n^3 + 20n^2 + 5 is O(n^3)
  Need c > 0 and n0 ≥ 1 such that 3n^3 + 20n^2 + 5 ≤ c•n^3 for n ≥ n0.
  This is true for c = 4 and n0 = 21.
- 3 log n + log log n is O(log n)
  Need c > 0 and n0 ≥ 1 such that 3 log n + log log n ≤ c•log n for n ≥ n0.
  This is true for c = 4 and n0 = 2.
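All three pairs of witness constants above can be spot-checked on a range of n (a sketch; `bounded` is an illustrative helper, and a finite check is evidence rather than a proof):

```python
import math

def bounded(f, g, c, n0, upto=10**4):
    """Spot-check f(n) <= c*g(n) for all integers n0 <= n < upto."""
    return all(f(n) <= c * g(n) for n in range(n0, upto))

print(bounded(lambda n: 7*n - 2, lambda n: n, c=7, n0=1))                   # True
print(bounded(lambda n: 3*n**3 + 20*n**2 + 5, lambda n: n**3, c=4, n0=21))  # True
print(bounded(lambda n: 3*math.log2(n) + math.log2(math.log2(n)),
              lambda n: math.log2(n), c=4, n0=2))                           # True
```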
Big-Oh and Growth Rate

The big-Oh notation gives an upper bound on the growth rate of a function. The statement "f(n) is O(g(n))" means that the growth rate of f(n) is no more than the growth rate of g(n). We can use the big-Oh notation to rank functions according to their growth rate:

                        f(n) is O(g(n))    g(n) is O(f(n))
    g(n) grows more     Yes                No
    f(n) grows more     No                 Yes
    Same growth         Yes                Yes
Big-Oh Rules

If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.,
1. Drop lower-order terms
2. Drop constant factors

Use the smallest possible class of functions:
- Say "2n is O(n)" instead of "2n is O(n^2)"

Use the simplest expression of the class:
- Say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)"
Relatives of Big-Oh

big-Omega
- f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n0 ≥ 1 such that f(n) ≥ c•g(n) for n ≥ n0

big-Theta
- f(n) is Θ(g(n)) if there are constants c′ > 0 and c″ > 0 and an integer constant n0 ≥ 1 such that c′•g(n) ≤ f(n) ≤ c″•g(n) for n ≥ n0

little-oh
- f(n) is o(g(n)) if, for any constant c > 0, there is an integer constant n0 > 0 such that f(n) < c•g(n) for n ≥ n0

little-omega
- f(n) is ω(g(n)) if, for any constant c > 0, there is an integer constant n0 > 0 such that f(n) > c•g(n) for n ≥ n0
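The little-oh definition says a suitable n0 exists for every c, no matter how small. A hypothetical helper can search for such an n0 numerically, here for the claim that log n is o(n) (numeric evidence on a bounded range, not a proof):

```python
import math

def witness_n0(f, g, c, limit=10**6):
    """Scan downward from `limit` for the smallest n0 such that
    f(n) < c*g(n) holds for every n from n0 up to limit."""
    n0 = limit
    for n in range(limit - 1, 0, -1):
        if f(n) < c * g(n):
            n0 = n
        else:
            break
    return n0

# log n is o(n): even for the small constant c = 0.01, an n0 exists
n0 = witness_n0(lambda n: math.log2(n), lambda n: n, c=0.01)
print(n0)
```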
Intuition for Asymptotic Notation

Big-Oh
- f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n)

big-Omega
- f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n)

big-Theta
- f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n)

little-oh
- f(n) is o(g(n)) if f(n) is asymptotically strictly less than g(n)

little-omega
- f(n) is ω(g(n)) if f(n) is asymptotically strictly greater than g(n)
Example Uses of the Relatives of Big-Oh

- 5n^2 is Ω(n^2)
  f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n0 ≥ 1 such that f(n) ≥ c•g(n) for n ≥ n0.
  Let c = 5 and n0 = 1.
- 5n^2 is Ω(n)
  f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n0 ≥ 1 such that f(n) ≥ c•g(n) for n ≥ n0.
  Let c = 1 and n0 = 1.
- 5n^2 is ω(n)
  f(n) is ω(g(n)) if, for any constant c > 0, there is an integer constant n0 > 0 such that f(n) > c•g(n) for n ≥ n0.
  Need 5n0^2 > c•n0; given c, any n0 > c/5 with n0 > 0 satisfies this.
Divide-and-Conquer

Divide-and-conquer is a general algorithm design paradigm:
- Divide: divide the input data S into two or more disjoint subsets S1, S2, …
- Recur: solve the subproblems recursively
- Conquer: combine the solutions for S1, S2, …, into a solution for S

The base cases for the recursion are subproblems of constant size. Analysis can be done using recurrence equations.
Merge-Sort Review

Merge-sort on an input sequence S with n elements consists of three steps:
- Divide: partition S into two sequences S1 and S2 of about n/2 elements each
- Recur: recursively sort S1 and S2
- Conquer: merge S1 and S2 into a unique sorted sequence

Algorithm mergeSort(S, C)
    Input: sequence S with n elements, comparator C
    Output: sequence S sorted according to C
    if S.size() > 1
        (S1, S2) ← partition(S, n/2)
        mergeSort(S1, C)
        mergeSort(S2, C)
        S ← merge(S1, S2)
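The pseudocode above can be sketched in Python (lists instead of linked sequences; the comparator defaults to non-decreasing order):

```python
def merge_sort(s, cmp=lambda a, b: a <= b):
    """Merge-sort following the slide's pseudocode: divide, recur, conquer."""
    if len(s) <= 1:
        return s                      # base case: constant size
    mid = len(s) // 2                 # divide into halves of about n/2
    s1 = merge_sort(s[:mid], cmp)     # recur on S1
    s2 = merge_sort(s[mid:], cmp)     # recur on S2
    out, i, j = [], 0, 0              # conquer: merge the sorted halves
    while i < len(s1) and j < len(s2):
        if cmp(s1[i], s2[j]):
            out.append(s1[i]); i += 1
        else:
            out.append(s2[j]); j += 1
    return out + s1[i:] + s2[j:]

print(merge_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```

Passing a different comparator, e.g. `cmp=lambda a, b: a >= b`, sorts in descending order, mirroring the comparator parameter C in the pseudocode.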
Recurrence Equation Analysis

The conquer step of merge-sort, which merges two sorted sequences of n/2 elements each (implemented by means of a doubly linked list), takes at most bn steps, for some constant b. Likewise, the base case (n < 2) takes at most b steps. Therefore, if we let T(n) denote the running time of merge-sort:

    T(n) = b                 if n < 2
    T(n) = 2T(n/2) + bn      if n ≥ 2

We can therefore analyze the running time of merge-sort by finding a closed-form solution to the above equation.
- That is, a solution that has T(n) only on the left-hand side.
Iterative Substitution

In the iterative substitution, or "plug-and-chug," technique, we iteratively apply the recurrence equation to itself and see if we can find a pattern:

    T(n) = 2T(n/2) + bn
         = 2(2T(n/2^2) + b(n/2)) + bn
         = 2^2 T(n/2^2) + 2bn
         = 2^3 T(n/2^3) + 3bn
         = 2^4 T(n/2^4) + 4bn
         = ...
         = 2^i T(n/2^i) + ibn

Note that the base case, T(n) = b, occurs when 2^i = n, that is, when i = log n. So,

    T(n) = bn + bn log n

Thus, T(n) is O(n log n).
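The closed form can be confirmed by evaluating the recurrence directly on powers of two (a sketch with b = 1, so the claim is T(n) = n + n log n):

```python
def T(n, b=1):
    """Evaluate T(n) = 2T(n/2) + bn, with T(n) = b for n < 2
    (n is assumed to be a power of two)."""
    return b if n < 2 else 2 * T(n // 2, b) + b * n

# For n = 2^i, the closed form bn + bn log n becomes n + n*i when b = 1.
for i in range(1, 11):
    n = 2 ** i
    assert T(n) == n + n * i
print("closed form bn + bn log n matches the recurrence")
```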
Solving Recurrences by Substitution: Guess-and-Test

Guess the form of the solution. Then (using mathematical induction) find the constants and show that the solution works.

Example: T(n) = 2T(n/2) + n

Guess (#1): T(n) = O(n)
- Need T(n) ≤ cn for some constant c > 0
- Assume T(n/2) ≤ cn/2 (inductive hypothesis)
- Thus T(n) ≤ 2(cn/2) + n = (c + 1)n
- But (c + 1)n is not ≤ cn for any constant c, so the induction fails: our guess was wrong!

COT 5407 15
Solving Recurrences by Substitution: 2

T(n) = 2T(n/2) + n

Guess (#2): T(n) = O(n^2)
- Need T(n) ≤ cn^2 for some constant c > 0
- Assume T(n/2) ≤ c(n/2)^2 = cn^2/4 (inductive hypothesis)
- Thus T(n) ≤ 2(cn^2/4) + n = cn^2/2 + n
- Works for all n as long as c ≥ 2!
- But there is a lot of "slack"
Solving Recurrences by Substitution: 3

T(n) = 2T(n/2) + n

Guess (#3): T(n) = O(n log n)
- Need T(n) ≤ cn log n for some constant c > 0
- Assume T(n/2) ≤ c(n/2) log(n/2) (inductive hypothesis)
- Thus T(n) ≤ 2c(n/2) log(n/2) + n = cn log n - cn + n ≤ cn log n
- Works for all n as long as c ≥ 1!
- This is the correct guess. Why? To see that the bound is tight, show T(n) ≥ c′n log n for some constant c′ > 0.

9/9/08 COT 5407 17
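The accepted guess can be checked against the recurrence numerically (a sketch that assumes T(1) = 1 as the base case, in which case T(n) = n log n + n on powers of two, so c = 2 suffices for n ≥ 2):

```python
import math

def T(n):
    """T(n) = 2T(n/2) + n, with T(1) = 1 assumed as the base case."""
    return 1 if n < 2 else 2 * T(n // 2) + n

# Spot-check T(n) <= 2 n log n on powers of two from 2 to 2^15.
for i in range(1, 16):
    n = 2 ** i
    assert T(n) <= 2 * n * math.log2(n)
print("T(n) <= 2 n log n holds on the sampled range")
```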


More Examples

In the guess-and-test method, we guess a closed-form solution and then try to prove it is true by induction:

    T(n) = b                        if n < 2
    T(n) = 2T(n/2) + bn log n       if n ≥ 2

Guess: T(n) < cn log n.

    T(n) = 2T(n/2) + bn log n
         ≤ 2(c(n/2) log(n/2)) + bn log n
         = cn(log n - log 2) + bn log n
         = cn log n - cn + bn log n

Wrong: we cannot make this last line be less than cn log n.
More Examples

Recall the recurrence equation:

    T(n) = b                        if n < 2
    T(n) = 2T(n/2) + bn log n       if n ≥ 2

Guess #2: T(n) < cn log^2 n.

    T(n) = 2T(n/2) + bn log n
         ≤ 2(c(n/2) log^2(n/2)) + bn log n
         = cn(log n - log 2)^2 + bn log n
         = cn log^2 n - 2cn log n + cn + bn log n
         ≤ cn log^2 n    if c > b

So, T(n) is O(n log^2 n). In general, to use this method, you need to have a good guess, and you need to be good at induction proofs.
Solving Recurrences: Recursion-Tree Method

The substitution method fails when a good guess is not available. The recursion-tree method works in those cases:
- Write down the recurrence as a tree, with recursive calls as the children
- Expand the children
- Add up each level
- Sum up the levels

Useful for analyzing divide-and-conquer algorithms. Also useful for generating good guesses to be used by the substitution method.
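The "add up each level, sum up the levels" steps can be sketched for T(n) = 2T(n/2) + n: level i of the tree has 2^i nodes, each doing n/2^i work, so every level contributes n, and there are log n levels.

```python
import math

def level_sums(n):
    """Per-level work in the recursion tree of T(n) = 2T(n/2) + n:
    level i has 2^i nodes, each doing n/2^i work (n a power of two)."""
    depth = int(math.log2(n))
    return [2**i * (n // 2**i) for i in range(depth)]

sums = level_sums(16)
print(sums)       # [16, 16, 16, 16]: each level contributes n
print(sum(sums))  # 64 = n log n, suggesting the O(n log n) guess
```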
[Two textbook figures illustrating the recursion-tree method (slides 21-22, 9/9/08); © The McGraw-Hill Companies, Inc. Permission required for reproduction or display.]

Master Method

Many divide-and-conquer recurrence equations have the form:

    T(n) = c                  if n < d
    T(n) = aT(n/b) + f(n)     if n ≥ d

The Master Theorem:
1. If f(n) is O(n^(log_b a - ε)) for some ε > 0, then T(n) is Θ(n^(log_b a)).
2. If f(n) is Θ(n^(log_b a) log^k n), then T(n) is Θ(n^(log_b a) log^(k+1) n).
3. If f(n) is Ω(n^(log_b a + ε)) for some ε > 0, then T(n) is Θ(f(n)), provided a f(n/b) ≤ δ f(n) for some δ < 1.
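For the common case f(n) = n^p log^k n, picking the right clause reduces to comparing p with log_b a; a small sketch (the helper `master` is illustrative and handles only this shape of f, with the case-3 regularity condition holding for such f):

```python
import math

def master(a, b, p, k=0):
    """Classify T(n) = aT(n/b) + n^p log^k n using the master theorem."""
    crit = math.log(a, b)  # the critical exponent log_b a
    if p < crit:
        return f"Theta(n^{crit:g})"                 # case 1
    if p == crit:
        return f"Theta(n^{crit:g} log^{k+1} n)"     # case 2
    return f"Theta(n^{p} log^{k} n)"                # case 3

print(master(2, 2, 1))   # merge sort: case 2
print(master(4, 2, 1))   # case 1
print(master(2, 2, 2))   # case 3
```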
