COT5407 Class03
Growth Rates of Functions

Linear: n
Quadratic: n^2
Cubic: n^3

In a log-log chart, the slope of the line corresponds to the growth rate of the function.

[Figure: log-log plot of T(n) versus n for the linear, quadratic, and cubic functions]
Analysis of Algorithms 1
Constant Factors

The growth rate is not affected by constant factors or lower-order terms.

Examples:
  10^2 n + 10^5 is a linear function
  10^5 n^2 + 10^8 n is a quadratic function

[Figure: log-log plot of T(n) versus n showing the linear and quadratic examples]
Big-Oh Notation

Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there are positive constants c and n0 such that f(n) <= c·g(n) for n >= n0

Example: 2n + 10 is O(n), since 2n + 10 <= 3n for n >= 10

[Figure: plot of n, 2n + 10, and 3n versus n]
Analysis of Algorithms 3
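The definition can be explored numerically. The sketch below (the helper name `witnesses_big_oh` and the sampling scheme are made up for illustration) tests candidate constants c and n0 against sampled inputs; a finite check like this can only refute a candidate pair, never prove the asymptotic claim.

```python
def witnesses_big_oh(f, g, c, n0, n_max=10**6, step=997):
    """Test f(n) <= c*g(n) for sampled n in [n0, n_max).

    Evidence for the claim "f(n) is O(g(n))", not a proof.
    """
    return all(f(n) <= c * g(n) for n in range(n0, n_max, step))

# 2n + 10 is O(n): the witnesses c = 3, n0 = 10 pass the check.
print(witnesses_big_oh(lambda n: 2 * n + 10, lambda n: n, c=3, n0=10))  # True
```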
Big-Oh Example

Example: the function n^2 is not O(n)
  n^2 <= c·n implies n <= c
  The above inequality cannot be satisfied, since c must be a constant

[Figure: plot of n, 10n, 100n, and n^2 versus n, showing that n^2 outgrows c·n]
More Big-Oh Examples

7n - 2 is O(n)
  need c > 0 and n0 >= 1 such that 7n - 2 <= c·n for n >= n0
  this is true for c = 7 and n0 = 1

3n^3 + 20n^2 + 5 is O(n^3)
  need c > 0 and n0 >= 1 such that 3n^3 + 20n^2 + 5 <= c·n^3 for n >= n0
  this is true for c = 4 and n0 = 21

3 log n + log log n is O(log n)
  need c > 0 and n0 >= 1 such that 3 log n + log log n <= c·log n for n >= n0
  this is true for c = 4 and n0 = 2
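The three pairs of witnesses above can be sanity-checked numerically (base-2 logarithms assumed; again, a finite check only corroborates the algebra):

```python
import math

# Each tuple: (f, g, claimed c, claimed n0) from the examples above.
cases = [
    (lambda n: 7 * n - 2,                lambda n: n,    7, 1),
    (lambda n: 3 * n**3 + 20 * n**2 + 5, lambda n: n**3, 4, 21),
    (lambda n: 3 * math.log2(n) + math.log2(math.log2(n)),
                                         lambda n: math.log2(n), 4, 2),
]
for f, g, c, n0 in cases:
    # Verify f(n) <= c*g(n) on a range of n starting at n0.
    assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
print("all three O(.) witness pairs check out")
```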
Big-Oh and Growth Rate
The big-Oh notation gives an upper bound on the
growth rate of a function
The statement “f(n) is O(g(n))” means that the growth
rate of f(n) is no more than the growth rate of g(n)
We can use the big-Oh notation to rank functions
according to their growth rate
Relatives of Big-Oh

big-Omega
  f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n0 >= 1 such that f(n) >= c·g(n) for n >= n0
Intuition for Asymptotic Notation

Big-Oh
  f(n) is O(g(n)) if f(n) is asymptotically less than or equal to g(n)
big-Omega
  f(n) is Ω(g(n)) if f(n) is asymptotically greater than or equal to g(n)
big-Theta
  f(n) is Θ(g(n)) if f(n) is asymptotically equal to g(n)
little-oh
  f(n) is o(g(n)) if f(n) is asymptotically strictly less than g(n)
little-omega
  f(n) is ω(g(n)) if f(n) is asymptotically strictly greater than g(n)
Example Uses of the Relatives of Big-Oh

5n^2 is Ω(n^2)
  f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n0 >= 1 such that f(n) >= c·g(n) for n >= n0
  let c = 5 and n0 = 1

5n^2 is Ω(n)
  f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n0 >= 1 such that f(n) >= c·g(n) for n >= n0
  let c = 1 and n0 = 1

5n^2 is ω(n)
  f(n) is ω(g(n)) if, for any constant c > 0, there is an integer constant n0 > 0 such that f(n) > c·g(n) for n >= n0
  need 5n0^2 > c·n0; given c, any n0 > c/5 satisfies this
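The little-omega argument for f(n) = 5n^2 and g(n) = n can be made concrete: given any c, the sketch below (the helper name is made up) produces an n0 > c/5 and spot-checks that the definition holds from there.

```python
def omega_witness_n0(c):
    """For f(n) = 5n^2 and g(n) = n, return the smallest integer n0 > c/5.

    Then 5n^2 > c*n for all n >= n0 (spot-checked on a finite range).
    """
    n0 = int(c // 5) + 1                  # smallest integer exceeding c/5
    assert all(5 * n**2 > c * n for n in range(n0, n0 + 1000))
    return n0

print(omega_witness_n0(100))  # 21, since n0 must exceed 100/5 = 20
```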
Divide-and-Conquer

Divide-and-conquer is a general algorithm design paradigm:
  Divide: divide the input data S into two or more disjoint subsets S1, S2, ...
  Recur: solve the subproblems recursively
  Conquer: combine the solutions for S1, S2, ..., into a solution for S

The base cases for the recursion are subproblems of constant size.
Analysis can be done using recurrence equations.
Merge-Sort Review

Merge-sort on an input sequence S with n elements consists of three steps:
  Divide: partition S into two sequences S1 and S2 of about n/2 elements each
  Recur: recursively sort S1 and S2
  Conquer: merge S1 and S2 into a unique sorted sequence

Algorithm mergeSort(S, C)
  Input: sequence S with n elements, comparator C
  Output: sequence S sorted according to C
  if S.size() > 1
    (S1, S2) <- partition(S, n/2)
    mergeSort(S1, C)
    mergeSort(S2, C)
    S <- merge(S1, S2)
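The pseudocode above translates almost directly to Python. This is a sketch: list slicing stands in for partition, and a less-or-equal predicate stands in for the comparator C.

```python
def merge(s1, s2, leq=lambda a, b: a <= b):
    """Conquer step: merge two sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(s1) and j < len(s2):
        if leq(s1[i], s2[j]):
            out.append(s1[i]); i += 1
        else:
            out.append(s2[j]); j += 1
    return out + s1[i:] + s2[j:]        # append whichever side has leftovers

def merge_sort(s, leq=lambda a, b: a <= b):
    """Sort list s with merge-sort; returns a new sorted list."""
    if len(s) <= 1:                     # base case: constant-size subproblem
        return list(s)
    mid = len(s) // 2                   # Divide: halves of about n/2 elements
    return merge(merge_sort(s[:mid], leq),   # Recur on S1
                 merge_sort(s[mid:], leq),   # Recur on S2
                 leq)

print(merge_sort([5, 2, 9, 1, 7, 3]))   # [1, 2, 3, 5, 7, 9]
```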
Recurrence Equation Analysis

The conquer step of merge-sort merges two sorted sequences, each with about n/2 elements; implemented by means of a doubly linked list, this takes at most bn steps, for some constant b. Likewise, the base case (n < 2) takes at most b steps. Therefore, if we let T(n) denote the running time of merge-sort:

  T(n) = b               if n < 2
  T(n) = 2T(n/2) + bn    if n >= 2
Iterative Substitution

In the iterative substitution, or "plug-and-chug," technique, we iteratively apply the recurrence equation to itself and see if we can find a pattern:

  T(n) = 2T(n/2) + bn
       = 2(2T(n/2^2) + b(n/2)) + bn
       = 2^2 T(n/2^2) + 2bn
       = 2^3 T(n/2^3) + 3bn
       = 2^4 T(n/2^4) + 4bn
       = ...
       = 2^i T(n/2^i) + ibn

Note that the base case, T(n) = b, occurs when 2^i = n. That is, i = log n. So,

  T(n) = bn + bn log n

Thus, T(n) is O(n log n).
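The closed form can be cross-checked by evaluating the recurrence directly for powers of two (b = 1 is assumed here for concreteness; for powers of two the floating-point log is exact, so equality comparison is safe):

```python
import math

def T(n, b=1):
    """Merge-sort recurrence: T(n) = b if n < 2, else 2T(n/2) + bn."""
    return b if n < 2 else 2 * T(n // 2, b) + b * n

for k in range(0, 12):
    n = 2 ** k
    assert T(n) == n + n * math.log2(n)  # matches bn + bn log n with b = 1
print("closed form verified for n = 1 .. 2**11")
```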
Solving Recurrences by Substitution: 2

T(n) = 2T(n/2) + n
Guess (#2): T(n) = O(n^2)
  Need T(n) <= cn^2 for some constant c > 0
  Assume T(n/2) <= c(n/2)^2 = cn^2/4 (inductive hypothesis)
  Thus T(n) <= 2·cn^2/4 + n = cn^2/2 + n <= cn^2
  Works for all n >= 1 as long as c >= 2!
But there is a lot of "slack" in this bound.
Solving Recurrences by Substitution: 3

T(n) = 2T(n/2) + n
Guess (#3): T(n) = O(n log n)
  Need T(n) <= cn log n for some constant c > 0
  Assume T(n/2) <= c(n/2) log(n/2) (inductive hypothesis)
  Thus T(n) <= 2·c(n/2) log(n/2) + n
             = cn log n - cn + n <= cn log n
  Works for all n as long as c >= 1!
This is the correct guess. WHY?
  Show T(n) >= c'·n log n for some constant c' > 0 (so T(n) is Θ(n log n)).
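The inductive bound can also be observed numerically. Assuming the base case T(1) = 0 (an assumption made so that the guess with c = 1 suffices; a positive base cost would simply require a larger c):

```python
import math

def T(n):
    """T(n) = 2T(n/2) + n with T(1) = 0 (assumed base case)."""
    return 0 if n == 1 else 2 * T(n // 2) + n

for k in range(1, 12):
    n = 2 ** k
    assert T(n) <= 1 * n * math.log2(n)   # the guessed bound holds with c = 1
print("T(n) <= n log n for all tested powers of two")
```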
More Examples

Recall the recurrence equation:

  T(n) = b                     if n < 2
  T(n) = 2T(n/2) + bn log n    if n >= 2

Guess #2: T(n) < cn log^2 n.

  T(n) = 2T(n/2) + bn log n
       <= 2(c(n/2) log^2(n/2)) + bn log n
       = cn(log n - log 2)^2 + bn log n
       = cn log^2 n - 2cn log n + cn + bn log n
       <= cn log^2 n

if c > b. So, T(n) is O(n log^2 n).

In general, to use this method, you need to have a good guess and you need to be good at induction proofs.
Solving Recurrences: Recursion-Tree Method

The substitution method fails when a good guess is not available. The recursion-tree method works in those cases:
  Write down the recurrence as a tree, with the recursive calls as the children
  Expand the children
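For the merge-sort recurrence T(n) = 2T(n/2) + n, expanding the tree level by level makes the pattern visible at once. A sketch (assuming n is a power of two) that tabulates the total cost contributed by each level:

```python
def level_costs(n):
    """Cost per level of the recursion tree for T(n) = 2T(n/2) + n.

    Level i has 2**i nodes, each on a subproblem of size n / 2**i,
    so every level contributes n in total.
    """
    costs, nodes, size = [], 1, n
    while size >= 1:
        costs.append(nodes * size)        # 2**i nodes times size n / 2**i
        nodes, size = nodes * 2, size // 2
    return costs

print(level_costs(16))  # [16, 16, 16, 16, 16]: log n + 1 levels of cost n each
```

Summing log n + 1 levels of cost n each recovers the O(n log n) total without any guessing.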