Asymptotic Notations
Big O Notation

Let f and g be nonnegative functions on the positive integers.
We write
f(n) = O(g(n))
and say that
f(n) is of order at most g(n), or
f(n) is big oh of g(n), or
g is an asymptotic upper bound for f,
if there exist constants C1 > 0 and N1 such that
f(n) ≤ C1·g(n) for all n ≥ N1.
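The definition can be checked numerically for candidate witness constants. A minimal sketch in Python (the helper name `holds_big_o` and the sampled range are assumptions for illustration; a finite check is evidence, not a proof):

```python
# Check the Big O condition f(n) <= C1 * g(n) for all sampled n >= N1.
# Passing over a finite range is only evidence for the bound, not a proof.
def holds_big_o(f, g, C1, N1, n_max=10_000):
    return all(f(n) <= C1 * g(n) for n in range(N1, n_max + 1))

# 2n + 13 = O(n): the constants C1 = 15, N1 = 1 are one valid witness.
print(holds_big_o(lambda n: 2 * n + 13, lambda n: n, C1=15, N1=1))  # True

# n^2 is not O(n): no sampled witness with C1 = 100 survives large n.
print(holds_big_o(lambda n: n * n, lambda n: n, C1=100, N1=1))  # False
```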
Example 1:
Is n = O(2^n)?
We need f(n) ≤ c1·g(n) for all n ≥ N1 and some constant c1 > 0:
n ≤ c1·2^n — yes, always (c1 = 1 works, since n ≤ 2^n for all n ≥ 1).
Example 2:
Is 2^(n+1) = O(2^n)?
We need f(n) ≤ c1·g(n) for all n ≥ N1 and some constant c1 > 0:
2^(n+1) = 2·2^n ≤ c1·2^n — yes, for c1 ≥ 2 and all n.
Example 3:
Is 2^(2n) = O(2^n)?
We need f(n) ≤ c1·g(n) for all n ≥ N1 and some constant c1 > 0:
2^(2n) ≤ c1·2^n ⟺ 2^n·2^n ≤ c1·2^n ⟺ 2^n ≤ c1? No — no constant c1 can bound 2^n.
Theorem:
Let p(n) = a_k n^k + a_(k-1) n^(k-1) + … + a_1 n + a_0
be a polynomial of degree k with p(n) ≥ 0 for all n.
Then p(n) = Θ(n^k).
Example 4:
Is 60n^2 + 5n + 1 = O(n^2)?
We need f(n) ≤ c1·g(n) for all n ≥ N1 and some constant c1 > 0:
60n^2 + 5n + 1 ≤ c1·n^2 — yes, for c1 = 66 and n ≥ 1
(since 5n ≤ 5n^2 and 1 ≤ n^2 when n ≥ 1, so 60n^2 + 5n + 1 ≤ 66n^2).
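The chosen witness constants can be sanity-checked by brute force. A small sketch (the sampled range is an arbitrary choice for illustration):

```python
# Verify the witness c1 = 66, N1 = 1 for 60n^2 + 5n + 1 = O(n^2)
# over a finite sample (evidence for the inequality, not a proof).
for n in range(1, 10_001):
    assert 60 * n**2 + 5 * n + 1 <= 66 * n**2
print("bound holds for n = 1 .. 10000")
```

Note that at n = 1 the bound is exact: 60 + 5 + 1 = 66 = 66·1^2.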
Notation

2n + 13 = O(?) — O(n). Also O(n^2), O(n^3), …: one can always weaken the bound.
2^n = O(2^n), but 2^n is not O(n).
Example

f(n) = 5n^3, so f(n) = O(n^3)
g(n) = 3n^2, so g(n) = O(n^3)
but f(n) ≠ g(n): Big O gives only an upper bound, not equality.
Properties of Big Oh

If f1(n) = O(g1(n)) and f2(n) = O(g2(n)), then
f1(n) + f2(n) = O(max(g1(n), g2(n))).
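A quick numeric illustration of the sum rule (the sample functions f1 and f2 are assumptions chosen for illustration): with f1(n) = 100n = O(n) and f2(n) = n^2 = O(n^2), the sum is O(max(n, n^2)) = O(n^2).

```python
# Sum rule illustration: f1 = O(n), f2 = O(n^2), so f1 + f2 = O(n^2).
f1 = lambda n: 100 * n   # O(n), witness C = 100
f2 = lambda n: n * n     # O(n^2), witness C = 1
# f1(n) + f2(n) <= 101 * n^2 for all n >= 1, since 100n <= 100n^2.
assert all(f1(n) + f2(n) <= 101 * n * n for n in range(1, 1001))
print("f1 + f2 = O(n^2) with witness C = 101, N = 1")
```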
Order of growth

Suppose you have analyzed two algorithms and expressed their run times in terms of the size of the input:
– Algorithm A takes 100n + 1 steps to solve a problem of size n;
– Algorithm B takes n^2 + n + 1 steps.
The following table shows the run time of these algorithms for different problem sizes (steps computed from the formulas above):

n       Algorithm A (100n + 1)   Algorithm B (n^2 + n + 1)
10      1 001                    111
100     10 001                   10 101
1 000   100 001                  1 001 001
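The table entries follow directly from the two formulas, and the crossover point where B becomes slower than A can be found the same way. A small sketch (the search bound 10 000 is an arbitrary illustration choice):

```python
# Step counts from the text: A takes 100n + 1 steps, B takes n^2 + n + 1.
steps_a = lambda n: 100 * n + 1
steps_b = lambda n: n * n + n + 1

for n in (10, 100, 1000):
    print(n, steps_a(n), steps_b(n))

# Smallest n where B takes more steps than A:
crossover = next(n for n in range(1, 10_000) if steps_b(n) > steps_a(n))
print("crossover at n =", crossover)  # 100, since n^2 > 99n once n > 99
```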
Notes
• At n=10, Algorithm A looks bad
– For Algorithm A, the leading term has a large
coefficient, 100, which is why B does better than A for
small n.
How to compare algorithms?
Order of growth
An order of growth is a set of functions whose
growth is considered equivalent.
Examples:
– 2n, 100n and n + 1 belong to the same order of growth, which is written O(n) in "Big Oh notation";
– all functions with leading term n^2 belong to O(n^2).
The following table shows some of the orders of growth that appear most commonly in algorithmic analysis, in increasing order of badness:

Order of growth   Name
O(1)              constant
O(log_b n)        logarithmic (for any b)
O(n)              linear
O(n log_b n)      "en log en"
O(n^2)            quadratic
O(n^3)            cubic
O(n^i)            polynomial (for some i)
O(c^n)            exponential (for any c)
Asymptotic Analysis of Algorithms
(Asymptotic for large n)
Conventions for Writing Big Oh Expressions
(Tight bound)
Examples
e.g.6 (revisited):
T(n) = O(n^2)
e.g.10 (revisited): Sequential search
• The worst case time: T(n) = an + b, so T(n) = O(n)
• The best case time: T(n) = constant, so T(n) = O(1)
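A minimal sketch of sequential search in Python (the function name and sample data are illustration choices, not from the source), showing where the best and worst cases come from:

```python
# Sequential search: return the index of target in a, or -1 if absent.
# Best case: target is first, 1 comparison (constant time).
# Worst case: target is last or absent, n comparisons, so T(n) = an + b.
def sequential_search(a, target):
    for i, x in enumerate(a):
        if x == target:
            return i
    return -1

data = [7, 3, 9, 1, 5]
print(sequential_search(data, 7))   # 0  (best case: one comparison)
print(sequential_search(data, 42))  # -1 (worst case: scans all n items)
```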
e.g.11: Find an O-notation in terms of n for the number of times the statement x = x + 1 is executed in the segment:

for (i = 1 to n)
    for (j = 1 to i)
        x = x + 1

i    values of j
1    1
2    1, 2
3    1, 2, 3
:    :
n    1, 2, …, n

c_n = 1 + 2 + 3 + … + n = n(n + 1)/2
so the number of executions is O(n^2).
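The count can be checked directly by running the loop nest. A small sketch (Python stands in for the pseudocode above):

```python
# Count executions of x = x + 1 in the doubly nested loop and
# compare against the closed form n(n + 1)/2.
def count_steps(n):
    x = 0
    for i in range(1, n + 1):       # i = 1 to n
        for j in range(1, i + 1):   # j = 1 to i
            x += 1
    return x

for n in (1, 5, 10, 100):
    assert count_steps(n) == n * (n + 1) // 2
print(count_steps(100))  # 5050
```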
e.g.12: Find an O-notation in terms of n for the number of times the statement x = x + 1 is executed in the segment:

j = n
while (j >= 1) {
    for (i = 1 to j)
        x = x + 1
    j = j / 2
}

j          executions of x = x + 1
n          n
n/2        n/2
n/4        n/4
:          :
n/2^(k-1)  n/2^(k-1)

c_n = n + n/2 + n/4 + … + n/2^(k-1)
    = n(1 + 1/2 + 1/2^2 + … + 1/2^(k-1))
    < n · 1/(1 − 1/2) = 2n
(using the geometric series: the sum of A^i for i ≥ 0 is 1/(1 − A) when 0 < A < 1)

so the number of executions is O(n).
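The geometric-series bound can be checked by simulating the loop. A small sketch (Python stands in for the pseudocode; `//= 2` models the integer halving of j):

```python
# Count executions of x = x + 1 when j starts at n and halves each pass.
# The series n + n/2 + n/4 + ... stays below 2n, so the count is O(n).
def count_steps(n):
    x = 0
    j = n
    while j >= 1:
        for i in range(1, j + 1):  # i = 1 to j
            x += 1
        j //= 2                    # j = j / 2 (integer halving)
    return x

for n in (1, 10, 100, 1000):
    assert count_steps(n) <= 2 * n
print(count_steps(1000))  # 1994, comfortably below 2n = 2000
```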
e.g.7 (revisited): Obtain an asymptotic O bound for recursive functions.

Solving Recurrence Relations by Repeated Substitution

T(n) = a             if n = 0
T(n) = T(n−1) + b    if n > 0
for some constants a, b.

Repeated substitution gives
T(n) = T(n−1) + b = T(n−2) + 2b = … = T(0) + nb = a + bn,
so T(n) = O(n).
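The closed form a + bn from repeated substitution can be verified against a direct implementation of the recurrence. A minimal sketch (the sample constants a = 3, b = 2 are illustration choices):

```python
# T(0) = a; T(n) = T(n-1) + b. Repeated substitution predicts
# T(n) = T(0) + n*b = a + b*n, i.e. T(n) = O(n).
def T(n, a, b):
    return a if n == 0 else T(n - 1, a, b) + b

for n in (0, 1, 10, 50):
    assert T(n, a=3, b=2) == 3 + 2 * n
print(T(50, a=3, b=2))  # 103 = 3 + 2*50
```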
NOTES
• A problem that has a worst-case polynomial-time algorithm is considered to have a good algorithm.
– Such problems are called feasible or tractable.
• NP-complete problems:
– No polynomial-time algorithm has been discovered (intractable problems).
An Asymptotic Lower Bound: Omega

Let f and g be nonnegative functions on the positive integers.
We write
f(n) = Ω(g(n))
and say that
f(n) is of order at least g(n), or
f(n) is omega of g(n), or
g is an asymptotic lower bound for f,
if there exist constants C2 > 0 and N2 such that
f(n) ≥ C2·g(n) for all n ≥ N2.
An Asymptotic Tight Bound: Theta

Let f and g be nonnegative functions on the positive integers.
We write
f(n) = Θ(g(n))
and say that
f(n) is of order g(n), or
f(n) is theta of g(n), or
g is an asymptotic tight bound for f,
if
f(n) = O(g(n)) and f(n) = Ω(g(n)).