Asymptotic Notations


Big O notation
Let f and g be nonnegative functions on the positive integers.

We write
f(n) = O(g(n))
and say that
f(n) is of order at most g(n), or
f(n) is big oh of g(n), or
g is an asymptotic upper bound for f,
if there exist constants C1 > 0 and N1 such that
f(n) <= C1*g(n) for all n >= N1.
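The definition can be spot-checked numerically: given candidate witnesses C1 and N1, verify f(n) <= C1*g(n) over a finite range of n. A finite check is evidence, not a proof; `check_big_o` is an illustrative helper, not a standard function.

```python
def check_big_o(f, g, c1, n1, n_max=1000):
    """True if f(n) <= c1*g(n) for every n with n1 <= n <= n_max."""
    return all(f(n) <= c1 * g(n) for n in range(n1, n_max + 1))

# Example 4 from the slides: 60n^2 + 5n + 1 = O(n^2) with C1 = 66, N1 = 1.
print(check_big_o(lambda n: 60*n**2 + 5*n + 1, lambda n: n**2, 66, 1))   # True

# Example 3 from the slides: 2^(2n) is NOT O(2^n); no constant works.
print(check_big_o(lambda n: 2**(2*n), lambda n: 2**n, 1000, 1, 100))     # False
```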
Example 1:
Is n = O(2^n)?
We need f(n) <= c1*g(n) for all n >= N1 and some constant c1 > 0:
n <= c1*2^n holds with c1 = 1 for all n >= 1 — yes, always.

Example 2:
Is 2^(n+1) = O(2^n)?
We need f(n) <= c1*g(n) for all n >= N1 and some constant c1 > 0:
2^(n+1) = 2*2^n <= c1*2^n — yes, for c1 >= 2 and all n.

Example 3:
Is 2^(2n) = O(2^n)?
We need f(n) <= c1*g(n) for all n >= N1 and some constant c1 > 0:
2^(2n) <= c1*2^n means 2^n*2^n <= c1*2^n, i.e. 2^n <= c1 — no constant satisfies this for all n, so no.

Theorem:
Let p(n) = a_k*n^k + a_(k-1)*n^(k-1) + … + a_1*n + a_0
be a polynomial of degree k with p(n) >= 0 for all n.
Then p(n) = Θ(n^k).

Example 4:
Is 60n^2 + 5n + 1 = O(n^2)?
We need f(n) <= c1*g(n) for all n >= N1 and some constant c1 > 0:
60n^2 + 5n + 1 <= c1*n^2
— yes, for c1 = 66 and n >= 1 (since 5n <= 5n^2 and 1 <= n^2).

Notation

O(g(n)) is a set of functions.

But it is common to abuse notation, writing
T(n) = O(…)
instead of T(n) ∈ O(…),
as well as T(n) = f(n) + O(…).

2n + 13 ∈ O(?) — O(n), but also O(n^2), …; we can always weaken the bound.
2^n is in O(2^n), but not in O(n).

T(n) = 32n^2 + 17n + 32

We can ignore the multiplicative constants and the lower-order terms:
T(n) = O(n^2) and O(n^3), but not O(n).
NOTE: O(n^2) is a tight bound; O(n^3) is not tight.

Example
f(n) = 5n^3, so f(n) = O(n^3)
g(n) = 3n^2, so g(n) = O(n^3)
but f(n) is not equal to g(n) — the "=" in Big-Oh statements is not a true equality.

Properties of Big Oh
If f1(n) = O(g1(n)) and f2(n) = O(g2(n)), then
f1(n) + f2(n) = O(max(g1(n), g2(n))).

If f1(n) = O(g1(n)) and f2(n) = O(g2(n)), then
f1(n) × f2(n) = O(g1(n) × g2(n)).

If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)).

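The sum rule can be spot-checked numerically: if f1(n) <= c1*g1(n) and f2(n) <= c2*g2(n), then (c1 + c2) is a witness constant for f1 + f2 against max(g1, g2). The specific functions below are illustrative choices, not from the slides.

```python
# Sum rule witness: f1 = 3n^2 = O(n^2) with c1 = 3, f2 = 5n = O(n) with c2 = 5;
# then f1(n) + f2(n) <= (3 + 5) * max(n^2, n) for all n >= 1.
f1 = lambda n: 3*n**2
f2 = lambda n: 5*n
c = 3 + 5

ok = all(f1(n) + f2(n) <= c * max(n**2, n) for n in range(1, 1000))
print(ok)  # True
```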
Order of growth
Suppose you have analyzed two algorithms and expressed their run times in terms of the size of the input:
– Algorithm A takes 100n + 1 steps to solve a problem of size n;
– Algorithm B takes n^2 + n + 1 steps.

The leading term is the term with the highest exponent.
The following table shows the run time of these algorithms for different problem sizes:

Input size n | Algorithm A: 100n + 1 steps | Algorithm B: n^2 + n + 1 steps
10           | 1 001                       | 111
100          | 10 001                      | 10 101
1 000        | 100 001                     | 1 001 001
10 000       | 1 000 001                   | 100 010 001
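The table above can be reproduced directly from the two step-count formulas:

```python
def steps_a(n):
    return 100 * n + 1      # Algorithm A

def steps_b(n):
    return n**2 + n + 1     # Algorithm B

# Print one table row per input size.
for n in (10, 100, 1000, 10000):
    print(n, steps_a(n), steps_b(n))
```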
Notes
• At n = 10, Algorithm A looks bad.
– For Algorithm A, the leading term has a large coefficient, 100, which is why B does better than A for small n.

• But for n = 100 they are about the same,
• and for larger values of n, A is much better.

– Any function that contains an n^2 term will grow faster than a function whose leading term is n.
– Even if the run time of Algorithm A were n + 1 000 000, it would still be better than Algorithm B for sufficiently large n.
How to compare algorithms?

• For large problems, we expect an algorithm with a smaller leading term to be a better algorithm,
• but for smaller problems, there may be a crossover point where another algorithm is better.
– The location of the crossover point depends on the details of the algorithms, the inputs, and the hardware.
How to compare algorithms?

• If two algorithms have the same leading-order term, it is hard to say which is better; the answer will depend on the details.
– They are considered equivalent, even if they have different coefficients.
Order of growth
An order of growth is a set of functions whose growth is considered equivalent.
Examples:
– 2n, 100n, and n + 1 belong to the same order of growth, which is written O(n) in "Big-Oh notation".
– All functions with the leading term n^2 belong to O(n^2).
– What is the order of growth of n^3 + n^2?
– What about 1 000 000*n^3 + n^2? What about n^3 + 1 000 000*n^2?
– What is the order of growth of (n^2 + n) × (n + 1)?
The following table shows some of the orders of growth that appear most commonly in algorithmic analysis, in increasing order of badness.

Order of growth   | Name
O(1)              | constant
O(log_b n)        | logarithmic (for any b)
O(n)              | linear
O(n log_b n)      | "en log en"
O(n^2)            | quadratic
O(n^3)            | cubic
O(n^i) for some i | polynomial
O(c^n)            | exponential (for any c)
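The ordering in the table can be illustrated numerically: evaluating each growth function at a single moderately large n (here 64, an arbitrary choice) already yields values in strictly increasing order.

```python
import math

# One representative function per row of the table, top to bottom.
growth = [
    lambda n: 1,                  # O(1)
    lambda n: math.log2(n),       # O(log n)
    lambda n: n,                  # O(n)
    lambda n: n * math.log2(n),   # O(n log n)
    lambda n: n**2,               # O(n^2)
    lambda n: n**3,               # O(n^3)
    lambda n: 2**n,               # O(2^n)
]

n = 64
values = [f(n) for f in growth]
print(values == sorted(values))  # True: already in increasing order at n = 64
```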
Asymptotic Analysis of Algorithms
(asymptotic = for large n)

Big oh expressions greatly simplify the analysis of the running time of algorithms:
– all that we get is an upper bound on the running time of the algorithm;
– the result does not depend upon the values of the constants;
– the result does not depend upon the characteristics of the computer and compiler actually used to execute the program!
Conventions for Writing Big Oh Expressions (Tight Bound)

• Ignore the multiplicative constants.
– Instead of writing O(3n^2), we simply write O(n^2).
– If the function is constant (e.g. O(1024)), we write O(1).

• Ignore the lower-order terms.
– Instead of writing O(n log n + n + n^2), we simply write O(n^2).
Examples

T(n) = 32n^2 + 17n + 32
T(n) = O(n^2) and O(n^3), but not O(n).
NOTE: O(n^2) is a tight bound; O(n^3) is not tight.

– n, n + 1, n + 80, 40n, and n + log n are all O(n)
– n^2 + 10 000 000 000*n is O(n^2)
– 3n^2 + 6n + log n + 24.5 is O(n^2)
e.g.6 (revisited):

Horner(int a[], n, x){
    result = a[n]                    // c1        O(1)
    for (i = n-1; i >= 0; --i)       // c2(n+1)   O(n)
        result = result*x + a[i]     // c3*n      O(n)
    return result                    // c4        O(1)
}

T(n) = c1 + c2(n+1) + c3*n + c4
T(n) = a + b*n for some constants a, b
T(n) = O(n)
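The Horner pseudocode can be sketched as runnable Python; it evaluates p(x) = a[n]*x^n + … + a[1]*x + a[0] with one multiplication and one addition per coefficient, hence O(n). The example polynomial is an illustrative choice.

```python
def horner(a, x):
    """Evaluate the polynomial with coefficients a (a[i] goes with x^i) at x."""
    result = a[-1]                        # result = a[n]
    for i in range(len(a) - 2, -1, -1):   # for i = n-1 down to 0
        result = result * x + a[i]
    return result

# p(x) = 2x^2 + 3x + 1 at x = 4: 2*16 + 3*4 + 1 = 45
print(horner([1, 3, 2], 4))  # 45
```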
e.g.8 (revisited): asymptotic analysis

fun(int x, int n){
    sum = 0                       // O(1)
    for (i = 0; i < n; ++i) {     // O(n)
        p = 1                     // O(n)
        for (j = 0; j < n; ++j)   // O(n^2)
            p *= x                // O(n^2)
        sum += p                  // O(n)
    }
}

T(n) = O(n^2)
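A runnable sketch of the same nested loop: the inner loop recomputes x^n from scratch on every outer iteration, so `p *= x` executes n*n times, which is why T(n) = O(n^2).

```python
def fun(x, n):
    total = 0
    for i in range(n):
        p = 1
        for j in range(n):   # recomputes x^n every time: the O(n^2) culprit
            p *= x
        total += p
    return total             # equals n * x^n

print(fun(2, 3))  # 3 * 2^3 = 24
```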
e.g.10 (revisited): Sequential search
• The worst case time: T(n) = a*n + b
• The best case time: T(n) = constant

Obtain an asymptotic O bound on the solution:
• The worst case is O(n)
• The best case is O(1)
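A minimal sketch of sequential search makes the two cases concrete: one comparison when the target is first (best case, O(1)), n comparisons when it is absent or last (worst case, O(n)). The sample data is illustrative.

```python
def sequential_search(items, target):
    """Return the index of target in items, or -1 if absent."""
    for i, item in enumerate(items):
        if item == target:
            return i        # best case: found at index 0 after one comparison
    return -1               # worst case: all n elements compared

data = [7, 3, 9, 4]
print(sequential_search(data, 7))   # 0
print(sequential_search(data, 5))   # -1
```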
e.g.11: find an O-notation in terms of n for the number of times the statement x = x + 1 is executed in the segment:

for (i = 1 to n)
    for (j = 1 to i)
        x = x + 1

i    j runs over    executions
1    1..1           1
2    1..2           2
3    1..3           3
:
n    1..n           n

c(n) = 1 + 2 + 3 + … + n = n(n+1)/2, so the count is O(n^2).
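The count for e.g.11 can be verified by actually running the loop and comparing against the closed form n(n+1)/2:

```python
def count_increments(n):
    """Execute the nested loop of e.g.11 and return how often x = x + 1 ran."""
    x = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            x = x + 1
    return x

n = 100
print(count_increments(n) == n * (n + 1) // 2)  # True
```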
e.g.12: find an O-notation in terms of n for the number of times the statement x = x + 1 is executed in the segment:

j = n
while (j >= 1){
    for (i = 1 to j)
        x = x + 1
    j = j / 2
}

j           i runs over      executions
n           1..n             n
n/2         1..n/2           n/2
n/4         1..n/4           n/4
:
n/2^(k-1)   1..n/2^(k-1)     n/2^(k-1)

c(n) = n + n/2 + n/4 + … + n/2^(k-1)
     = n(1 + 1/2 + 1/2^2 + … + 1/2^(k-1))
     <= n * 1/(1 - 1/2) = 2n,
using the geometric series: if 0 <= A < 1, then the sum of A^i over i >= 0 is 1/(1 - A).
So the count is O(n).
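The 2n bound for e.g.12 can likewise be checked by running the halving loop (with integer division standing in for j = j/2):

```python
def count_halving(n):
    """Execute the halving loop of e.g.12 and return how often x = x + 1 ran."""
    x = 0
    j = n
    while j >= 1:
        for i in range(1, j + 1):
            x = x + 1
        j = j // 2            # j = j/2 with integer division
    return x

for n in (10, 100, 1000):
    print(n, count_halving(n), count_halving(n) <= 2 * n)  # always within 2n
```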
e.g.7 (revisited): obtain an asymptotic O bound for recursive functions
⇒ solve the recurrence relation by repeated substitution.

T(n) = a            if n = 0
T(n) = T(n-1) + b   if n > 0
for some constants a, b.

Repeated substitution gives T(n) = T(n-1) + b = T(n-2) + 2b = … = T(0) + n*b = a + b*n, so T(n) = O(n).
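The closed form obtained by repeated substitution can be checked directly against the recurrence for sample constants (a = 3, b = 5 are arbitrary choices):

```python
def T(n, a, b):
    """The recurrence of e.g.7: T(0) = a, T(n) = T(n-1) + b for n > 0."""
    return a if n == 0 else T(n - 1, a, b) + b

a, b = 3, 5
print(all(T(n, a, b) == a + b * n for n in range(50)))  # True: T(n) = a + b*n
```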
NOTES
• A problem that has a worst-case polynomial-time algorithm is considered to have a good algorithm.
– Such problems are called feasible or tractable.

• A problem that does not have a worst-case polynomial-time algorithm is said to be intractable.

• NP-complete problems:
– no polynomial-time algorithm has been discovered for them (they are believed to be intractable).
An Asymptotic Lower Bound: Omega
Let f and g be nonnegative functions on the positive integers.
We write
f(n) = Ω(g(n))
and say that
f(n) is of order at least g(n), or
f(n) is omega of g(n), or
g is an asymptotic lower bound for f,
if there exist constants C2 > 0 and N2 such that
f(n) >= C2*g(n) for all n >= N2.
An Asymptotic Tight Bound: Theta
Let f and g be nonnegative functions on the positive integers.
We write
f(n) = Θ(g(n))
and say that
f(n) is of order g(n), or
f(n) is theta of g(n), or
g is an asymptotic tight bound for f,
if
f(n) = O(g(n)) and f(n) = Ω(g(n)).
