
Analysis of Algorithms

Asymptotic Analysis
Analysis of Algorithms
• An algorithm is a finite set of precise instructions for performing a computation or for solving a problem.
• What is the goal of analysis of algorithms?
– To compare algorithms, mainly in terms of running time but also in terms of other factors (e.g., memory requirements, programmer's effort, etc.)
• What do we mean by running time analysis?
– Determine how the running time increases as the size of the problem increases.

2
Input Size

• Input size (number of elements in the input)
– size of an array
– polynomial degree
– # of elements in a matrix
– # of bits in the binary representation of the input
– vertices and edges in a graph

3
Types of Analysis
• Worst case
– Provides an upper bound on running time
– An absolute guarantee that the algorithm will not run longer, no matter what the input is
• Best case
– Provides a lower bound on running time
– Input is the one for which the algorithm runs the fastest

Lower Bound ≤ Running Time ≤ Upper Bound

• Average case
– Provides a prediction about the running time
– Assumes that the input is random
4
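As a concrete illustration (a minimal C sketch of my own, not from the slides), linear search exhibits exactly this split: the key at index 0 is the best case (one comparison), a missing key is the worst case (N comparisons), and a random position gives the average case (about N/2 comparisons).

#include <stdio.h>

/* Linear search: returns the index of key in arr, or -1 if absent.
   Best case:  key is at arr[0]  -> 1 comparison.
   Worst case: key is absent     -> N comparisons. */
int linear_search(const int arr[], int N, int key) {
    for (int i = 0; i < N; i++)
        if (arr[i] == key)
            return i;
    return -1;
}

int main(void) {
    int arr[] = {7, 3, 9, 1, 5};
    printf("%d\n", linear_search(arr, 5, 7));  /* best case: found at index 0 */
    printf("%d\n", linear_search(arr, 5, 4));  /* worst case: not found */
    return 0;
}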
How do we compare algorithms?
• We need to define a number of objective measures.

(1) Compare execution times?
Not good: times are specific to a particular computer!

(2) Count the number of statements executed?
Not good: the number of statements varies with the programming language as well as the style of the individual programmer.
5
Ideal Solution

• Express running time as a function of the input size n (i.e., f(n)).
• Compare different functions corresponding to running times.
• Such an analysis is independent of machine speed, programming style, etc.

6
Example
• Associate a "cost" with each statement.
• Find the "total cost" by finding the total number of times each statement is executed.

Algorithm 1                      Cost
arr[0] = 0;                      c1
arr[1] = 0;                      c1
arr[2] = 0;                      c1
...
arr[N-1] = 0;                    c1
-----------
c1 + c1 + ... + c1 = c1 × N

Algorithm 2                      Cost
for(i=0; i<N; i++)               c2
    arr[i] = 0;                  c1
-------------
(N+1) × c2 + N × c1 = (c2 + c1) × N + c2

7
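To make the counting concrete, here is a minimal instrumented C version of Algorithm 2 (my own sketch, not part of the slides); the for loop is unrolled into a while so the test and the body can be tallied separately, matching (N+1) × c2 + N × c1:

#include <stdio.h>
#define N 8

int main(void) {
    int arr[N];
    long tests = 0, assigns = 0;  /* illustrative counters */

    /* Algorithm 2, rewritten so each cost can be tallied */
    int i = 0;
    while (1) {
        tests++;                  /* loop test: runs N+1 times */
        if (!(i < N)) break;
        arr[i] = 0;               /* loop body */
        assigns++;                /* runs N times */
        i++;
    }
    printf("tests = %ld (N+1 = %d), assigns = %ld (N = %d)\n",
           tests, N + 1, assigns, N);
    return 0;
}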
Another Example

Algorithm 3                      Cost
sum = 0;                         c1
for(i=0; i<N; i++)               c2
    for(j=0; j<N; j++)           c2
        sum += arr[i][j];        c3
------------
c1 + c2 × (N+1) + c2 × N × (N+1) + c3 × N²

8
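Likewise, a short C check (an assumed sketch, not from the slides) confirms that the innermost statement of Algorithm 3 runs N² times, which is why the N² term dominates the total cost:

#include <stdio.h>
#define N 10

int main(void) {
    static int arr[N][N];          /* zero-initialized by 'static' */
    long sum = 0, body = 0;

    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++) {
            sum += arr[i][j];
            body++;                /* innermost statement */
        }
    printf("sum = %ld, body executed %ld times (N^2 = %d)\n", sum, body, N * N);
    return 0;
}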
Asymptotic Analysis
• To compare two algorithms with running times f(n) and g(n), we need a rough measure that characterizes how fast each function grows.
• Hint: use rate of growth.
• Compare functions in the limit, that is, asymptotically! (i.e., for large values of n)

9
Rate of Growth
• Consider the example of buying elephants and fish:
Cost: cost_of_elephants + cost_of_fish
Cost ~ cost_of_elephants (approximation)
• The low-order terms in a function are relatively insignificant for large n:
n⁴ + 100n² + 10n + 50 ~ n⁴
i.e., we say that n⁴ + 100n² + 10n + 50 and n⁴ have the same rate of growth.

10
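A quick numerical check of this claim (sample values of my choosing): the ratio of the full polynomial to its leading term n⁴ tends to 1 as n grows, which is exactly what "same rate of growth" means.

#include <stdio.h>

int main(void) {
    for (double n = 10; n <= 1e6; n *= 10) {
        double g = n * n * n * n;                  /* leading term n^4 */
        double f = g + 100 * n * n + 10 * n + 50;  /* full polynomial  */
        printf("n = %8.0f   f(n)/g(n) = %.6f\n", n, f / g);
    }
    return 0;
}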
Common orders of magnitude

[Figures comparing the growth of common complexity functions]

11-12
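Since the original table did not survive extraction, the following C program (a stand-in of my own, not the slide's content) prints a comparable table for a few common growth functions:

#include <stdio.h>
#include <math.h>
/* compile with: cc growth.c -lm */

int main(void) {
    printf("%6s %10s %12s %10s %22s\n", "n", "log n", "n log n", "n^2", "2^n");
    for (double n = 2; n <= 64; n *= 2) {
        printf("%6.0f %10.2f %12.1f %10.0f %22.0f\n",
               n, log2(n), n * log2(n), n * n, pow(2, n));
    }
    return 0;
}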
Asymptotic Notation

• O-notation: asymptotic "less than":
– f(n) = O(g(n)) implies: f(n) "≤" g(n)

• Ω-notation: asymptotic "greater than":
– f(n) = Ω(g(n)) implies: f(n) "≥" g(n)

• Θ-notation: asymptotic "equality":
– f(n) = Θ(g(n)) implies: f(n) "=" g(n)

13
Development of Notation

• Not concerned with small values of n
• Concerned with VERY LARGE values of n
• Asymptotic – refers to the study of a function f as n approaches infinity

• Example: f(n) = n² + 4n + 20
n² is the dominant term, and the term 4n + 20 becomes insignificant as n grows larger

14
Big-O Notation

• We say f_A(n) = 30n + 8 is order n, or O(n).
It is, at most, roughly proportional to n.
• f_B(n) = n² + 1 is order n², or O(n²). It is, at most, roughly proportional to n².
• In general, any function of order exactly n² (i.e., Θ(n²)) eventually grows faster than any function of order exactly n.

15
Visualizing Orders of Growth
• On a graph, as you go to the right, a faster-growing function eventually becomes larger...

[Graph: value of function vs. increasing n; f_B(n) = n² + 1 eventually overtakes f_A(n) = 30n + 8]

16
More Examples …

• n⁴ + 100n² + 10n + 50 is O(n⁴)
• 10n³ + 2n² is O(n³)
• n³ - n² is O(n³)
• constants
– 10 is O(1)
– 1273 is O(1)

17
Back to Our Example
Algorithm 1                      Cost
arr[0] = 0;                      c1
arr[1] = 0;                      c1
arr[2] = 0;                      c1
...
arr[N-1] = 0;                    c1
-----------
c1 + c1 + ... + c1 = c1 × N

Algorithm 2                      Cost
for(i=0; i<N; i++)               c2
    arr[i] = 0;                  c1
-------------
(N+1) × c2 + N × c1 = (c2 + c1) × N + c2

• Both algorithms are of the same order: O(N)

18
Example (cont’d)

Algorithm 3                      Cost
sum = 0;                         c1
for(i=0; i<N; i++)               c2
    for(j=0; j<N; j++)           c2
        sum += arr[i][j];        c3
------------
c1 + c2 × (N+1) + c2 × N × (N+1) + c3 × N² = O(N²)

19
Asymptotic notations
• O-notation
O(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }

20
Big-O Visualization

O(g(n)) is the set of functions with a smaller or the same order of growth as g(n).
21
Examples

– 2n² = O(n³): 0 ≤ 2n² ≤ cn³ ⟹ 2 ≤ cn ⟹ c = 1 and n₀ = 2

– n² = O(n²): 0 ≤ n² ≤ cn² ⟹ c ≥ 1 ⟹ c = 1 and n₀ = 1

– 1000n² + 1000n = O(n²):
1000n² + 1000n ≤ 1000n² + n² = 1001n² for all n ≥ 1000 ⟹ c = 1001 and n₀ = 1000

– n = O(n²): 0 ≤ n ≤ cn² ⟹ cn ≥ 1 ⟹ c = 1 and n₀ = 1
22
More Examples
• Show that 30n + 8 is O(n).
– Show ∃ c, n₀: 30n + 8 ≤ cn, ∀ n > n₀.
• Let c = 31, n₀ = 8. Assume n > n₀ = 8. Then
cn = 31n = 30n + n > 30n + 8, so 30n + 8 < cn.

23
Big-O example, graphically
• Note 30n + 8 isn't less than n anywhere (n > 0).
• It isn't even less than 31n everywhere.
• But it is less than 31n everywhere to the right of n = 8.

[Graph: value of function vs. increasing n; the line cn = 31n lies above 30n + 8 for all n > n₀ = 8, witnessing 30n + 8 = O(n)]

24
No Uniqueness
• There is no unique set of values for n₀ and c in proving the asymptotic bounds
• Prove that 100n + 5 = O(n²)
– 100n + 5 ≤ 100n + n = 101n ≤ 101n² for all n ≥ 5
n₀ = 5 and c = 101 is a solution
– 100n + 5 ≤ 100n + 5n = 105n ≤ 105n² for all n ≥ 1
n₀ = 1 and c = 105 is also a solution

Must find SOME constants c and n₀ that satisfy the asymptotic notation relation
25
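The two witness pairs can be sanity-checked by brute force over a finite range of n, as in this minimal C sketch (evidence, not a proof; the function name is my own):

#include <stdio.h>

/* Does 100n + 5 <= c*n^2 hold for every integer n in [n0, n_max]? */
int holds(double c, double n0, double n_max) {
    for (double n = n0; n <= n_max; n++)
        if (100 * n + 5 > c * n * n)
            return 0;
    return 1;
}

int main(void) {
    printf("c = 101, n0 = 5: %s\n", holds(101, 5, 1e6) ? "holds" : "fails");
    printf("c = 105, n0 = 1: %s\n", holds(105, 1, 1e6) ? "holds" : "fails");
    return 0;
}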
Big-O Notation
Some Big-O complexity classes, in order of magnitude from smallest to largest:

O(1)                                   Constant
O(log n)                               Logarithmic
O(n)                                   Linear
O(n log n)                             n log n
O(nˣ)  {e.g., O(n²), O(n³), etc.}      Polynomial
O(aⁿ)  {e.g., O(1.6ⁿ), O(2ⁿ), etc.}    Exponential
O(n!)                                  Factorial
O(nⁿ)

26
Asymptotic notations (cont.)
• Ω-notation
Ω(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀ }

Ω(g(n)) is the set of functions with a larger or the same order of growth as g(n).
27
The Big-Omega Notation

The inverse of Big-O is Ω:

If g(n) = O(f(n)), then f(n) = Ω(g(n))

f(n) grows faster than or at the same rate as g(n): f(n) = Ω(g(n))

28
Examples
– 5n² = Ω(n)
∃ c, n₀ such that: 0 ≤ cn ≤ 5n² ⟹ cn ≤ 5n² ⟹ c = 1 and n₀ = 1

– 100n + 5 ≠ Ω(n²)
∃ c, n₀ such that: 0 ≤ cn² ≤ 100n + 5
100n + 5 ≤ 100n + 5n = 105n (∀ n ≥ 1)
cn² ≤ 105n ⟹ n(cn - 105) ≤ 0; since n is positive ⟹ cn - 105 ≤ 0 ⟹ n ≤ 105/c
⟹ contradiction: the inequality cannot hold for arbitrarily large n

– n = Ω(2n), n³ = Ω(n²), n = Ω(log n)
29
Asymptotic notations (cont.)
• Θ-notation
Θ(g(n)) = { f(n) : there exist positive constants c₁, c₂, and n₀ such that 0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀ }

Θ(g(n)) is the set of functions with the same order of growth as g(n).

30
Examples
– n²/2 - n/2 = Θ(n²)

• ½n² - ½n ≤ ½n² for all n ≥ 0 ⟹ c₂ = ½

• ½n² - ½n ≥ ½n² - ½n·½n = ¼n² for all n ≥ 2 ⟹ c₁ = ¼

– n ≠ Θ(n²): c₁n² ≤ n ≤ c₂n²
⟹ the left inequality only holds for: n ≤ 1/c₁
31
Examples
– 6n³ ≠ Θ(n²): c₁n² ≤ 6n³ ≤ c₂n²
⟹ the right inequality only holds for: n ≤ c₂/6

– n ≠ Θ(log n): c₁ log n ≤ n ≤ c₂ log n
⟹ c₂ ≥ n / log n, ∀ n ≥ n₀ – impossible

32
Relations Between Different Sets
• Subset relations between order-of-growth sets.

[Diagram: within the set of all functions R → R, the sets O(f) and Ω(f) overlap; their intersection is Θ(f), which contains f itself]
33
Little oh

f(n) grows slower than g(n) (or g(n) grows faster than f(n)) if

lim (f(n) / g(n)) = 0 as n → ∞

Notation: f(n) = o(g(n)), pronounced "little oh"

34
Little omega

f(n) grows faster than g(n) (or g(n) grows slower than f(n)) if

lim (f(n) / g(n)) = ∞ as n → ∞

Notation: f(n) = ω(g(n)), pronounced "little omega"

35
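These limit definitions can be explored numerically, as in the sketch below (my own illustration; f(n) = n log n and g(n) = n² are assumed examples). The ratio tending to 0 suggests n log n = o(n²), and by the same token n² = ω(n log n):

#include <stdio.h>
#include <math.h>
/* compile with: cc ratio.c -lm */

int main(void) {
    for (double n = 10; n <= 1e7; n *= 10) {
        double f = n * log2(n);   /* f(n) = n log n */
        double g = n * n;         /* g(n) = n^2     */
        printf("n = %9.0f   f(n)/g(n) = %.8f\n", n, f / g);
    }
    return 0;
}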
Logarithms and properties
• In algorithm analysis we often use the notation "log n" without specifying the base

Binary logarithm:  lg n = log₂ n
Natural logarithm: ln n = logₑ n
lg^k n = (lg n)^k
lg lg n = lg(lg n)

Properties:
log(x^y) = y·log x
log(xy) = log x + log y
log(x/y) = log x - log y
a^(log_b x) = x^(log_b a)
log_b x = log_a x / log_a b   (change of base)
36
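These identities can be spot-checked with <math.h>, as below; the sample values x = 8, y = 3, a = 2, b = 10 are arbitrary choices of mine.

#include <stdio.h>
#include <math.h>
/* compile with: cc logs.c -lm */

int main(void) {
    double x = 8, y = 3, a = 2, b = 10;
    printf("log(x^y) = %.6f, y*log(x)      = %.6f\n", log(pow(x, y)), y * log(x));
    printf("log(xy)  = %.6f, log x + log y = %.6f\n", log(x * y), log(x) + log(y));
    printf("log(x/y) = %.6f, log x - log y = %.6f\n", log(x / y), log(x) - log(y));
    /* change of base: log_b x = log_a x / log_a b, with a = 2 */
    printf("log_10(x) = %.6f, lg x / lg 10 = %.6f\n", log10(x), log2(x) / log2(b));
    /* a^(log_b x) = x^(log_b a), with b = 10 */
    printf("a^(log_b x) = %.6f, x^(log_b a) = %.6f\n",
           pow(a, log10(x)), pow(x, log10(a)));
    return 0;
}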
More Examples
• For each of the following pairs of functions, either f(n) is O(g(n)), f(n) is Ω(g(n)), or f(n) = Θ(g(n)). Determine which relationship is correct.
– f(n) = log n²;        g(n) = log n + 5     f(n) = Θ(g(n))
– f(n) = n;             g(n) = log n²        f(n) = Ω(g(n))
– f(n) = log log n;     g(n) = log n         f(n) = O(g(n))
– f(n) = n;             g(n) = log² n        f(n) = Ω(g(n))
– f(n) = n log n + n;   g(n) = log n         f(n) = Ω(g(n))
– f(n) = 10;            g(n) = log 10        f(n) = Θ(g(n))
– f(n) = 2ⁿ;            g(n) = 10n²          f(n) = Ω(g(n))
– f(n) = 2ⁿ;            g(n) = 3ⁿ            f(n) = O(g(n))
37
Properties
• Theorem:
f(n) = Θ(g(n)) ⟺ f(n) = O(g(n)) and f(n) = Ω(g(n))
• Transitivity:
– f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⟹ f(n) = Θ(h(n))
– Same for O and Ω
• Reflexivity:
– f(n) = Θ(f(n))
– Same for O and Ω
• Symmetry:
– f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n))
• Transpose symmetry:
– f(n) = O(g(n)) if and only if g(n) = Ω(f(n))
38
Asymptotic Notations in Equations
• On the right-hand side
Θ(n²) stands for some anonymous function in the set Θ(n²)
2n² + 3n + 1 = 2n² + Θ(n) means:
there exists a function f(n) ∈ Θ(n) such that
2n² + 3n + 1 = 2n² + f(n)
• On the left-hand side
2n² + Θ(n) = Θ(n²) means:
no matter how the anonymous function is chosen on the left-hand side, there is a way to choose the anonymous function on the right-hand side to make the equation valid.

39
