Asymptotic Analysis and Growth of Functions

Analysis and Design of Algorithms

• Growth of Functions
• Asymptotic Notations
• Running Time Calculations

copyright - Design and Analysis of Algorithms

Dr. Debopam Acharya
Contents

• After completion of this presentation, you will be able to:

  • Understand growth of functions and their performance analysis
  • Understand time complexity
  • Understand asymptotic notations
  • Compute the running time of an algorithm or a given block of code

ALGORITHM ANALYSIS

• Many factors contribute to the running time of an algorithm:

  • Speed of the central processing unit, data bus type, and other hardware
  • Design, programming, and testing time taken
  • Programming language and the developer's coding proficiency
  • Input to the program
  • Type of platform - desktop, web, or mobile
  • Operating system type and version

ALGORITHM ANALYSIS
• There are two steps involved in algorithm analysis:

1. Find T(n), the running time of the algorithm in terms of the input size n. It is either given or has to be computed from the given block of code/pseudocode.

2. Use T(n) to perform asymptotic analysis.

• We will first discuss step 2, using T(n) to perform asymptotic analysis. Later, we will discuss step 1, finding T(n) from a given block of code/pseudocode.

ALGORITHM ANALYSIS
• An algorithm's performance can be estimated by the number of operations it performs for a given input size.

• Hence, for an input size n, we express the running time of an algorithm as a function T(n).

• Using different inputs, we observe the running time T(n) and compare its graph against a standard function whose growth rate we already know, noting where the graphs cross.

• This is called the asymptotic growth rate of a function: we observe the growth rate of the function as n becomes very large, i.e., approaches infinity.

ALGORITHM ANALYSIS

• Hence, with the help of simple standard functions, we can asymptotically compare and approximate a given function.

• By comparing the growth rate of our function with that of a standard function, we understand how well the standard function approximates ours.

• So we can estimate and compare the running times of two functions by observing their growth rates. This saves time, as we do not have to write programs to implement the functions every time.

The Big O Notation (O)
• The Big O symbol was introduced by Paul Bachmann in 1894 to describe the relative growth rate of two given functions based on their asymptotic behavior.

• Definition of O: Assume T and F are two given functions.

• Then T(n) = O(F(n)) if there exist a natural number n0 and a positive real constant c (which may depend on n0) such that for all natural numbers n > n0, T(n) <= c*F(n).

• For example, if T(n) = 10n and F(n) = n^2, take n0 = 10 and c = 1.

  Then T(n) <= 1*F(n) for all n > n0, and so we can argue that T(n) is Big O of F(n), written T(n) = O(F(n)).

The Big O Notation (O)
• Examples:

• 4n + 2 = O(n), since 4n + 2 <= 5n for all n >= 2

• 3n^2 + 2n + 1 = O(n^2), since 3n^2 + 2n + 1 <= 4n^2 for all n >= 3

• 4n + 3 != O(1), since 4n + 3 is not <= any constant c for all n >= n0

The Big O Notation (O)
• The statement T(n) = O(F(n)) hence means that F(n) is an upper bound on the value of T(n) for all n >= n0.

• However, it does not show how good this upper bound is.

• So for the statement T(n) = O(F(n)) to be informative, F(n) should be as small a function of n (as tight a bound) as possible for which T(n) = O(F(n)) still holds.

• So it is appropriate to say that 4n + 2 = O(n).

• However, if we say 4n + 2 = O(n^2), this is a loose upper bound with a quadratic function, even though the statement 4n + 2 = O(n^2) is correct.

The Big O Notation (O)
Some other rules to remember:
• Constants can be ignored: O(3n^2) = O(n^2)

• Lower-order terms in a polynomial function can be ignored; we compare only relative growth.

• Example: O(n^3 + 3n^2 + 4) = O(n^3)

Some Standard Functions: Logarithmic function

• A function T(n) is of at most logarithmic growth if T(n) = O(log n).

• This growth is common in programs that solve a big problem by transforming it into a smaller one, cutting the size by some constant fraction (recursive sorting, searching, etc.).

• Whenever the value of n doubles, log n increases by a constant amount; log n does not double until n becomes n^2.

Some Standard Functions: Linear function

• A function T(n) is of at most linear growth if T(n) = O(n).

• As the value of n increases, the running time T(n) grows in the same proportion: doubling n more or less doubles the running time of the algorithm.

• Typical of standard loops where n inputs produce n outputs (e.g., n operations in a loop from 1 to n).

Some Standard Functions: Quadratic function

• A function T(n) is of at most quadratic growth if T(n) = O(n^2).

• This arises when an algorithm processes two variables in a nested loop. So when n doubles, the running time of the algorithm increases fourfold.

• When n = 1000, the running time of the algorithm is 1 million units of time.

Some Standard Functions: Polynomial function

• A function T(n) is of at most polynomial growth if T(n) = O(n^m) for a natural number m > 1.

• For larger values of m, polynomial growth is practical only for small problems.

Some Standard Functions: Exponential function

• A function T(n) is of at most exponential growth if there exists a constant c > 1 such that T(n) = O(c^n).

• As n increases, T(n) heads towards infinity.

Some Standard Functions: Factorial function

• A function T(n) is of at most factorial growth if T(n) = O(n!).

Some Standard Functions: Constant function

• A function f(n) has constant running time if the size of the input n has no effect on the running time of the algorithm.

• The running time of such an algorithm is T(n) = c, where c is a constant.

Running time for various algorithms

Blow-up of the bottom left corner of the previous graph

The Omega Notation (Ω)
• Definition of Ω: Assume T and F are two given functions.

• Then T(n) = Ω(F(n)) if there exist a natural number n0 and a positive real constant c (which may depend on n0) such that for all natural numbers n > n0, T(n) >= c*F(n).

• In simple terms, T(n) = Ω(F(n)) means F(n) is a lower bound on T(n).

The Omega Notation (Ω)
• Example:
If we take T(n) = 3n + 2 and F(n) = n, then
3n + 2 = Ω(n) can be shown if we can find c and n0
such that 3n + 2 >= c*n for all n >= n0.

This is true for c = 1 and n0 = 1.

The Theta Notation (θ)
• Definition of θ: T(n) = θ(F(n)) if there exist positive real constants c1, c2 and a natural number n0 such that c1*F(n) <= T(n) <= c2*F(n) for all n >= n0.
The Theta Notation (θ)
• T(n) = θ(F(n)) if and only if T(n) = O(F(n)) and T(n) = Ω(F(n))

• In simple terms, T(n) = θ(F(n)) means the growth rate of T(n) equals the growth rate of F(n)

The Little o Notation (o)

• T(n) = o(F(n)) if T(n) = O(F(n)) and T(n) != θ(F(n))

• Little o notation means that function T's growth rate is strictly less than that of function F.

• Example:
T(n) = log(n) and F(n) = n^2, so log(n) = o(n^2)

Summary of behavior of Asymptotic Notations

• Big O represents the asymptotic behavior of a function by a <= relationship
• Big Theta represents the asymptotic behavior of a function by an = relationship
• Omega represents the asymptotic behavior of a function by a >= relationship
• Little o represents the asymptotic behavior of a function by a strict < relationship

Some Theorems Regarding Notations

• If T1 = O(F) and T2 = O(G), where all functions take positive real values, then

• (a) T1 + T2 = O(max(F, G))
• (b) T1 * T2 = O(F * G)

• If T1(y) is a polynomial of degree n, then T1(y) = θ(y^n)

Some Theorems Regarding Notations

• If F(n) is O(G(n)) and G(n) is O(H(n)), then F(n) is O(H(n))

• If F(n) is O(c*G(n)) for any constant c > 0, then F(n) is O(G(n))

Points to remember for Big O Notation

• O(G(n)) denotes the set of all functions F(n) such that F(n) <= c * G(n) for some constant c > 0.

• Hence, when we say "F(n) is O(G(n))", this means F(n) is a member of the set of functions O(G(n)).

• This in turn means we can state things such as: n^2 + O(n) is O(n^2).

• In general, we can make statements such as: for all c > 0, there is a constant d > 0 such that n^2 + cn <= d*n^2 for all n > 0.

Some Examples
• 4n^3 + 2n^2 + n
= 4n^3 + 2n^2 + O(n)
= 4n^3 + O(n^2 + n)
= 4n^3 + O(n^2)
= O(n^3)
which is also (loosely) O(n^4)

• 4n^3 + 2n^2 + n
= 4n^3 + 2n^2 + O(n)
= 4n^3 + θ(n^2 + n)
= 4n^3 + θ(n^2)
= θ(n^3)

Some other rules to remember on Algorithmic Analysis
• Constants and lower-order terms can be ignored.

• For example, if F(n) = 3n^2, then F(n) = O(n^2).

• The most important quantity to analyze is the running time of an algorithm.

• Other factors that affect performance are (a) the algorithm used and (b) the kind of input to the algorithm.

• The parameter n, usually the number of input data items, affects the running time most significantly.

Steps involved in Algorithmic Analysis
• There are two steps involved in algorithm analysis:

1. Find T(n), the running time of the algorithm in terms of the input size n. It is either given or has to be computed from the given block of code/pseudocode.

2. Use T(n) to perform asymptotic analysis.

• We have already discussed step 2, using T(n) to perform asymptotic analysis. Now we discuss step 1, finding T(n) from a given block of code/pseudocode.

• An important step in this process is to review summation series and revisit important functions.

Common Functions
• Exponential functions:

  a^0 = 1
  a^1 = a
  a^(-1) = 1/a
  (a^m)^n = a^(mn)
  a^m * a^n = a^(m+n)
Common Functions
• Logarithmic functions:

  x = log_b a is the exponent for which a = b^x.
  Natural log: ln a = log_e a
  Binary log: lg a = log_2 a
  lg^2 a = (lg a)^2
  lg lg a = lg(lg a)

  log_c(ab) = log_c a + log_c b
  log_b a^n = n log_b a
  log_b a = log_c a / log_c b
  log_b(1/a) = -log_b a
  log_b a = 1 / (log_a b)
  a^(log_b c) = c^(log_b a)

Common Functions
• If the base of a logarithm is changed from one constant to another, the value is altered only by a constant factor.
• Ex: log_10 n * log_2 10 = log_2 n.
• Hence the base of a logarithm is not an issue in asymptotic notation.

• lg(n!) = θ(n lg n)
• This can be proven using Stirling's approximation for n!.

Summations
• Constant series: for integers a and b with a <= b,

  sum_{i=a}^{b} 1 = b - a + 1

• Linear series (arithmetic series): for n >= 0,

  sum_{i=1}^{n} i = 1 + 2 + ... + n = n(n+1)/2

• Quadratic series: for n >= 0,

  sum_{i=1}^{n} i^2 = 1^2 + 2^2 + ... + n^2 = n(n+1)(2n+1)/6

Summations

• Cubic series: for n >= 0,

  sum_{i=1}^{n} i^3 = 1^3 + 2^3 + ... + n^3 = n^2 (n+1)^2 / 4

• Geometric series: for real x != 1,

  sum_{k=0}^{n} x^k = 1 + x + x^2 + ... + x^n = (x^(n+1) - 1) / (x - 1)

  For |x| < 1,

  sum_{k=0}^{infinity} x^k = 1 / (1 - x)

Calculate Running Time of an Algorithm/Pseudocode

• To calculate the running time of an algorithm, a program fragment, or pseudocode, we consider the key and basic operations involved in executing the algorithm.

• Depending upon the algorithm, this is usually

  a) the number of comparisons, and/or
  b) the number of mathematical operations performed, mostly in statements like loops and the basic operations within those loops.

• Assumptions for this process:

  • Replace loops by suitable summations
  • Count integer operations in the program, with each operation as one unit of time
  • Ignore array accesses
  • Ignore assignment operations

Calculate Running Time of an Algorithm/Pseudocode

• Example 1: Find the running time T(n) of this block of pseudocode, under the standard assumptions:

  for (int i = 0; i < n+1; i++)
  {
      sum = sum + i * A[i];
      for (int j = 0; j <= i; j++)
      {
          sum = sum + j * A[n-j];
      }
      for (int k = 1; k <= n; k++)
      {
          sum = sum * i + A[k] * A[i];
      }
  }

Calculate Running Time of an Algorithm/Pseudocode

• Example 1: Find the running time T(n) of this block of pseudocode, under the standard assumptions:

  for (int i = 0; i < n+1; i++)        // sum over i = 0 to n
  {
      sum = sum + i * A[i];            // 2 mathematical operations in the loop
      for (int j = 0; j <= i; j++)     // sum over j = 0 to i
      {
          sum = sum + j * A[n-j];      // 2 mathematical operations in the loop
      }
      for (int k = 1; k <= n; k++)     // sum over k = 1 to n
      {
          sum = sum * i + A[k] * A[i]; // 3 mathematical operations in the loop
      }
  }
Calculate Running Time of an Algorithm/Pseudocode

  T(n) = sum_{i=0}^{n} ( 2 + sum_{j=0}^{i} 2 + sum_{k=1}^{n} 3 )

Calculate Running Time of an Algorithm/Pseudocode
• To solve for T(n), we simplify the series as follows:

  T(n) = sum_{i=0}^{n} ( 2 + 2(i+1) + 3n )

       = sum_{i=0}^{n} ( 4 + 2i + 3n )

       = 4(n+1) + 2 * sum_{i=0}^{n} i + 3n(n+1)

       = 4n + 4 + 2( n(n+1)/2 ) + 3n^2 + 3n

  T(n) = 4n^2 + 8n + 4

Calculate Running Time of an Algorithm/Pseudocode

• Example 2: What is the complexity of the following fragment, where T(n) is the count of additions to the variable 'sum' in the fragment?

  for (int i = 0; i < n+1; i++) do
      for (int j = 0; j <= i; j++) do
          sum = sum + j + mysub(j);
      end-for(j)
  end-for(i)

  Given that the complexity of the subroutine is Tmysub(n) = n + 3 in all cases (i.e., best, average, and worst case).

• Solution: If the code fragment contains a function call, then the time of the call must be added in as well.

• Tmysub(n) = n + 3 implies Tmysub(j) = j + 3.

• Hence, the running time of this block of code translates to

Calculate Running Time of an Algorithm/Pseudocode

  T(n) = sum_{i=0}^{n} sum_{j=0}^{i} (3)

Calculate Running Time of an Algorithm/Pseudocode
Example 3:

  Algo(A, n)
      mxsum ← 0
      for i ← 1 to n
          do for j ← i to n
              sum ← 0
              for k ← i to j
                  do sum += A[k]
              mxsum ← max(sum, mxsum)
      return mxsum

Calculate Running Time of an Algorithm/Pseudocode
Example 3:

  Algo(A, n)
      mxsum ← 0
      for i ← 1 to n                     // sum over i = 1 to n
          do for j ← i to n              // sum over j = i to n
              sum ← 0
              for k ← i to j             // sum over k = i to j
                  do sum += A[k]
              mxsum ← max(sum, mxsum)
      return mxsum

  T(n) = sum_{i=1}^{n} sum_{j=i}^{n} sum_{k=i}^{j} 1

Estimating running time - by inspection
• An inspection approach with informal counting can also be used to determine the running time of an algorithm. Some general rules are as follows:

• Consecutive statements
  The maximum-cost statement is the one counted,
  e.g., a fragment with a single for-loop followed by a double for-loop is O(n^2).

• If/Else statements
  For the fragment:

      if cond then
          S1
      else
          S2

  the running time is never more than the running time of the test plus the larger of the running times of S1 and S2.

We covered the following topics in this presentation:

• Growth of Functions
• Asymptotic Notations
• Running Time Calculations

