Asymptotic Analysis and Growth of Functions
• Growth of Functions
• Asymptotic Notations
• Running Time Calculations
Factors that affect the running time of a program:
• Speed of the central processing unit, data bus type, and other hardware
• Design, programming, and testing time taken
• Programming language used and the developer’s coding proficiency
• Input given to the program
• Type of platform – desktop, web, or mobile
• Operating system type and version
1. Find T(n), the running time of an algorithm in terms of the input size n. It is either given or has to be computed from the given block of code/pseudocode.
2. Use T(n) to perform asymptotic analysis.
We will initially discuss step 2, that is, using T(n) to perform asymptotic analysis. Later, we will discuss step 1, how to find T(n) from a given block of code/pseudocode.
Hence, for an input variable n, we calculate the running time of an algorithm as a function
T(n).
Using different inputs, we observe the running time T(n) of our function and compare its graph against that of a standard function whose growth rate we already know, checking where (if anywhere) the two curves cross.
This is called the asymptotic growth rate of a function, that is, observe the growth rate of a
function as the value of n becomes very large, or approaches infinity.
Hence, with the help of simple standard functions, we can asymptotically compare and approximate a given function.
By comparing the growth rate of our function with that of the standard function, we can judge how well the standard function approximates our function.
Thus, we can compare the running times of two algorithms simply by observing the growth rates of their functions. This saves us time, as we do not have to write and run programs every time to compare them.
3n² + 2n + 1 = O(n²), since 3n² + 2n + 1 ≤ 4n² for all n ≥ 3
4n + 3 ≠ O(1), since no constant c satisfies 4n + 3 ≤ c for all n ≥ n0
However, it doesn’t really show how good this upper bound is.
So, when we say T(n) = O(F(n)), to be informative, to be really effective, F(n) should be the smallest (tightest) function of n for which T(n) = O(F(n)) holds.
However, if we say 4n + 2 = O(n²), this is a loose upper bound with a quadratic function, even though the statement 4n + 2 = O(n²) is correct.
Lower order terms in a polynomial function can be ignored, and we only compare relative
growth.
This function commonly appears in programs that solve a big problem by transforming it into a smaller one, cutting the size by some constant fraction at each step (recursive sorting, searching, etc.).
Whenever the value of n doubles, log n increases by a constant amount, but log n does not double until n becomes n².
As the value of n increases, running time T(n) grows in the same proportion.
Doubling n more or less doubles the running time of the algorithm.
For larger values of the exponent m, polynomial growth n^m is practical only for small problems.
A function f(n) has constant running time if the size of the input n bears no effect
on the running time of the algorithm.
Then T(n) = Ω(F(n)) if there exist a natural number n0 and a positive real constant c (which may depend on n0) such that for all natural numbers n larger than n0,
T(n) >= c*F(n).
In simple terms, T(n) = θ(F(n)) means the growth rate of T(n) equals the growth rate of F(n).
Little-o notation implies that the growth rate of function T is strictly less than that of function F.
Example:
T(n) = log(n) and F(n) = n², so T(n) = o(F(n))
If T1 = O(F) and T2 = O(G), where all functions take positive real values, then
(a) T1 + T2 = max(O(F), O(G))
(b) T1 * T2 = O(F * G)
O(G(n)) denotes the set of all functions F(n) such that F(n) <= c * G(n) for all n >= n0, for some constants c > 0 and n0.
Hence, if we say “F(n) is O(G(n))”, this means F(n) is a member of the set of functions O(G(n)).
In general, we can make statements such as: for every c > 0, there is a constant d > 0 such that n² + cn <= d·n² for all n >= 1 (take d = c + 1).
4n³ + 2n² + n
= 4n³ + 2n² + O(n)
= 4n³ + θ(n² + n)
= 4n³ + θ(n²)
= θ(n³)
Some other factors that affect performance are (a) the algorithm used and (b) the kind of input given to the algorithm.
The parameter n, usually referring to the number of data items in the input, affects running time most significantly.
1. Find T(n), the running time of an algorithm in terms of the input size n. It is either given or has to be computed from the given block of code/pseudocode.
2. Use T(n) to perform asymptotic analysis.
We have already discussed step 2, that is, using T(n) to perform asymptotic analysis. Now we discuss step 1: how to find T(n) from a given block of code/pseudocode.
An important step in this process is to review the summation series and revisit some important functions.
Useful exponent laws:
(a^m)^n = a^(mn)
a^m · a^n = a^(m+n)
lg(n!) = θ(n lg n)
This can be proven using Stirling’s approximation for lg(n!).
Important summation formulas:
∑_{i=a}^{b} 1 = b − a + 1
∑_{i=1}^{n} i = n(n+1)/2 ≈ n²/2
∑_{i=1}^{n} i² = n(n+1)(2n+1)/6 ≈ n³/3
∑_{i=1}^{n} i³ = n²(n+1)²/4 ≈ n⁴/4
∑_{i=0}^{∞} x^i = 1/(1 − x), for |x| < 1
Example 1: Find running time T(n) of this block of Pseudocode, taking standard assumptions
for (int i = 0; i < n + 1; i++)      // outer loop runs for i = 0 .. n
{
    sum = sum + i * A[i];            // 2 mathematical operations in the loop
    for (int j = 0; j <= i; j++)     // inner loop runs for j = 0 .. i
    {
        sum = sum + j * A[n - j];    // 2 mathematical operations in the loop
    }
    for (int k = 1; k <= n; k++)     // inner loop runs for k = 1 .. n
    {
        sum = sum * i + A[k] * A[i]; // 3 math operations in the loop
    }
}
Copyright - Design and Analysis of Algorithms, Dr. Debopam Acharya
Calculate Running Time of an Algorithm/Pseudocode
T(n) = ∑_{i=0}^{n} ( 2 + ∑_{j=0}^{i} 2 + ∑_{k=1}^{n} 3 )
T(n) = 4n² + 8n + 4
Example 2: What is the complexity of the following fragment, where T(n) counts the number of additions to the variable ‘sum’ in the fragment?
Given that the complexity of the subroutine mysub is Tmysub(n) = n + 3 in all cases (best, average, and worst case).
Solution: If the code fragment contains a function call, then the time of the call must be added in as well.
T(n) = ∑_{i=0}^{n} ∑_{j=0}^{i} (n + 3) = (n + 3)(n + 1)(n + 2)/2
Algo(A, n)
    mxsum ← 0
    for i ← 1 to n
        do for j ← i to n
            sum ← 0
            for k ← i to j
                do sum ← sum + A[k]
            mxsum ← max(sum, mxsum)
    return mxsum
Counting the executions of the innermost statement of Algo:
T(n) = ∑_{i=1}^{n} ∑_{j=i}^{n} ∑_{k=i}^{j} 1 = n(n+1)(n+2)/6 = θ(n³)
Consecutive statements
Only the maximum (dominant) cost among them is counted:
e.g. a fragment with a single for-loop followed by a double for-loop is O(n²).
If/Else statements
The running time is never more than the running time of the test plus the larger of the running times of S1 and S2.
• Growth of Functions
• Asymptotic Notations
• Running Time Calculations