Asymptotic Notations Notes
Rule 1: Simple loops
• If
a) the size of the loop is n (the loop variable runs
from 0, or some fixed constant, to n) and
b) the body has constant running time (no
nested loops), then the time is O(n)
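Rule 1 can be illustrated with a small runnable sketch in Python (the `ops` counter is added only to make the operation count visible; it is not part of any algorithm above):

```python
def count_single_loop(n):
    # One constant-time basic operation per iteration
    ops = 0
    for i in range(n):
        ops += 1  # constant-time body, no nested loops
    return ops
```

The count equals n exactly, so the running time is O(n).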
Rule 2: Nested loops – the product of the size of the
loops times the running time of the body
• Here, the number of times the inner loop is executed depends on
the value of i
• i = 0, inner loop runs n times
• i = 1, inner loop runs (n-1) times
• i = 2, inner loop runs (n-2) times
• …
• i = n – 2, inner loop runs 2 times
• i = n – 1, inner loop runs once.
• Thus we get: (1 + 2 + … + n) = n*(n+1)/2 = O(n²)
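The dependent nested loop described above can be sketched in Python as follows (assumed shape: the inner loop starts at i, so it runs n - i times; the counter is added only for illustration):

```python
def count_dependent_loops(n):
    # Inner loop's trip count depends on the outer variable i
    ops = 0
    for i in range(n):
        for j in range(i, n):  # runs n - i times
            ops += 1
    return ops
```

For n = 4 the count is 4 + 3 + 2 + 1 = 10 = 4*5/2, matching n*(n+1)/2.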
General rule for nested loops:
• Running time is the product of the size of the loops times the
running time of the body.
Example:
sum = 0;
for( i = 0; i < n; i++)
for( j = 0; j < 2*n; j++)
sum++;
• We have one operation inside the loops, and the product of the
sizes is 2n²
• Hence the running time is O(2n²) = O(n²)
• Note: if the body contains a function call, its running time has
to be taken into consideration
sum = 0;
for( i = 0; i < n; i++)
for( j = 0; j < n; j++)
sum = sum + function(sum);
• Assume that the running time of function(sum) is known to be
O(log n).
• Then the total running time will be O(n² log n)
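As a sketch of this rule, the helper below is a hypothetical stand-in for function(sum) whose cost is about log₂ n steps; counting its steps across the n² loop iterations makes the n²·log n total visible:

```python
def log_cost_function(n):
    # Hypothetical helper with ~log2(n) steps (stand-in for function(sum))
    steps = 0
    m = n
    while m > 1:
        m //= 2
        steps += 1
    return steps

def total_steps(n):
    # Nested n x n loops, each iteration paying the helper's cost
    total = 0
    for i in range(n):
        for j in range(n):
            total += log_cost_function(n)
    return total
```

For n = 8 the helper takes 3 steps, so the total is 8·8·3 = 192, consistent with O(n² log n).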
Rule 3: Consecutive program fragments
• The total running time is the maximum of the running times of
the individual fragments
sum = 0;
for( i = 0; i < n; i++)
sum = sum + i;
sum = 0;
for( i = 0; i < n; i++)
for( j = 0; j < 2*n; j++)
sum++;
• The first loop runs in O(n) time, the second in O(n²) time; the
maximum is O(n²)
Rule 4: If statement
• if C
S1;
else
S2;
• The running time is the maximum of the running times of S1
and S2, plus the time to evaluate the condition C.
Summary
• Steps in analysis of non-recursive algorithms:
• Decide on parameter n indicating input size
• Identify algorithm’s basic operation
• Check whether the number of times the basic operation is
executed depends on some additional property of the input. If so,
determine worst, average, and best case for input of size n
• Count the number of operations using the rules above.
ASYMPTOTIC NOTATIONS
• The main idea of asymptotic analysis is to have a measure of
efficiency of algorithms that doesn’t depend on machine
specific constants, and doesn’t require algorithms to be
implemented and time taken by programs to be compared.
• Asymptotic notations are mathematical tools to represent time
complexity of algorithms for asymptotic analysis.
• The following 3 asymptotic notations are mostly used to
represent time complexity of algorithms.
Big O Notation:
• The Big O notation defines an upper bound of an algorithm; it
bounds a function only from above. For example,
• consider the case of Insertion Sort. It takes linear time in best
case and quadratic time in worst case.
• We can safely say that the time complexity of Insertion Sort is
O(n²). Note that O(n²) also covers linear time.
• If we use Θ notation to represent the time complexity of Insertion
Sort, we have to use two statements for best and worst cases:
1. The worst case time complexity of Insertion Sort is Θ(n²).
2. The best case time complexity of Insertion Sort is Θ(n).
• The Big O notation is useful when we only have upper bound
on time complexity of an algorithm.
• Many times we easily find an upper bound by simply looking
at the algorithm.
• For example, consider the following expression: log n = O(n)
• Let f(n) = log n, g(n) = n
When,
• n=1 f(n)=0 g(n)=1
• n=10 f(n)=3.3 g(n)=10
• n=100 f(n)=6.64 g(n)=100
• n=1000 f(n)=10 g(n)=1000
Hence,
• 0 <= f(n) <= c*g(n) for all n >= n0 (here c = 1, n0 = 1 works)
• f(n) = O(g(n))
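The inequality 0 <= f(n) <= c·g(n) can be spot-checked numerically, as a sketch only; a finite sample like this is not a proof, and the function name below is illustrative:

```python
import math

def big_o_holds(f, g, c, n0, n_max=1000):
    # Spot-check 0 <= f(n) <= c*g(n) for n0 <= n <= n_max (not a proof)
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))
```

For example, `big_o_holds(math.log2, lambda n: n, c=1, n0=1)` passes on the sampled range, consistent with log n = O(n); swapping f and g makes it fail immediately.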
Ω Notation:
• Just as Big O notation provides an asymptotic upper bound on
a function, Ω notation provides an asymptotic lower bound.
• Ω Notation can be useful when we have a lower bound on the
time complexity of an algorithm.
• As discussed earlier, the best case performance of an
algorithm is generally not useful;
• the Omega notation is the least used notation among all three.
• For example, consider the following expression: n = Ω(log n)
• Let f(n) = n, g(n) = log n
• When,
• n=1 f(n)=1 g(n)=0
• n=10 f(n)=10 g(n)=3.3
• n=100 f(n)=100 g(n)=6.64
• n=1000 f(n)=1000 g(n)=10
Hence,
• 0 <= c *g(n) <= f(n) for all n>=n0
• f(n) = Ω(g(n))
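The Ω inequality can be spot-checked the same way (again a finite sample, not a proof; the name is illustrative):

```python
import math

def omega_holds(f, g, c, n0, n_max=1000):
    # Spot-check 0 <= c*g(n) <= f(n) for n0 <= n <= n_max (not a proof)
    return all(0 <= c * g(n) <= f(n) for n in range(n0, n_max + 1))
```

With f(n) = n, g(n) = log n, c = 1, n0 = 1 the check passes, consistent with n = Ω(log n).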
Θ Notation:
• The theta notation bounds a function from above and below, so
it defines exact asymptotic behavior.
• A simple way to get Theta notation of an expression is to drop
low order terms and ignore leading constants.
• For example, consider the following expression: 3n³ + 6n² +
6000 = Θ(n³)
Hence,
• 0 <=c1*g(n) <= f(n) <= c2 *g(n) for all n>=n0
• f(n) =Θ(g(n))
Properties of Big-O Notation
• Fact 1 (Transitivity)
– If f(n)=O(g(n)) and g(n)=O(h(n)), then f(n)=O(h(n))
– Prove it (done in class)
• Fact 2
– If f(n)=O(h(n)) and g(n)=O(y(n)), then
f(n)+g(n) = O(max(h(n), y(n)))
• Fact 5
– If f(n) = c·g(n) for some constant c > 0, then f(n) = O(g(n))
Rate of Growth functions
Exercises
1. Arrange the functions according to their order of growth
O(n), O(2ⁿ), O(n²), O(log n), O(n³), O(n log n), O(1)
2. True or False
3n² log n = Θ(n²)
3.Prove inequalities
3n³ + 6n² + 6000 = Θ(n³)
Exercise
Find Time Complexity of the algorithms
1. for (i = 0 to n-1) do
     for (j = i+1 to n-1)
       if (A[i] == A[j])
         return false
2. Matrix Multiplication
3. for (i = 0; i < n; i = i+2)
     sum = sum + i;
4. while (n > 1)
   {
     count = count + 1;
     n = n / 2;
   }
   return count
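Exercises 3 and 4 can be run directly; the counters in this sketch are added only to expose the iteration counts (about n/2 iterations for exercise 3, about log₂ n for exercise 4):

```python
def exercise3_count(n):
    # for (i = 0; i < n; i = i+2): about n/2 iterations -> O(n)
    ops = 0
    i = 0
    while i < n:
        ops += 1
        i += 2
    return ops

def exercise4_count(n):
    # while (n > 1) { count++; n = n/2 }: about log2(n) iterations -> O(log n)
    count = 0
    while n > 1:
        count += 1
        n //= 2
    return count
```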
Recursion
• Any function that calls itself is a recursive function
• Recursion is used for problems where a large task can be broken
down into repeated subtasks.
Base Case
• The routine will come across a subtask that it can handle without
calling itself; this is known as the base case. It stops the recursion.
Recursive Case
• The routine has to call itself to complete its subtask.
Advantages
• More elegant and requires fewer variables, which makes the
program clean.
• Replaces complex nesting code.
Disadvantages
• Harder to follow the logic of a recursive function
• Debugging is difficult
• Requires more memory
Example
Factorial (n)
{
  If (n == 0)
    Return (1)
  Else
    Return (n * Factorial(n-1))
}
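A runnable Python version of the factorial pseudocode above, as a direct translation:

```python
def factorial(n):
    # Base case: stops the recursion
    if n == 0:
        return 1
    # Recursive case: the routine calls itself on a smaller subtask
    return n * factorial(n - 1)
```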
Recurrence Equation
• An equation that defines a sequence recursively.
• It is normally of the following form:
T(n) = T(n-1) + n if n > 0
T(0) = 0
Deriving Recurrence Equation
Factorial (n)
{
  If (n == 0)
    Return (1)
  Else
    Return (n * Factorial(n-1))
}
• Basic Operation is Multiplication
• M(n)= M(n-1)+1 if n>0
• M(0)=0
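The recurrence M(n) = M(n-1) + 1 with M(0) = 0 solves to M(n) = n, which can be checked by counting multiplications directly (the counting wrapper below is illustrative, not part of the algorithm):

```python
def factorial_with_count(n):
    # Returns (n!, number of multiplications performed)
    if n == 0:
        return 1, 0              # M(0) = 0
    value, mults = factorial_with_count(n - 1)
    return n * value, mults + 1  # M(n) = M(n-1) + 1
```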
Binary Search
// initially called with low = 0, high = N - 1
BinarySearch_Right(A[0..N-1], value, low, high) {
if (high < low)
return low
else
{
mid = low + ((high - low) / 2)
if (A[mid] == value)
return mid
else if (A[mid] > value)
return BinarySearch_Right(A, value, low, mid-1)
else
return BinarySearch_Right(A, value, mid+1, high)
}
}
Recurrence Equation
Basic operation is comparison
C(n)= C(n/2)+1 if n>1
C(1)=1
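A runnable Python translation of the binary search pseudocode above; on a miss it returns the insertion point, as in the original:

```python
def binary_search_right(a, value, low, high):
    # Returns an index of value in a[low..high], or the insertion
    # point if value is absent (mirrors the pseudocode above)
    if high < low:
        return low
    mid = low + (high - low) // 2
    if a[mid] == value:
        return mid
    elif a[mid] > value:
        return binary_search_right(a, value, low, mid - 1)
    else:
        return binary_search_right(a, value, mid + 1, high)
```

Each call halves the range at the cost of one comparison, which is where the recurrence C(n) = C(n/2) + 1 comes from.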