Asymptotic Notations and Complexity Analysis

The document provides an overview of asymptotic notations and complexity analysis in algorithms, covering basic terminology, types of algorithm complexities (worst, average, best case), and various asymptotic notations such as Big-Oh, Big-Omega, and Big-Theta. It includes specific examples and analyses of algorithms like Insertion Sort, highlighting time-space tradeoffs and the efficiency of different algorithmic structures. Additionally, it poses review questions to reinforce understanding of the material presented.

Data Structures

Lecture: Asymptotic Notations & Complexity Analysis

By
Amit Sharma
Asst. Professor,
Lovely Professional University, Punjab
Contents
• Basic Terminology
• Complexity of Algorithm
• Asymptotic Notations
• Review Questions
Basic Terminology
• Algorithm: a finite, step-by-step list of well-defined instructions for solving a particular problem.

• Complexity of Algorithm: a function that gives the running time and/or space requirement in terms of the input size.

• Time and Space are the two major measures of the efficiency of an algorithm.
Algorithm

Specification of Input (e.g. any sequence of natural numbers)
→ Algorithm as a function of Input
→ Specification of Output (e.g. the sequence of sorted natural numbers)
Characteristics of Good Algorithm
• Efficient
  • Running Time
  • Space used

• Efficiency as a function of input size
  • Size of Input
  • Number of Data elements
Time-Space Tradeoff
• By increasing the amount of space for storing the
data, one may be able to reduce the time needed for
processing the data, or vice versa.
Complexity of Algorithm
• Time and Space used by an algorithm M are the two main measures of its efficiency.

• Time is measured by counting the number of key operations.

• Space is measured by counting the maximum memory needed by the algorithm.

• The Complexity of an Algorithm is a function f(n) that gives the running time and/or space requirement of algorithm M in terms of the size n of the input data.

• Worst Case: the maximum value of f(n) over all possible inputs.

• Average Case: the expected (average) value of f(n).

• Best Case: the minimum possible value of f(n).


Analysis of Insertion Sort Algorithm
                                  cost   times
for j ← 2 to n do                 c1     n
    key ← A[j]                    c2     n−1
    i ← j−1                       c3     n−1
    while i > 0 and A[i] > key    c4     Σ_{j=2..n} t_j
        A[i+1] ← A[i]             c5     Σ_{j=2..n} (t_j − 1)
        i ← i−1                   c6     Σ_{j=2..n} (t_j − 1)
    A[i+1] ← key                  c7     n−1

Total Time T(n) = c1·n + (c2 + c3 + c7)(n − 1) + c4·Σ_{j=2..n} t_j + (c5 + c6)·Σ_{j=2..n} (t_j − 1),
where t_j is the number of times the while-loop test is executed for that value of j.
Analysis of Insertion Sort
Total Time T(n) = c1·n + (c2 + c3 + c7)(n − 1) + c4·Σ_{j=2..n} t_j + (c5 + c6)·Σ_{j=2..n} (t_j − 1)

• Best Case: elements are already sorted, so t_j = 1 and the running time is a linear function of n → Θ(n)
• Worst Case: elements are sorted in reverse order, so t_j = j and the running time is a quadratic function of n → Θ(n²)
• Average Case: t_j ≈ j/2 on average; the running time is still quadratic → Θ(n²)
Rate of Growth
• The rate of growth of some standard functions g(n) is:

  log₂ n < n < n log₂ n < n² < n³ < 2ⁿ
Asymptotic Notations
• Goal: to simplify the analysis of running time.

• Useful for identifying how the running time of an algorithm increases with the size of the input, in the limit.

• An asymptote is a line that a curve approaches but never touches.
Asymptotic Notations
Special Classes of Algorithms
• Logarithmic: O(log n)
• Linear: O(n)
• Quadratic: O(n²)
• Polynomial: O(nᵏ), k ≥ 1
• Exponential: O(aⁿ), a > 1
Big-Oh (O) Notation
• Asymptotic upper bound.

• f(n) = O(g(n)) if there exist constants c > 0 and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀.

• f(n) and g(n) are functions over non-negative integers.

• Used for Worst-case analysis.
Big-Oh (O) Notation
• Simple Rule:
Drop lower order terms and constant factors.

Example:
• 50 n log n is O(n log n)
• 8n² log n + 5n² + n is O(n² log n)
Big-Omega (Ω) Notation
• Asymptotic lower bound.

• f(n) = Ω(g(n)) if there exist constants c > 0 and n₀ such that c·g(n) ≤ f(n) for all n ≥ n₀.

• Used to describe Best-case running time.
Big-Theta (Θ) Notation
• Asymptotic tight bound.

• f(n) = Θ(g(n)) if there exist constants c1, c2 and n₀ such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n₀.

• f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)).
Little-Oh (o) Notation
• Non-tight analogue of Big-Oh.

• f(n) = o(g(n)) if for every c > 0 there exists n₀ such that f(n) < c·g(n) for all n ≥ n₀.

• Used for comparisons of running times.


Analysis of Algorithms
• // Here c is a constant
for (int i = 1; i <= c; i++)
{
    // some O(1) expressions
}
• O(1): the loop runs a constant number of times, so the total work is constant.
Reviewing Complexity

#include <iostream>
using namespace std;
int main() {
    int n;
    int c = 4;
    cin >> n;
    int d = c * 5;
}
Reviewing Complexity

#include <iostream>
using namespace std;
int main() {
    int n;
    int c = 4;      // 1
    cin >> n;       // 1
    int d = c * 5;  // 1
}
O(1 + 1 + 1) = O(1), constant time
Reviewing Complexity

int c = 0;
for (int i = 0; i < n; i++) {
    c = c + n;  // O(1) body runs n times
}
O(n)
Reviewing Complexity

for (int i = 1; i < n; i += 2) {
    stmt;
}
The body runs about n/2 times, so the complexity is still O(n).
Reviewing Complexity

int d = 1;
for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) {
        d = d * 6;  // O(1) body runs n × n times
    }
}
O(n²)
Reviewing Complexity

int a = 0, b = 0;
for (i = 0; i < N; i++) {
    a = a + rand();
}
for (j = 0; j < M; j++) {
    b = b + rand();
}

Step counts:
int a = 0;                  // 1
int b = 0;                  // 1
for (i = 0; i < N; i++) {   // N+1 tests
    a = a + rand();         // N
}
for (j = 0; j < M; j++) {   // M+1 tests
    b = b + rand();         // M
}
Total steps: 4 + 2N + 2M = 2(N + M) + 4
Ignoring constants → O(N + M)
Reviewing Complexity

for (int i = 0; i < n; i++) {
    for (int j = 0; j < i; j++) {
        stmt;
    }
}
The inner statement runs 0 + 1 + ... + (n−1) = n(n−1)/2 times → O(n²)
Reviewing Complexity
int a = 0;
for (i = 0; i < N; i++) {
    for (j = N; j > i; j--) {
        a = a + i + j;
    }
}
The inner body runs N + (N−1) + ... + 1 = N(N+1)/2 times → O(N²)
Reviewing Complexity

p = 0;
for (i = 1; p <= n; i++) {
    p = p + i;
}
After k iterations p = 1 + 2 + ... + k = k(k+1)/2; the loop stops when p > n, i.e. after about √(2n) iterations → O(√n)
Reviewing Complexity

// Loop variable multiplied by a constant k each iteration:
for (int i = 1; i < n; i *= k) {
    stmt;
}

// Equivalent while form:
int i = 1;
while (i < n) {
    i *= k;
}
Assume n = 20, k = 4:
i = 1 = 4⁰    1 < 20, continue
i = 4 = 4¹    4 < 20, continue
i = 16 = 4²   16 < 20, continue
i = 64 = 4³   64 ≥ 20, loop stops

After m iterations, i = kᵐ. The loop stops when kᵐ ≥ n.

Taking log on both sides: m·log k ≥ log n

so the loop runs until m ≥ log n / log k = log_k n,

i.e. the number of iterations is about log_k n → O(log n)
Analysis of Algorithms
• for (int i = 1; i <= n; i += c)
  { // some O(1) expressions }

• for (int i = n; i > 0; i -= c)
  { // some O(1) expressions }

• O(n): the time complexity of a loop is O(n) if the loop variable is incremented/decremented by a constant amount.
Analysis of Algorithms
• for (int i = 1; i <=n; i += c)
{
for (int j = 1; j <=n; j += c)
{ // some O(1) expressions
}
}
• for (int i = n; i > 0; i -= c)
{
for (int j = i+1; j <=n; j += c)
{ // some O(1) expressions
} }
• O(n²): the time complexity of nested loops equals the number of times the innermost statement is executed.
Analysis of Algorithms
• for (int i = 1; i <= n; i *= c)
  { // some O(1) expressions }

• for (int i = n; i > 0; i /= c)
  { // some O(1) expressions }

• O(log n): the time complexity of a loop is O(log n) if the loop variable is multiplied/divided by a constant amount.
Analysis of Algorithms
• for (int i = 2; i <= n; i = pow(i, c))
  { // some O(1) expressions }

• // Here fun is sqrt or cuberoot or any other constant root
  for (int i = n; i > 1; i = fun(i))
  { // some O(1) expressions }

• O(log log n): the time complexity of a loop is O(log log n) if the loop variable is increased/decreased exponentially by a constant factor.
Analysis of Algorithms
• for (int i = 2; i*i <= n; i++)
  { // some O(1) expressions }

• O(√n) time complexity.


• for (i = 0; i <= 100; i += 2)
      statement block;

Here the number of iterations is half the loop bound, so the efficiency can be given as f(n) = n/2, which is still O(n).
for (i = 1; i < 1000; i *= 2)
    statement block;

for (i = 1000; i >= 1; i /= 2)
    statement block;

When n = 1000, the number of iterations is about log₂ 1000. Putting this analysis in general terms: the efficiency of loops whose controlling variable is divided or multiplied by a constant can be given as f(n) = log n.
Linear logarithmic loop
for (i = 0; i < 10; i++)
    for (j = 1; j < 10; j *= 2)
        statement block;

The efficiency of such loops can be given as f(n) = n log n.
• Dependent quadratic loop
for (i = 0; i < 10; i++)
    for (j = 0; j <= i; j++)
        statement block;

The efficiency of such code can be given as f(n) = n(n + 1)/2.
Questions
Review Questions
• When is one algorithm said to be better than another?

• Can an algorithm have different running times on different machines?

• How does an algorithm's running time depend on the machine on which it is executed?
Review Questions
Find out the complexity:
function ()
{
    if (condition)
    {
        for (i = 0; i < n; i++) { // simple statements }
    }
    else
    {
        for (j = 1; j < n; j++)
            for (k = n; k > 0; k--) { // simple statement }
    }
}
