Data Structures

UCS301

Asymptotic Notations

Department of CSE
Thapar Institute of Engineering and Technology, Patiala
Algorithm and its complexity

• An algorithm is a sequence of instructions that is executed to perform a meaningful task.
• The efficiency (complexity) of an algorithm is analyzed in terms of
  – CPU time and
  – memory.

Complexity of Algorithms
• Time Complexity – the amount of computational time required by an algorithm to perform its complete task.
• Space Complexity – the amount of memory required by an algorithm to complete its execution.
3 cases to analyze an algorithm
1) Worst Case Analysis (usually done)
   – We calculate an upper bound on the running time of an algorithm.
   – We must know the case that causes the maximum number of operations to be executed.

2) Best Case Analysis (bogus)
   – We calculate a lower bound on the running time of an algorithm.
   – We must know the case that causes the minimum number of operations to be executed.

3) Average Case Analysis (sometimes done)
   – We take all possible inputs and calculate the computing time for each of them.
   – Sum all the calculated values and divide the sum by the total number of inputs. We must know (or predict) the distribution of cases.
Asymptotic Notation
• Asymptotic notations are the mathematical tools used to represent the time complexity of algorithms for asymptotic analysis.
• They evaluate the performance of an algorithm in terms of the input size.
• The notations we use to describe the asymptotic running time of an algorithm are defined in terms of functions whose domain is the set of natural numbers N = {0, 1, 2, …}.
• They also describe the behaviour of time or space complexity for large instance characteristics.
• They capture how the time (or space) taken by an algorithm increases with the input size.
• There are three asymptotic notations:
  Big-oh (O), Omega (Ω), Theta (Θ)
Asymptotic Notation: Big-oh Notation (O)
• Asymptotic upper bound, used for worst-case analysis.
• Let f(n) and g(n) be functions over non-negative integers.
• If there exist constants c > 0 and n0 such that
  f(n) ≤ c g(n) for all n ≥ n0,
  then we can write f(n) = O(g(n)).
• Example: f(n) = 2n + 1. Taking g(n) = n and c = 3,
  f(n) ≤ 3n for all n ≥ 1,
  so we can say f(n) = O(n) with c = 3 and n0 = 1.
Example: Big-oh Notation (O)

• Consider f(n) = 2n+2, g(n) = n2.


• Find some constant c such that f(n) ≤ g(n)
• For n=1, f(n) = 2*1+2 = 4, g(n) = 1 - > f(n)>g(n)
• For n=2, f(n) = 2*2+2 = 6, g(n) = 4 -> f(n)>g(n)
• For n=3, f(n) = 2*3+2 = 8, g(n) = 9 -> f(n)<g(n)

• Thus for n > 2, f(n) < g(n) ->always an upper bound.


Example: Big-Oh
Show that n²/2 – 3n = O(n²).
• Determine positive constants c1 and n0 such that
  n²/2 – 3n ≤ c1 n² for all n ≥ n0.
• Dividing by n²:
  1/2 – 3/n ≤ c1
• For:
  n = 1: 1/2 – 3/1 ≤ c1 (holds for c1 ≥ 1/2)
  n = 2: 1/2 – 3/2 ≤ c1 (holds, and so on…)

• The inequality holds for any n ≥ 1 and c1 ≥ 1/2.

• Thus, by choosing c1 = 1/2 and n0 = 1, one can verify that n²/2 – 3n = O(n²) holds.
Asymptotic Notation: Omega Notation (Ω)
• Asymptotic lower bound, used to describe best-case running times.
• Let f(n) and g(n) be functions over non-negative integers.
• If there exist constants c > 0 and n0 such that
  c g(n) ≤ f(n) for all n ≥ n0,
  then we can write f(n) = Ω(g(n)).
• Example: f(n) = 18n + 9. Taking g(n) = n and c = 18,
  f(n) ≥ 18n for all n ≥ 0,
  so we can say f(n) = Ω(n) with c = 18.
Example: Big-Omega
Show that n²/2 – 3n = Ω(n²).
• Determine positive constants c1 and n0 such that
  c1 n² ≤ n²/2 – 3n for all n ≥ n0.
• Dividing by n²:
  c1 ≤ 1/2 – 3/n
• For:
  n = 1: c1 ≤ 1/2 – 3/1 (does not hold)
  n = 2: c1 ≤ 1/2 – 3/2 (does not hold)
  n = 3: c1 ≤ 1/2 – 3/3 (does not hold)
  n = 4: c1 ≤ 1/2 – 3/4 (does not hold)
  n = 5: c1 ≤ 1/2 – 3/5 (does not hold)
  n = 6: c1 ≤ 1/2 – 3/6 (does not hold; the right side equals zero)
  n = 7: c1 ≤ 1/2 – 3/7, i.e. c1 ≤ (7 – 6)/14 = 1/14 (holds for 0 < c1 ≤ 1/14)
• The inequality holds for any n ≥ 7 and 0 < c1 ≤ 1/14.
• Thus, by choosing c1 = 1/14 and n0 = 7, one can verify that n²/2 – 3n = Ω(n²) holds.
Asymptotic Notation: Omega Notation (Ω)

• Consider f(n) = 2n² + 5 and g(n) = 7n. Find some constant c such that f(n) ≥ c g(n) (here c = 1).
• For n = 0: f(0) = 0 + 5 = 5, g(0) = 0  →  f(n) > g(n)
• For n = 1: f(1) = 2·1·1 + 5 = 7, g(1) = 7  →  f(n) = g(n)
• For n = 3: f(3) = 2·3·3 + 5 = 23, g(3) = 21  →  f(n) > g(n)
• Thus for n ≥ 3, f(n) ≥ g(n), so g(n) = 7n is a lower bound: f(n) = Ω(n).

Asymptotic Notation: Theta Notation (Θ)
• Asymptotically tight bound, used for average-case analysis.
• Let f(n) and g(n) be functions over non-negative integers.
• If there exist constants c1 > 0, c2 > 0, and n0 such that
  c1 g(n) ≤ f(n) ≤ c2 g(n) for all n ≥ n0,
  then f(n) = Θ(g(n)).
• f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).
• Example: f(n) = 18n + 9 with g(n) = n, c1 = 18, c2 = 27:
  f(n) ≥ 18n gives f(n) = Ω(n), and f(n) ≤ 27n for n ≥ 1 gives f(n) = O(n),
  i.e. f(n) = Θ(n).
Example: Theta
• Show that n²/2 – 3n = Θ(n²).
• Determine positive constants c1, c2, and n0 such that
  c1 n² ≤ n²/2 – 3n ≤ c2 n² for all n ≥ n0.
• Dividing by n²:
  c1 ≤ 1/2 – 3/n ≤ c2

• The right-hand inequality holds for any n ≥ 1 and c2 ≥ 1/2.

• The left-hand inequality holds for any n ≥ 7 and 0 < c1 ≤ 1/14.

• Thus, by choosing c1 = 1/14, c2 = 1/2, and n0 = 7, one can verify that n²/2 – 3n = Θ(n²) holds.
o-Notation

• o-notation denotes an upper bound that is not asymptotically tight.
• Formally, o(g(n)) ("little-oh of g of n") is defined as the set
  o(g(n)) = { f(n) : for any constant c > 0, there exists a constant n0 > 0 such that 0 ≤ f(n) < c g(n) for all n ≥ n0 }.
• For example, 2n = o(n²), but 2n² ≠ o(n²).
• Intuitively, in o-notation the function f(n) becomes insignificant relative to g(n) as n approaches infinity; that is,
  lim (n→∞) f(n)/g(n) = 0.
ω-Notation
• ω-notation denotes a lower bound that is not asymptotically tight.
• One way to define it: f(n) ∈ ω(g(n)) if and only if g(n) ∈ o(f(n)).
• Formally, ω(g(n)) ("little-omega of g of n") is defined as the set
  ω(g(n)) = { f(n) : for any constant c > 0, there exists a constant n0 > 0 such that 0 ≤ c g(n) < f(n) for all n ≥ n0 }.
• For example, n²/2 = ω(n), but n²/2 ≠ ω(n²).
• The relation f(n) ∈ ω(g(n)) implies that lim (n→∞) f(n)/g(n) = ∞, if the limit exists.
Precedence of the complexity
• Common complexities in increasing order of growth:
  O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ) < O(n!)
Frequency Count
• The complexity of an algorithm is calculated using the frequency count of each statement.
• The frequency count is the number of times a statement is executed in the program with respect to the input.

Construct   | Code                           | FC
Statement   | i = 0;  a = b + c;             | 1 per statement
If–else     | if (a < b) { statements; }     | count of the larger block
            | else { statements; }           |
for         | for (i = 1; i <= n; i++) { }   | n + 1
while       | i = 1;                         | n + 1
            | while (i <= n) { }             |
do–while    | do { }                         | n + 1
            | while (i <= n);                |
Frequency Count
Example 1

Code                       | FC    | Reason
for (i = 1; i <= n; i++)   | n + 1 | the condition is true n times, plus 1 final false test
{                          | –     | ignore
  x = x + 1;               | n     | executes once per true condition, i.e. n times
}                          | –     | ignore

Total F(n) = (n + 1) + n = 2n + 1, so TC = O(n)

Trace of for (i = 1; i <= 5; i++) { x = x + 1; }:
i = 1 → x = x + 1, i = i + 1 → i = 2
i = 2 → x = x + 1, i = i + 1 → i = 3
i = 3 → x = x + 1, i = i + 1 → i = 4
i = 4 → x = x + 1, i = i + 1 → i = 5
i = 5 → x = x + 1, i = i + 1 → i = 6
i = 6 fails the condition, and the loop exits
Frequency Count
Example 2

Code                       | FC       | Reason
for (i = 1; i <= n; i++)   | n + 1    | true n times, plus 1 final false test
{                          | –        | ignore
  for (j = 1; j <= m; j++) | n(m + 1) | m + 1 tests per true iteration of the outer loop, which is true n times
  {                        | –        | ignore
    x = x + 1;             | nm       | the inner loop is true m times per true outer iteration, and the outer loop is true n times
  }                        | –        | ignore
}                          | –        | ignore

Total F(n) = (n + 1) + n(m + 1) + nm
           = n + 1 + nm + n + nm
           = 2nm + 2n + 1, so TC = O(mn)
Example 3
A(n)
{
  for (i = 1; i <= n; i++)      n
    for (j = 1; j <= n; j++)    n
      printf("hi");
}

Total time complexity = n · n = O(n²)


Example 4

for (i = 1; i <= n; i++)          n
{
  for (j = 1; j <= n; j++)        n
  {
    for (k = 1; k <= n; k++)      n
    {
      a = a * b + c;
    }
  }
}

Total time complexity = O(n³)


Example 5
A(n)
{
  for (i = 1; i < n; i = i * 2)
    printf("hi");
}

i takes the values 1, 2, 4, 8, …, i.e. the powers of two 2^0, 2^1, 2^2, 2^3, …, 2^k, so the loop runs until 2^k reaches n.

2^k = n ⇒ k = log₂n, so T.C. = O(log₂n)


Example 6
A(n)
{
  while (n > 1)
  {
    n = n / 2;
  }
}

Assume n ≥ 2.

n =           2   4   8   …   2^k
loop runs     1   2   3   …   k    times

2^k = n ⇒ k = log₂n, so T.C. = O(log₂n)


Example 7
A(n)
{
  int i, j, k;
  for (i = n/2; i <= n; i++)           n/2
    for (j = 1; j <= n/2; j++)         n/2
      for (k = 1; k <= n; k = k * 2)   log₂n
        printf("hi");
}
Time complexity = n/2 · n/2 · log₂n
                = O(n² log₂n)
Example 8
A(n)
{
  int i, j, k;
  for (i = n/2; i <= n; i++)           n/2
    for (j = 1; j <= n/2; j = 2 * j)   log₂n
      for (k = 1; k <= n; k = k * 2)   log₂n
        printf("hi");
}

Time complexity = n/2 · log₂n · log₂n
                = O(n (log₂n)²)
Example 9
A(n)
{
  int i, j, k, n;
  for (i = 1; i <= n; i++)
    for (j = 1; j <= i²; j++)
      for (k = 1; k <= n/2; k++)
        printf("hi");
}

For each i, the j loop runs i² times, and each j iteration runs the k loop n/2 times:
i = 1 → n/2 · 1 times, i = 2 → n/2 · 4 times, i = 3 → n/2 · 9 times, …, i = n → n/2 · n² times.

Total T.C. = n/2·1 + n/2·4 + n/2·9 + … + n/2·n²
           = n/2 (1 + 4 + 9 + … + n²)
           = n/2 · n(n+1)(2n+1)/6
           = O(n⁴)
Example 10
Update A so that A[i] = A[0] + A[1] + … + A[i].

Algorithm arrayElementSum(A, N)
Input: An array A containing N integers.
Output: The updated array A containing N integers.

Option 1:
1. for i = 1 to N – 1 do
2.   sum = 0
3.   for j = 0 to i do
4.     sum = sum + A[j]
5.   A[i] = sum

Option 2:
1. for i = 1 to N – 1 do
2.   A[i] = A[i] + A[i – 1]

Option 2 is better.
Contd…

Option 1:                       Cost   Frequency
1. for i = 1 to N – 1 do        c1     N
2.   sum = 0                    c2     N – 1
3.   for j = 0 to i do          c3     Σ(i = 1 to N – 1) (i + 2)
4.     sum = sum + A[j]         c4     Σ(i = 1 to N – 1) (i + 1)
5.   A[i] = sum                 c5     N – 1
→ total running time is O(N²)

Option 2:                       Cost   Frequency
1. for i = 1 to N – 1 do        c1     N
2.   A[i] = A[i] + A[i – 1]     c2     N – 1
→ total running time is O(N)


Example 11: Insertion Sort

Start:     6 3 9 1 8
temp = 3:  6 3 9 1 8 → 6 6 9 1 8 → 3 6 9 1 8
temp = 9:  3 6 9 1 8 (9 is already in place)
temp = 1:  3 6 9 1 8 → 3 6 9 9 8 → 3 6 6 9 8 → 3 3 6 9 8 → 1 3 6 9 8
temp = 8:  1 3 6 9 8 → 1 3 6 9 9 → 1 3 6 8 9
Algorithm insertionSort(A, N)
Input: An array A containing N elements.
Output: The elements of A get sorted in increasing order.

1. for i = 1 to N – 1                     c1   N
2.   temp = A[i]                          c2   N – 1
3.   j = i                                c3   N – 1
4.   while j > 0 and A[j – 1] > temp      c4   Σ(i = 1 to N – 1) t_i
5.     A[j] = A[j – 1]                    c5   Σ(i = 1 to N – 1) (t_i – 1)
6.     j = j – 1                          c6   Σ(i = 1 to N – 1) (t_i – 1)
7.   A[j] = temp                          c7   N – 1

where t_i is the number of times the while-loop test is executed for that value of i.


Contd…

Best case: the array is already sorted, so the while-loop test fails immediately (t_i = 1 for every i) and the running time is O(N).

Worst case: the array is in reverse sorted order, so each element is compared against all earlier elements (t_i = i + 1) and the running time is O(N²).
