
DATA STRUCTURES AND ALGORITHMS

Asymptotic Notations & Order of Growth
Time Complexity of Algorithms
■ When we talk about the time complexity of algorithms, what do we actually mean?
■ It is not the actual running time, because actual running time is not a standard measure:
– It varies from machine to machine and compiler to compiler.
– It even varies on the same machine due to the availability of resources.
Time Complexity of Algorithms
■ We actually calculate an estimate, not the actual running time.
■ Complexity can be viewed as the maximum number of primitive operations that a program may execute.
■ Typical primitive operations are:
– Addition
– Multiplication
– Assignment
– Accessing an array element, etc.
■ We may leave some operations uncounted and concentrate on those that are performed the largest number of times.
Time Complexity of Algorithms
■ We define complexity as a numerical function T(n): time versus the input size n.
■ We want to define the time taken by an algorithm without depending on the implementation details.
■ T_algorithm#1(n) = n(n−1) = n² − n  // quadratic time
■ T_algorithm#2(n) = 2(n−1) = 2n − 2  // linear time
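As a concrete illustration (a minimal Python sketch with hypothetical loop bodies, not taken from the slides), the two counts above can be reproduced by counting primitive operations:

```python
def quadratic_ops(n):
    """Nested loops: the inner statement runs n*(n-1) times."""
    count = 0
    for i in range(n):
        for j in range(n - 1):
            count += 1  # one primitive operation per inner iteration
    return count  # n*(n-1) = n^2 - n

def linear_ops(n):
    """Single loop with two primitive operations per iteration."""
    count = 0
    for i in range(n - 1):
        count += 2  # e.g., one comparison plus one assignment
    return count  # 2*(n-1) = 2n - 2

print(quadratic_ops(10))  # 90 -> 10*9
print(linear_ops(10))     # 18 -> 2*9
```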


Asymptotic Complexity
■ The resulting function gives only an approximate measure of the efficiency of the original function.
■ However, this approximation is sufficiently close to the original.
■ This measure of efficiency is called asymptotic complexity.
Elimination of Lower Order Terms
■ Any terms that do not substantially change the function’s magnitude should be eliminated from the function.
■ Consider a quadratic function such as f(n) = n² + 100n + 1,000.
■ For small values of n, the last term, 1,000, is the largest. But we calculate time complexity to see the behavior on large input sets.
Elimination of Lower Order Terms
■ For large values of n, the n² term dominates, so f(n) ≈ n².
Asymptotic Notations
■ Asymptotic notations are mathematical tools to
represent the complexity of algorithms for asymptotic
analysis.

1. Big-Oh (O) Notation
2. Big-Omega (Ω) Notation
3. (Big) Theta (Θ) Notation
4. Small-Oh (o) Notation
5. Small-Omega (ω) Notation
Big-Oh (O) Notation
■ The Big-O notation defines an upper bound of an algorithm.
■ It bounds a function only from above.
■ It is the least upper bound.
■ It specifically describes the worst case scenario.

Big-Oh (O) Notation
f(n) ≤ c·g(n) for all n ≥ N₀, where c > 0 and N₀ ≥ 1
Let f(n) be the function describing the growth of running time for input size n.
Then there is a function g(n) such that c·g(n) is the least upper bound of f(n).
In other words, after some value N₀, the value of c·g(n) will always be greater than or equal to f(n).
Example
f(n) ≤ c·g(n) for all n ≥ N₀, where c > 0 and N₀ ≥ 1
■ Let f(n) = 3n+2 and g(n) = n. Can we say that f(n) = O(g(n))?
■ Solution:
f(n) ≤ c·g(n)
⇒ 3n+2 ≤ c·n
Now, c can be any number greater than or equal to 4.
⇒ 3n+2 ≤ 4n (which holds for all n ≥ 2, so N₀ = 2)
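A quick numeric check of this bound (a minimal sketch; the tested range of n is an arbitrary choice):

```python
def f(n):
    return 3 * n + 2

def g(n):
    return n

c, N0 = 4, 2

# f(n) <= c*g(n) should hold for every n >= N0
assert all(f(n) <= c * g(n) for n in range(N0, 10_000))
print("3n+2 <= 4n holds for all tested n >= 2")
```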
Big-Omega (Ω) Notation
■ The Big-Ω notation defines a lower bound of an algorithm.
■ It bounds a function only from below.
■ It is the greatest lower bound.
■ It specifically describes the best case scenario.

Big-Omega (Ω) Notation
f(n) ≥ c·g(n) for all n ≥ N₀, where c > 0 and N₀ ≥ 1
Let f(n) be the function describing the growth of running time for input size n.
Then there is a function g(n) such that c·g(n) is the greatest lower bound of f(n).
In other words, after some value N₀, the value of c·g(n) will always be less than or equal to f(n).
Example
f(n) ≥ c·g(n) for all n ≥ N₀, where c > 0 and N₀ ≥ 1
■ Let f(n) = 3n+2 and g(n) = n. Can we say that f(n) = Ω(g(n))?
■ Solution:
f(n) ≥ c·g(n)
⇒ 3n+2 ≥ c·n
Now, c can be any number smaller than or equal to 3.
⇒ 3n+2 ≥ 3n (which holds for all n ≥ 1, so N₀ = 1)
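The same kind of numeric check works for the lower bound; as an illustrative sketch, c = 3 succeeds while a larger constant such as c = 4 eventually fails:

```python
def f(n):
    return 3 * n + 2

# c = 3: 3n+2 >= 3n holds for every n >= 1
assert all(f(n) >= 3 * n for n in range(1, 10_000))

# c = 4: 3n+2 >= 4n fails once n > 2, so 4 is too large a constant
assert not all(f(n) >= 4 * n for n in range(1, 10_000))
print("f(n) = Omega(n) with c = 3, N0 = 1")
```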
(Big)-Theta (Θ) Notation
■ The Theta (Θ) notation defines both an upper and a lower bound of an algorithm.
■ It bounds a function both from above and from below.
■ It is also called the tight bound.
■ It is used when the same function serves as both the lower and the upper bound.

(Big)-Theta (Θ) Notation
c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ N₀, where c₁, c₂ > 0 and N₀ ≥ 1
Let f(n) be the function describing the growth of running time for input size n.
Then there is a function g(n) such that c₁·g(n) is the greatest lower bound and c₂·g(n) is the least upper bound of f(n).
In other words, after some value N₀, the value of c₁·g(n) will always be less than or equal to f(n), and the value of c₂·g(n) will always be greater than or equal to f(n).
Example
c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ N₀, where c₁, c₂ > 0 and N₀ ≥ 1
■ Let f(n) = 3n+2 and g(n) = n. Can we say that f(n) = Θ(g(n))?
■ Solution:
c₁·g(n) ≤ f(n) ≤ c₂·g(n)
From the previous two examples, it is obvious that:
3n ≤ 3n+2 ≤ 4n
where N₀ = 2.
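Both bounds can be checked together (again a small illustrative sketch):

```python
def f(n):
    return 3 * n + 2

c1, c2, N0 = 3, 4, 2

# c1*g(n) <= f(n) <= c2*g(n) should hold for every n >= N0
assert all(c1 * n <= f(n) <= c2 * n for n in range(N0, 10_000))
print("3n <= 3n+2 <= 4n for all tested n >= 2, so f(n) = Theta(n)")
```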
Small-Oh (o) Notation
■ The small-o notation also defines an upper bound of an algorithm.
■ It bounds a function only from above.
■ It is just an upper bound, not necessarily a tight one.
■ Big-Oh is the least upper bound, while small-o is just any upper bound.
Small-Omega (ω) Notation
■ The small-ω notation also defines a lower bound of an algorithm.
■ It bounds a function only from below.
■ It is just a lower bound, not necessarily a tight one.
■ Big-Omega is the greatest lower bound, while small-ω is just any lower bound.
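One standard way to separate the strict bounds from the tight ones is the limit of the ratio f(n)/g(n): it tends to 0 when f(n) = o(g(n)) and grows without bound when f(n) = ω(g(n)). A minimal numeric sketch (the comparison functions are chosen only for illustration):

```python
import math

f = lambda n: 3 * n + 2

# f(n) = o(n^2): the ratio f(n)/n^2 tends to 0 as n grows
print([f(n) / n**2 for n in (10, 1_000, 100_000)])

# f(n) = omega(log n): the ratio f(n)/log2(n) tends to infinity
print([f(n) / math.log2(n) for n in (10, 1_000, 100_000)])
```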
Asymptotic Notations
[Figure: curves of the asymptotic bounds plotted against time, with labels including Ω(n) and ω(n)]
Growth of functions
Usually, the growth of functions can be categorized into the following (a comparison sketch follows the list):
– Constant time
– Logarithmic time
– Linear time
– N-logarithmic (n log n) time
– Polynomial time
– Exponential time
– Factorial time
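To get a rough feel for these categories, the sketch below (illustrative values and input sizes only) tabulates a representative function from each category:

```python
import math

growth = [
    ("constant",    lambda n: 1),
    ("logarithmic", lambda n: math.log2(n)),
    ("linear",      lambda n: n),
    ("n log n",     lambda n: n * math.log2(n)),
    ("quadratic",   lambda n: n ** 2),
    ("exponential", lambda n: 2 ** n),
    ("factorial",   lambda n: math.factorial(n)),
]

# print each growth function evaluated at n = 2, 8, 16
for name, fn in growth:
    row = ", ".join(f"{fn(n):.0f}" for n in (2, 8, 16))
    print(f"{name:>12}: {row}")
```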
Remarks
■ Most of the time, we are concerned with the worst case complexity of algorithms.
■ That is, we want to place the algorithm in the appropriate category: for any input, what is the maximum time required to transform the input into the output?
■ So, most of the time, we will use the Big-Oh notation.
Input Cases & Analysis Cases
■ There are three types of input cases:
– Best Case
– Worst Case
– Arbitrary/Random Case
■ There are three types of analysis cases:
– Best Case Analysis
– Worst Case Analysis
– Average Case Analysis
Example
■ Let’s take the example of Linear (Sequential) Search.
■ It scans the array elements one by one and searches for the required value.
■ If there are n elements in the array, and key is the value to be searched, then its algorithm may be roughly outlined as follows (a runnable sketch appears after the outline):
found = false
For i = 1 to n
    If Array[i] = key
        found = true & stop
    End if
End for
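A minimal runnable version of the outline above (in Python; the function name and the return-the-index convention are our own choices):

```python
def linear_search(array, key):
    """Scan the array one element at a time; return the index of key, or -1."""
    for i, value in enumerate(array):
        if value == key:
            return i  # found: stop immediately
    return -1  # key not present

print(linear_search([7, 3, 9, 1], 9))   # 2
print(linear_search([7, 3, 9, 1], 42))  # -1
```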
Example
■ There are three types of input cases (for an array num_1, num_2, num_3, …, num_n):
– Best Case: key = num_1
– Worst Case: key = num_n, or key not found
– Arbitrary/Random Case: key = any number
■ There are three types of analysis cases:
– Best Case Analysis: 1 comparison = Ω(1)
– Worst Case Analysis: n comparisons = O(n)
– Average Case Analysis: ≈ n/2 + 1 comparisons = O(n)
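As an empirical illustration of the average case (a sketch; the array size and trial count are arbitrary), searching for a uniformly random element takes about n/2 comparisons on average:

```python
import random

def comparisons(array, key):
    """Count comparisons made by linear search before stopping."""
    count = 0
    for value in array:
        count += 1
        if value == key:
            break
    return count

n, trials = 1_000, 10_000
array = list(range(n))
avg = sum(comparisons(array, random.randrange(n)) for _ in range(trials)) / trials
print(f"average comparisons ≈ {avg:.1f} (n/2 = {n / 2})")
```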
End of Lecture

THANK YOU
