The document discusses asymptotic notations that are used to analyze algorithms as the input size increases. It describes the main types of asymptotic notations - Big O, Big Omega, Big Theta, little o, and little omega - and provides their mathematical definitions and graphical examples. It also discusses best case, worst case, and average case time complexities, and uses searching an array as an example to illustrate analyzing algorithms based on these cases.

Assignment No. 1

Submitted to :
Myym Sidra
Submitted by :
Ayesha Tahir
Roll no :
2293
Subject :
Data Structure and Algorithm
Topic:
Asymptotic Notations: Cases and Types
Section:
Morning B
Session :
2022-2026

Department of Computer Science


GCWUF

Asymptotic Notations
Asymptotic notations give us an idea of how good a given algorithm is
compared to some other algorithm.

Let us now see the theoretical definition of "order of".

There are five widely used asymptotic notations:

1. Big O notation
2. Big Omega (Ω) notation
3. Big Theta (Θ) notation
4. Little o notation
5. Little omega (ω) notation

Big O notation:
Big O notation is used to describe an asymptotic upper bound.

Mathematically, if f(n) describes the running time of an algorithm:

f(n) is O(g(n)) if there exist positive constants c and n0 such that

0 ≤ f(n) ≤ c·g(n) for all n ≥ n0

If a function is O(n), it is automatically O(n²) as well.
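As an illustrative sketch (not part of the original assignment), the Big O definition can be checked numerically for a sample function, say f(n) = 3n + 2, using the witnesses c = 4 and n0 = 2, since 3n + 2 ≤ 4n whenever n ≥ 2:

```python
def f(n):
    # Hypothetical running-time function, used only for illustration.
    return 3 * n + 2

def is_big_o_witness(f, g, c, n0, upto=1000):
    """Check 0 <= f(n) <= c*g(n) for every n in [n0, upto].

    A finite check cannot prove Big O, but it can confirm that the
    chosen constants work over a large range (or expose a counterexample).
    """
    return all(0 <= f(n) <= c * g(n) for n in range(n0, upto + 1))

print(is_big_o_witness(f, lambda n: n, c=4, n0=2))      # True: f(n) is O(n)
print(is_big_o_witness(f, lambda n: n * n, c=4, n0=2))  # True: also O(n^2)
```

The second call illustrates the remark above: a function that is O(n) is automatically O(n²) as well, because n² dominates n.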

Graphic example for Big O

Big Omega notation:

Just as Big O notation provides an asymptotic upper bound, Big
Omega notation provides an asymptotic lower bound. Let f(n) define the
running time of an algorithm.

f(n) is said to be Ω(g(n)) if there exist positive constants c and n0 such that

0 ≤ c·g(n) ≤ f(n) for all n ≥ n0

If a function is Ω(n²), it is automatically Ω(n) as well.

Graphic example of Big omega

Big Theta notation:

Let f(n) define the running time of an algorithm.
f(n) is said to be Θ(g(n)) if f(n) is O(g(n)) and f(n) is Ω(g(n)).
Mathematically:

0 ≤ c1·g(n) ≤ f(n) for all n ≥ n0
0 ≤ f(n) ≤ c2·g(n) for all n ≥ n0

Merging both the equations, we get

0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0

The equation simply means there exist positive constants c1
and c2 such that f(n) is sandwiched between c1·g(n) and c2·g(n).
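To make the sandwich concrete, here is a minimal sketch (using an assumed sample function f(n) = 3n + 2, not taken from the text) that checks the Theta witnesses c1 = 3, c2 = 4, n0 = 2 over a finite range:

```python
def f(n):
    # Hypothetical running-time function, used only for illustration.
    return 3 * n + 2

def is_theta_witness(f, g, c1, c2, n0, upto=1000):
    """Check 0 <= c1*g(n) <= f(n) <= c2*g(n) for every n in [n0, upto]."""
    return all(0 <= c1 * g(n) <= f(n) <= c2 * g(n)
               for n in range(n0, upto + 1))

# f(n) = 3n + 2 is sandwiched between 3n and 4n once n >= 2,
# so f(n) is Theta(n).
print(is_theta_witness(f, lambda n: n, c1=3, c2=4, n0=2))  # True
```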

Graphic examples of big theta

Little o notation
Besides the Big O, Big Omega, and Big Theta notations, there are some
other notations; the little o notation is one of them.

Little o notation is used to describe an upper bound that cannot
be tight; in other words, a loose upper bound on f(n).

Let f(n) and g(n) be functions that map positive integers to positive real
numbers. We say that f(n) is o(g(n)) if for any real positive constant c,
there exists an integer constant n0 ≥ 1 such that 0 ≤ f(n) < c·g(n) for
every integer n ≥ n0.

Example of little o asymptotic notation

If f(n) = n² and g(n) = n³, check whether f(n) = o(g(n)) or not. Since
lim(n→∞) n²/n³ = lim(n→∞) 1/n = 0, the ratio falls below any positive
constant c eventually, so f(n) = o(g(n)).
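A quick numeric sketch (illustrative only, not from the original assignment) shows the ratio n²/n³ = 1/n shrinking toward 0, which is exactly what little o requires:

```python
def ratio(n):
    """f(n)/g(n) for f(n) = n^2 and g(n) = n^3; simplifies to 1/n."""
    return (n ** 2) / (n ** 3)

# The ratio tends to 0, so n^2 is o(n^3): for any c > 0 there is
# an n0 with n^2 < c * n^3 for all n >= n0.
for n in (10, 100, 1000, 10 ** 6):
    print(n, ratio(n))
```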

Little ω asymptotic notation
Definition: Let f(n) and g(n) be functions that map positive integers to
positive real numbers. We say that f(n) is ω(g(n)) (or f(n) ∈ ω(g(n))) if for any
real constant c > 0, there exists an integer constant n0 ≥ 1 such that
f(n) > c·g(n) ≥ 0 for every integer n ≥ n0.

Here f(n) has a strictly higher growth rate than g(n), so the main difference
between Big Omega (Ω) and little omega (ω) lies in their definitions. In the
case of Big Omega, f(n) = Ω(g(n)) means the bound 0 ≤ c·g(n) ≤ f(n) holds,
but in the case of little omega the strict bound 0 ≤ c·g(n) < f(n) must hold
for every constant c.

The relationship between Big Omega (Ω) and little omega (ω) is similar to
that of Big O and little o, except that now we are looking at lower bounds.
Little omega (ω) denotes a lower bound that is not asymptotically tight,
whereas Big Omega (Ω) may represent the exact order of growth. Moreover,
f(n) ∈ ω(g(n)) if and only if g(n) ∈ o(f(n)).

In mathematical relation, if f(n) ∈ ω(g(n)) then

lim(n→∞) f(n)/g(n) = ∞
Example:

Prove that 4n + 6 ∈ ω(1).
The little omega (ω) bound can be proven by applying the limit formula
given below:

if lim(n→∞) f(n)/g(n) = ∞, then f(n) is ω(g(n)).

Here we have f(n) = 4n + 6 and g(n) = 1, so f(n)/g(n) = 4n + 6 → ∞ as
n → ∞; hence 4n + 6 ∈ ω(1).
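The same limit can be seen numerically; a small sketch (illustrative only) shows f(n)/g(n) = (4n + 6)/1 growing without bound:

```python
def f(n):
    return 4 * n + 6

def g(n):
    return 1

# The ratio exceeds any fixed constant c eventually, which is exactly
# the little-omega condition f(n) > c*g(n) for all large n.
for n in (1, 10, 100, 10 ** 4):
    print(n, f(n) / g(n))
```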

Cases
1. Best Case
2. Worst Case
3. Average Case

Sometimes we get lucky in life: exams cancelled when you were not
prepared, a surprise test when you were prepared, etc. → best case. Sometimes
we get unlucky: questions you never prepared asked in the exam, rain during
the sports period, etc. → worst case. But overall, life remains balanced with a
mixture of lucky and unlucky times → average (expected) case. Let us now
analyze a search algorithm in these terms.

Consider

1  7  18  25  28  280

an array which is sorted in increasing order.

We have to search for a given number in this array and report whether it is
present in the array or not.

Algo 1 → Start from the first element and scan until an element greater than
or equal to the number being searched is found.

Algo 2 → Check whether the first or the last element is equal to the number. If
not, find the middle element between these two (the centre of the array). If the
centre element is greater than the number being searched, repeat the process
for the first half; else repeat for the second half, until the number is found.
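The two algorithms above can be sketched in Python (a minimal illustration; the array values are assumed to match the sorted example array):

```python
def algo1_linear(arr, x):
    """Algo 1: scan from the first element until a value >= x appears."""
    for v in arr:
        if v >= x:
            return v == x
    return False

def algo2_binary(arr, x):
    """Algo 2: repeatedly halve the sorted array (binary search)."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == x:
            return True
        if arr[mid] > x:
            hi = mid - 1      # repeat for the first half
        else:
            lo = mid + 1      # repeat for the second half
    return False

arr = [1, 7, 18, 25, 28, 280]  # assumed sorted example array
print(algo1_linear(arr, 25), algo2_binary(arr, 25))  # True True
print(algo1_linear(arr, 9), algo2_binary(arr, 9))    # False False
```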

Analyzing Algo 1: If we get really lucky, the first element of the array
might turn out to be the element we are searching for; hence we make
just one comparison.

Best case complexity = O(1)

If we are really unlucky, the element we are searching for might be
the last one.

Worst case complexity = O(n)

For calculating the average case time, we sum the running times of all the
possible cases and divide by the total number of cases.

Sometimes the calculation of average case time gets very complicated.

Analyzing Algo 2: If we get really lucky, the first element will be the
only one that gets compared.

Best case complexity = O(1)

If we get unlucky, we will have to keep dividing the array into halves
until we get a single element (the array gets exhausted).

Worst case complexity = O(log n)

What is log(n)?

log(n) → the number of times you need to halve an array of size n
before it gets exhausted.

log 8 = 3:  8/2 → 4/2 → 2/2 → 1   (3 halvings)
log 4 = 2:  4/2 → 2/2 → 1          (2 halvings)

log n simply means how many times I need to divide n units into halves
until they cannot be divided anymore.
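This halving count can be sketched directly (a minimal illustration, assuming integer halving):

```python
def halvings(n):
    """Count how many times n can be halved before it is exhausted (reaches 1)."""
    count = 0
    while n > 1:
        n //= 2
        count += 1
    return count

print(halvings(8))  # 3, matching log2(8) = 3
print(halvings(4))  # 2, matching log2(4) = 2
```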

Space Complexity

Time is not the only thing we worry about while analyzing algorithms; space is
equally important.

Creating an array of size n → O(n) space, i.e., the size of the input.
