Lecture-4 IT-303 Complexity Analysis


Topic- Complexity Analysis

Subject Name/Code- Data Structure/IT 303


Department of IT
Complexity of an Algorithm
1. It is a function describing the efficiency of an algorithm in terms of the input size.
2. The efficiency of an algorithm is mainly defined by two factors: space and time.
3. A good algorithm is one that takes less time and uses less space.
4. There is often a trade-off between time and space (see the sketch below).
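
For illustration, here is a minimal Python sketch of the time-space trade-off (not part of
the original notes; the function names are hypothetical): storing already-computed results
uses extra space but avoids repeated work.

# Sketch: trading space for time via memoization, illustrative only.

def fib_slow(n):
    """No extra space beyond the call stack, but exponential time."""
    if n < 2:
        return n
    return fib_slow(n - 1) + fib_slow(n - 2)

def fib_fast(n, memo=None):
    """Uses O(n) extra space for the memo table, but only O(n) time."""
    if memo is None:
        memo = {}
    if n < 2:
        return n
    if n not in memo:
        memo[n] = fib_fast(n - 1, memo) + fib_fast(n - 2, memo)
    return memo[n]

print(fib_slow(20), fib_fast(20))  # both print 6765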

A. Space Complexity
• Space Complexity of an algorithm denotes the total space used or needed by the
algorithm to run, as a function of the input size.
• Whenever you create a variable, your algorithm needs some space to run (see the
sketch after this list).
• All the space required for the algorithm is collectively called the Space
Complexity of the algorithm.
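
A minimal sketch (illustrative, not from the original notes): both functions below compute
the sum of a list, but the first uses constant extra space while the second builds an
auxiliary list that grows with the input.

# Sketch: constant vs. linear auxiliary space, illustrative only.

def sum_constant_space(values):
    """O(1) extra space: only one accumulator variable is created."""
    total = 0
    for v in values:
        total += v
    return total

def sum_linear_space(values):
    """O(n) extra space: a list of running prefix sums is stored."""
    prefix = []
    total = 0
    for v in values:
        total += v
        prefix.append(total)   # auxiliary list grows with the input
    return prefix[-1] if prefix else 0

data = [1, 2, 3, 4]
print(sum_constant_space(data), sum_linear_space(data))  # 10 10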
B. Time Complexity
• The time complexity is the number of operations an algorithm performs to
complete its task with respect to the input size (see the counting sketch below).
• The algorithm that performs the task in the smallest number of operations is
considered the most efficient one.
• It is commonly expressed in Big(oh) notation, which describes the maximum time taken by the program.
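
A minimal counting sketch (illustrative, not from the original notes): linear search
performs up to n comparisons, so its operation count, and hence its time complexity, grows
linearly with the input size.

# Sketch: counting comparisons in linear search, illustrative only.

def linear_search(values, target):
    """Return (index, comparisons); index is -1 if target is absent."""
    comparisons = 0
    for i, v in enumerate(values):
        comparisons += 1
        if v == target:
            return i, comparisons
    return -1, comparisons

data = list(range(100))
print(linear_search(data, 99))   # (99, 100) -> worst case: n comparisons
print(linear_search(data, 0))    # (0, 1)    -> best case: 1 comparison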
Asymptotic Notations
• The efficiency of an algorithm is measured with the help of asymptotic notations.
• Before designing and analyzing an algorithm, we must understand the terms used to
describe its running time, called asymptotic notations.
• Asymptotic notations are mathematical notations used to describe the running
time of an algorithm as the input size grows.
1. Big(oh) Notation:

Big-O notation represents the upper bound of the running time of an algorithm. Thus, it
gives the worst case complexity of an algorithm.

• Worst Case

• Upper Bound

• At most

• Least Upper Bound

O(g(n)) = { f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ cg(n) for all
n ≥ n0 }
For sufficiently large n, the running time of the algorithm does not exceed the bound
given by O(g(n)).
As n increases, f(n) grows no faster than g(n). In other words, g(n) is an asymptotic upper
bound on f(n).
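
For example (a worked check, not from the original notes), f(n) = 3n + 2 is O(n): choosing
c = 4 and n0 = 2 gives 0 ≤ 3n + 2 ≤ 4n for all n ≥ 2. The small Python sketch below
verifies the inequality numerically for a range of n.

# Sketch: numerically checking the Big-O definition for f(n) = 3n + 2, g(n) = n.
f = lambda n: 3 * n + 2
g = lambda n: n
c, n0 = 4, 2  # chosen constants
assert all(0 <= f(n) <= c * g(n) for n in range(n0, 1000))
print("f(n) = 3n + 2 is O(n) with c = 4, n0 = 2")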

2. Omega Notation (Ω-notation)


Omega notation represents the lower bound of the running time of an algorithm. Thus, it
provides the best case complexity of an algorithm.
• Best Case
• Lower Bound
• At least
• Greatest Lower Bound

Ω(g(n)) = { f(n): there exist positive constants c and n0 such that 0 ≤ cg(n) ≤ f(n) for all
n ≥ n0 }
For sufficiently large n, the minimum time required by the algorithm is given by
Ω(g(n)).
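
Similarly (an illustrative check, not from the original notes), f(n) = 3n + 2 is Ω(n): with
c = 3 and n0 = 1, the inequality 0 ≤ 3n ≤ 3n + 2 holds for all n ≥ 1.

# Sketch: numerically checking the Omega definition for f(n) = 3n + 2, g(n) = n.
f = lambda n: 3 * n + 2
g = lambda n: n
c, n0 = 3, 1  # chosen constants
assert all(0 <= c * g(n) <= f(n) for n in range(n0, 1000))
print("f(n) = 3n + 2 is Omega(n) with c = 3, n0 = 1")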

3. Theta Notation (Θ-notation)


It represents both the upper and the lower bound of the running time of an algorithm; it is
often used for analyzing the average case complexity of an algorithm.
• Average Case
• Exact Time
• Tight Bound
Θ(g(n)) = { f(n): there exist positive constants c1, c2 and n0 such that
0 ≤ c1g(n) ≤ f(n) ≤ c2g(n) for all n ≥ n0 }
If a function f(n) lies anywhere between c1g(n) and c2g(n) for all n ≥ n0, then g(n) is
said to be an asymptotically tight bound on f(n).
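
Combining the two checks above (illustrative), f(n) = 3n + 2 is Θ(n): with c1 = 3, c2 = 4
and n0 = 2, the inequality 0 ≤ 3n ≤ 3n + 2 ≤ 4n holds for all n ≥ 2.

# Sketch: numerically checking the Theta definition for f(n) = 3n + 2, g(n) = n.
f = lambda n: 3 * n + 2
g = lambda n: n
c1, c2, n0 = 3, 4, 2  # chosen constants
assert all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 1000))
print("f(n) = 3n + 2 is Theta(n) with c1 = 3, c2 = 4, n0 = 2")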
Asymptotic Notations (Example)
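
As a worked example (an illustrative sketch, not necessarily the example used in the
original lecture), the two functions below relate loop structure to growth rate: a single
loop over n items performs about n operations, so it is Θ(n), while two nested loops
perform about n² operations, so they are Θ(n²).

# Sketch: relating loop structure to asymptotic growth, illustrative only.

def single_loop(n):
    """About n additions -> Theta(n)."""
    count = 0
    for _ in range(n):
        count += 1
    return count

def nested_loops(n):
    """About n * n additions -> Theta(n^2)."""
    count = 0
    for _ in range(n):
        for _ in range(n):
            count += 1
    return count

print(single_loop(10), nested_loops(10))  # 10 100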
