22CS302_LM9

The document discusses algorithm efficiency and analysis, focusing on the importance of understanding time and space resources required for algorithms. It explains different types of algorithm analysis, including best, worst, and average cases, as well as asymptotic notations such as Big O, Omega, and Theta. The document emphasizes the significance of worst-case analysis in determining the upper bound of an algorithm's running time.


22XX302 - DATA STRUCTURES I

UNIT II - ALGORITHM EFFICIENCY

2.1 ALGORITHM EFFICIENCY USING ASYMPTOTIC NOTATIONS
Algorithm Analysis:
● Algorithm analysis is an important part of computational complexity theory, which
provides theoretical estimation for the required resources of an algorithm to solve a
specific computational problem.
● Analysis of algorithms is the determination of the amount of time and space resources
required to execute an algorithm.
● Algorithm analysis generally focuses on CPU (time) usage, memory usage, disk usage,
and network usage.
● All of these are important, but the greatest concern is usually CPU time.
● Performance: How much time/memory/disk/etc. is used when a program is run? This
depends on the machine, compiler, etc. as well as the code we write.
● Complexity: How do the resource requirements of a program or algorithm scale? That is,
what happens as the size of the problem being solved by the code gets larger?
Note: Complexity affects performance but not vice-versa.

Why is Analysis of Algorithms important?


● To predict the behavior of an algorithm without implementing it on a specific computer.
● It is much more convenient to have simple measures for the efficiency of an algorithm
than to implement the algorithm and test the efficiency every time a certain parameter
in the underlying computer system changes.
● It is impossible to predict the exact behavior of an algorithm. There are too many
influencing factors.
● The analysis is thus only an approximation; it is not perfect.
● More importantly, by analyzing different algorithms, we can compare them to
determine the best one for our purpose.
Types of Algorithm Analysis:
● Best case
● Worst case
● Average case
Best case:
● The best case is the input for which the algorithm takes the least (minimum) time.
Best-case analysis gives a lower bound on the running time of an algorithm.
● Example: In linear search, the best case occurs when the search key is present at the
first location of the data.
Worst Case:
● The worst case is the input for which the algorithm takes the longest (maximum) time.
Worst-case analysis gives an upper bound on the running time of an algorithm.
● Example: In linear search, the worst case occurs when the search key is not present
at all.
Average case:
● In the average case, we take all possible (random) inputs and calculate the computation
time for each input.
● We then divide the total by the number of inputs:
● Average case = (sum of running times over all inputs) / (total number of inputs)
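The three cases can be illustrated by counting comparisons in linear search (a minimal Python sketch; the function name and sample data are illustrative, not from the text):

```python
def linear_search_comparisons(data, target):
    """Count how many element comparisons linear search performs."""
    comparisons = 0
    for value in data:
        comparisons += 1
        if value == target:
            break
    return comparisons

data = [10, 20, 30, 40, 50]

best = linear_search_comparisons(data, 10)   # key at the first position: 1 comparison
worst = linear_search_comparisons(data, 99)  # key absent: n = 5 comparisons
# Average over every successful search position: (1 + 2 + 3 + 4 + 5) / 5 = 3
average = sum(linear_search_comparisons(data, key) for key in data) / len(data)
```

For a list of length n this sketch gives 1 comparison in the best case, n in the worst case, and about (n + 1) / 2 on average over successful searches.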

Asymptotic Notation and Analysis


● Asymptotic notation of an algorithm is a mathematical representation of its complexity.
● Asymptotic notation is a way to describe the running time of an algorithm based on the
input size.
● It is commonly used in complexity analysis to describe how an algorithm performs as
the size of the input grows.
● The three most commonly used notations are Big O, Omega, and Theta.
● Using asymptotic analysis, we can very well conclude the best case, average case, and
worst-case scenario of an algorithm.
● Asymptotic notations are the general representation of time and space complexity of an
algorithm
● In asymptotic notation, when we want to represent the complexity of an algorithm, we
use only the most significant term in the complexity of that algorithm and ignore the
less significant terms.
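To see why only the most significant term matters, we can compare a running-time function with its dominant term as n grows (a small Python sketch; the function f(n) = 3n + 2 here is just an example):

```python
# For f(n) = 3n + 2, the constant term 2 becomes negligible as n grows,
# so f(n) grows like its dominant term 3n, i.e. linearly in n.
def f(n):
    return 3 * n + 2

for n in (10, 1_000, 1_000_000):
    print(n, f(n) / n)  # the ratio approaches 3, the coefficient of the dominant term
```

This is exactly the simplification asymptotic notation performs: 3n + 2 is treated the same as n, because they differ only by a constant factor and a lower-order term.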
Types of Notations

● Big O notation (O - notation)


● Omega notation (Ω - notation)
● Theta notation (Θ - notation)
Big O notation (O - notation)
● Big - Oh notation is used to define the upper bound of an algorithm in terms of Time
Complexity.
● That means Big - Oh notation always indicates the maximum time required by an
algorithm for all input values.
● Big - Oh notation describes the worst case of an algorithm time complexity.
● The big-Oh notation allows us to ignore constant factors and lower-order terms and
focus on the main components of a function that affect its growth.
● Big - Oh Notation can be defined as follows.

✔ Consider function f(n) as time complexity of an algorithm and g(n) is the most
significant term.

✔ If there exist constants C > 0 and n0 >= 1 such that f(n) <= C g(n) for all
n >= n0, then we can represent f(n) as O(g(n)): f(n) = O(g(n))
● Consider the following graph drawn for the values of f(n) and C g(n) for input (n) value
on X-Axis and time required is on Y-Axis.
● In the graph, after a particular input value n0, C g(n) is always greater than f(n),
which indicates the algorithm's upper bound.

Example:
Consider the following f(n) and g(n).
f(n) = 3n + 2
g(n) = n
If we want to represent f(n) as O(g(n)), then we must find constants C > 0 and n0 >= 1
such that f(n) <= C g(n) for all n >= n0.
f(n) <= C g(n)
⇒ 3n + 2 <= C n
The condition holds for C = 4 and all n >= 2 (i.e., n0 = 2).
By using Big - Oh notation we can represent the time complexity as follows.
3n + 2 = O(n)
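The inequality in this example can be spot-checked numerically over a finite range (a Python sketch; the constants C = 4 and n0 = 2 are taken from the example above, and a finite check is only an illustration, not a proof):

```python
# Spot-check f(n) = 3n + 2 <= C * g(n) = 4n for all n >= n0 = 2.
def f(n):
    return 3 * n + 2

C, n0 = 4, 2
assert all(f(n) <= C * n for n in range(n0, 10_000))

# The bound fails below n0: f(1) = 5 > C * 1 = 4, which is why n0 matters.
assert f(1) > C * 1
```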
Omega notation (Ω - notation)
● This notation is used to define the lower bound of an algorithm in terms of Time
Complexity.
● That means Omega notation always indicates the minimum time required by an
algorithm for all input values.
● Omega notation describes the best case of an algorithm time complexity.
● Omega Notation can be defined as follows.
✔ Consider function f(n) as time complexity of an algorithm and g(n) is the most
significant term.
✔ If there exist constants C > 0 and n0 >= 1 such that f(n) >= C g(n) for all
n >= n0, then we can represent f(n) as Ω(g(n)): f(n) = Ω(g(n))
● Consider the following graph drawn for the values of f(n) and C g(n) for input (n) value
on X-Axis and time required is on Y-Axis.
● In the graph, after a particular input value n0, C g(n) is always less than f(n),
which indicates the algorithm's lower bound.

Example:
Consider the following f(n) and g(n).
f(n) = 3n + 2
g(n) = n

If we want to represent f(n) as Ω(g(n)), then we must find constants C > 0 and n0 >= 1
such that f(n) >= C g(n) for all n >= n0.
f(n) >= C g(n)
⇒ 3n + 2 >= C n

The condition holds for C = 1 and all n >= 1 (i.e., n0 = 1).


By using Omega notation we can represent the time complexity as follows...
3n + 2 = Ω(n)
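As with the Big-O example, the Omega inequality can be spot-checked numerically (a Python sketch; C = 1 and n0 = 1 come from the example above, and a finite check is an illustration, not a proof):

```python
# Spot-check f(n) = 3n + 2 >= C * g(n) = 1 * n for all n >= n0 = 1.
def f(n):
    return 3 * n + 2

C, n0 = 1, 1
assert all(f(n) >= C * n for n in range(n0, 10_000))
```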

Theta notation (Θ - notation)


● Theta notation is used to define a tight bound of an algorithm in terms of Time
Complexity: it bounds the running time from above and below at the same time.
● That means Theta notation indicates both an upper and a lower limit on the time
required by an algorithm for all sufficiently large input values.
● Theta notation describes the exact order of growth of an algorithm's time complexity.
(It is often loosely associated with the average case, but strictly it is a tight
bound, not an average.)
● Theta Notation can be defined as follows.
✔ Consider function f(n) as time complexity of an algorithm and g(n) is the most
significant term. If C1 g(n) <= f(n) <= C2 g(n) for all n >= n0, C1 > 0, C2 > 0
and n0 >= 1. Then we can represent f(n) as Θ(g(n)).
✔ f(n) = Θ(g(n))
● Consider the following graph drawn for the values of f(n) and C g(n) for input (n) value
on X-Axis and time required is on Y-Axis.
● In the graph, after a particular input value n0, C1 g(n) is always less than f(n) and
C2 g(n) is always greater than f(n), which shows that Θ bounds the algorithm from
both sides.

Example:
Consider the following f(n) and g(n).
f(n) = 3n + 2
g(n) = n
If we want to represent f(n) as Θ(g(n)), then we must find constants C1 > 0, C2 > 0 and
n0 >= 1 such that C1 g(n) <= f(n) <= C2 g(n) for all n >= n0.
C1 g(n) <= f(n) <= C2 g(n)
⇒ C1 n <= 3n + 2 <= C2 n
The condition holds for C1 = 1, C2 = 4 and all n >= 2 (i.e., n0 = 2).
By using Theta notation we can represent the time complexity as follows.
3n + 2 = Θ(n)
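The two-sided Theta inequality can likewise be spot-checked numerically (a Python sketch; C1 = 1, C2 = 4, and n0 = 2 come from the example above, and a finite check is an illustration, not a proof):

```python
# Spot-check C1 * n <= f(n) = 3n + 2 <= C2 * n for all n >= n0 = 2.
def f(n):
    return 3 * n + 2

C1, C2, n0 = 1, 4, 2
assert all(C1 * n <= f(n) <= C2 * n for n in range(n0, 10_000))
```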
Which Complexity analysis is generally used?

✔ Worst Case Analysis (Mostly used)


In the worst-case analysis, we calculate the upper bound on the running time of an
algorithm. We must know the case that causes the maximum number of operations to be
executed.
✔ Best Case Analysis (Very Rarely used)
In the best-case analysis, we calculate the lower bound on the running time of an
algorithm. We must know the case that causes the minimum number of operations to be
executed.
✔ Average Case Analysis (Rarely used)
In average case analysis, we take all possible inputs and calculate the computing time
for all of the inputs. Sum all the calculated values and divide the sum by the total
number of inputs.
