
CSPC24 Chapter 3 - Analysis of The Efficiency of Algorithm

The document discusses the analysis of algorithms, focusing on their efficiency in terms of time and space complexity. It covers theoretical and empirical analysis methods, asymptotic notations (Big O, Omega, Theta), and the importance of understanding best, average, and worst-case scenarios. Additionally, it provides examples of calculating time and space complexities for various algorithms.

CSPC 24

Algorithm and
Complexity
Analysis of The Efficiency of
Algorithm
Objectives
• Define and understand algorithm analysis
• Identify and understand the framework for analyzing the efficiency of algorithms
• Apply the different asymptotic notations in comparing orders of growth
• Differentiate recursive and non-recursive algorithms

ANALYSIS OF ALGORITHM
• The process of finding the computational complexity of algorithms – the amount of time, storage, or other resources needed to execute them.

• An important part of computational complexity theory, which provides a theoretical estimation of the resources an algorithm requires to solve a specific computational problem.
ANALYSIS OF ALGORITHM
• A priori analysis − This is defined as the theoretical analysis of an algorithm. Efficiency is measured by assuming that all other factors, e.g. processor speed, are constant and have no effect on the implementation.

• A posteriori analysis − This is defined as the empirical analysis of an algorithm. The chosen algorithm is implemented using a programming language and its actual behavior is measured.
Issues:
• correctness
• time efficiency
• space efficiency
• optimality

Approaches:
• theoretical analysis
• empirical analysis
THEORETICAL ANALYSIS OF TIME
EFFICIENCY
Time efficiency is analyzed by determining the number of
repetitions of the basic operation as a function of input size.

INPUT SIZE AND BASIC OPERATION EXAMPLES

Problem                                  | Input size measure                                 | Basic operation
Searching for a key in a list of n items | number of the list's items, i.e. n                 | key comparison
Multiplication of two matrices           | matrix dimensions or total number of elements      | multiplication of two numbers
Checking primality of a given integer n  | size = number of digits (in binary representation) | division
Typical graph problem                    | #vertices and/or edges                             | visiting a vertex or traversing an edge
EMPIRICAL ANALYSIS OF TIME
EFFICIENCY
• Select a specific (typical) sample of inputs

• Use a physical unit of time (e.g., milliseconds) or count the actual number of basic operation executions

• Analyze the empirical data


TIME EFFICIENCY
• The time complexity is the number of operations an algorithm performs to complete its task with respect to the input size.

• The algorithm that performs the task in the smallest number of operations is considered the most efficient one.
SPACE EFFICIENCY
• Space Complexity of an algorithm denotes the total
space used or needed by the algorithm for its working,
for various input sizes.
• For example, a vector of n elements occupies space proportional to n:

vector<int> myVec(n);          // allocates n integers → O(n) extra space
for (int i = 0; i < n; i++)
    myVec[i] = i;              // the loop adds only one variable, i
SPACE EFFICIENCY
• The space needed by an algorithm is equal to the sum of the following two components:
• A fixed part: space required to store certain data and variables that are independent of the size of the problem.
• A variable part: space required by variables whose size depends on the size of the problem.
• The space complexity of an algorithm represents the amount of memory space needed by the algorithm over its life cycle.
SPACE EFFICIENCY
Example

SUM(P, Q)
Step 1 – START
Step 2 – R ← P + Q
Step 3 – STOP

Here only three variables (P, Q, R) are used, so the space requirement is fixed and does not grow with the input.
BEST-CASE, AVERAGE-CASE, WORST-CASE
For some algorithms, efficiency depends on the form of the input:

• Worst case: Cworst(n) – maximum over inputs of size n
• Best case: Cbest(n) – minimum over inputs of size n
• Average case: Cavg(n) – "average" over inputs of size n

ORDER OF GROWTH
• Order of growth is how the time of execution depends on the length of the input.
• Order of growth helps us compare running times with ease.

Most important:
• Order of growth within a constant multiple as n → ∞

Example:
• How much faster will the algorithm run on a computer that is twice as fast?
ALGORITHM CASE EFFICIENCY
An algorithm can take different amounts of time for different inputs.
• Best case: This is the lower bound on the running time of an algorithm. We must know the case that causes the minimum number of operations to be executed.
• Average case: We calculate the running time for all possible inputs, sum all the calculated values, and divide the sum by the total number of inputs.
• Worst case: This is the upper bound on the running time of an algorithm. We must know the case that causes the maximum number of operations to be executed.
ALGORITHM CASE EFFICIENCY
Example:

• We have one array named "arr" and an integer "k".

• We need to find whether the integer "k" is present in the array "arr". If it is, return 1; otherwise return 0.

• Try to make an algorithm for this question.


ASYMPTOTIC NOTATION
• Used to analyze any algorithm and, based on that, find the most efficient one.
• Asymptotic notation does not consider the system configuration; rather, we consider the order of growth of the input.
• We try to find how the time or the space taken by the algorithm will increase/decrease after increasing/decreasing the input size.
BIG - O NOTATION
• The Big O notation defines an upper bound of an algorithm; it bounds a function only from above.
• The Big O notation is useful when we only have an upper bound on the time complexity of an algorithm.
• Formally: O(g(n)) = {f(n): there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀}.
OMEGA NOTATION (Ω)
• Describes the best-case scenario.
• It represents the lower bound on the running-time complexity of an algorithm.
• For a given function g(n), Ω(g(n)) denotes the set of functions:
Ω(g(n)) = {f(n): there exist positive constants c and n₀ such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀}.
THETA NOTATION (Θ)
• Bounds a function from above and below, so it defines exact asymptotic behavior.

• A simple way to get the Theta notation of an expression is to drop low-order terms and ignore leading constants.
PROPERTIES OF ASYMPTOTIC
NOTATIONS
• General property: If f(n) is O(g(n)), then a·f(n) is also O(g(n)), where a is a constant.

Example: f(n) = 2n² + 5 is O(n²)
then 7·f(n) = 7(2n² + 5) = 14n² + 35 is also O(n²)
PROPERTIES OF ASYMPTOTIC
NOTATIONS
• Reflexive property: If f(n) is given, then f(n) is O(f(n)).

Example: f(n) = n² is O(n²), i.e., O(f(n))

PROPERTIES OF ASYMPTOTIC
NOTATIONS
• Transitive property: If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) = O(h(n)).

Example: if f(n) = n, g(n) = n², and h(n) = n³,
then n is O(n²) and n² is O(n³), so n is O(n³)
PROPERTIES OF ASYMPTOTIC
NOTATIONS
• Symmetric property: If f(n) is Θ(g(n)), then g(n) is Θ(f(n)).

Example: f(n) = n² and g(n) = n²
then f(n) = Θ(n²) and g(n) = Θ(n²)
PROPERTIES OF ASYMPTOTIC
NOTATIONS
• Transpose symmetric property: If f(n) is O(g(n)), then g(n) is Ω(f(n)).

Example: f(n) = n, g(n) = n²
then n is O(n²) and n² is Ω(n)
Some more properties:
• If f(n) = O(g(n)) and f(n) = Ω(g(n)), then f(n) = Θ(g(n))

• If f(n) = O(g(n)) and d(n) = O(e(n)),
then f(n) + d(n) = O(max(g(n), e(n)))

Example: f(n) = n, i.e., O(n)
d(n) = n², i.e., O(n²)
then f(n) + d(n) = n + n², i.e., O(n²)
Some more properties:
• If f(n) = O(g(n)) and d(n) = O(e(n)),
then f(n) * d(n) = O(g(n) * e(n))

Example: f(n) = n, i.e., O(n)
d(n) = n², i.e., O(n²)
then f(n) * d(n) = n * n² = n³, i.e., O(n³)
How to Calculate the Time
and Space Complexity
How to Calculate the Time Complexity
• Break your algorithm/function into individual operations.
• Calculate the Big O of each operation.
• Add up the Big O of each operation together.
• Remove the constants.
• Find the highest order term.
Example 1 (Time Complexity)

arr_x = [1, 20, 5, 4, 9]

sum = 0;
for (int i = 0; i < n; i++)
{
    sum = sum + arr_x[i];
}
Example 1 (Time Complexity)
Break your algorithm/function into individual operations.

sum = 0;
for (int i = 0; i < n; i++)
{
    sum = sum + arr_x[i];
}
Example 1 (Time Complexity)
Calculate the Big O of each operation.

sum = 0;                        // 1
for (int i = 0; i < n; i++)     // 1 + (n + 1) + n
{
    sum = sum + arr_x[i];       // 1 * n = n
}
Example 1 (Time Complexity)
Add up the Big O of each operation together.

sum = 0;                        // 1
for (int i = 0; i < n; i++)     // 1 + (n + 1) + n
{
    sum = sum + arr_x[i];       // n
}
Total: 3n + 3
Example 1 (Time Complexity)
Remove the constants.

sum = 0;                        // 1
for (int i = 0; i < n; i++)     // 1 + (n + 1) + n
{
    sum = sum + arr_x[i];       // n
}
Total: 3n + 3 → 3n
Example 1 (Time Complexity)
Find the highest order term.

sum = 0;                        // 1
for (int i = 0; i < n; i++)     // 1 + (n + 1) + n
{
    sum = sum + arr_x[i];       // n
}
Highest order term: n

Time Complexity: O(n)


Example 1 (Space Complexity)

sum = 0;
for (int i = 0; i < n; i++)
{
    sum = sum + arr_x[i];
}

Variables: arr_x, sum, i, n
Example 1 (Space Complexity)

sum = 0;
for (int i = 0; i < n; i++)
{
    sum = sum + arr_x[i];
}

arr_x : n
sum   : 1
i     : 1
n     : 1
Example 1 (Space Complexity)

arr_x : n
sum   : 1
i     : 1
n     : 1
Total : n + 3

Space Complexity: O(n)
Example 2 (Time Complexity)

sum = 0;                            // 1
for (int x = 0; x < n; x++)         // 1 + (n + 1) + n
{
    for (int y = 0; y < n; y++)
    {
        sum = sum + arr_x[y];
    }
}
Example 2 (Time Complexity)

sum = 0;                            // 1
for (int x = 0; x < n; x++)         // 2 + 2n
{
    for (int y = 0; y < n; y++)
    {
        sum = sum + arr_x[y];
    }
}
Example 2 (Time Complexity)

sum = 0;                            // 1
for (int x = 0; x < n; x++)         // n
{
    for (int y = 0; y < n; y++)     // n
    {
        sum = sum + arr_x[y];       // n
    }
}
Example 2 (Time Complexity)

sum = 0;                            // 1
for (int x = 0; x < n; x++)         // n
{
    for (int y = 0; y < n; y++)     // n * (1 + (n + 1) + n)
    {
        sum = sum + arr_x[y];       // n
    }
}
Example 2 (Time Complexity)

sum = 0;                            // 1
for (int x = 0; x < n; x++)         // n
{
    for (int y = 0; y < n; y++)     // n * (2 + 2n)
    {
        sum = sum + arr_x[y];       // n
    }
}
Example 2 (Time Complexity)

sum = 0;                            // 1
for (int x = 0; x < n; x++)         // n
{
    for (int y = 0; y < n; y++)     // n * n
    {
        sum = sum + arr_x[y];       // n
    }
}
Example 2 (Time Complexity)

sum = 0;                            // 1
for (int x = 0; x < n; x++)         // n
{
    for (int y = 0; y < n; y++)     // n²
    {
        sum = sum + arr_x[y];       // n²
    }
}
Total: 2n² + n + 1
Example 2 (Time Complexity)

sum = 0;                            // 1
for (int x = 0; x < n; x++)         // n
{
    for (int y = 0; y < n; y++)     // n²
    {
        sum = sum + arr_x[y];       // n²
    }
}
Total: 2n² + n + 1

Time Complexity: O(n²)
Example 2 (Space Complexity)

sum = 0;
for (int x = 0; x < n; x++)
{
    for (int y = 0; y < n; y++)
    {
        sum = sum + arr_x[y];
    }
}

arr_x : n
sum   : 1
x     : 1
y     : 1
n     : 1
Total : n + 4

Space Complexity: O(n)
Thank you!
