CS 203 Data Structure & Algorithms: Performance Analysis
Performance Analysis
Properties of an Algorithm
• Every algorithm must satisfy the following properties:
– Input- There should be 0 or more inputs supplied externally to the
algorithm.
– Output- There should be at least 1 output obtained.
– Definiteness- Every step of the algorithm should be clear and well
defined.
– Finiteness- The algorithm should have a finite number of steps.
– Correctness- Every step of the algorithm must generate a correct
output.
• An algorithm is said to be efficient and fast if it takes less time to
execute and consumes less memory space.
• The performance of an algorithm is measured on the basis of the
following properties:
– Time Complexity
– Space Complexity
Performance Analysis
• Why?
– There are often multiple algorithms to solve the same problem.
– When more than one algorithm exists for a problem, we need to
select the best one.
• Definition
– Performance analysis of an algorithm is the process of making an
evaluative judgement about that algorithm.
– In other words, performance analysis means predicting the
resources that an algorithm requires to perform its task.
Performance Analysis
• Performance analysis of an algorithm is the process
of calculating the space required by that algorithm and
the time required by that algorithm.
– Space Complexity: the space required to complete the task of
that algorithm
• includes program space
• and data space
– Time Complexity: the time required to complete the task of
that algorithm
Space Complexity
• The total amount of computer memory required by an algorithm
to complete its execution is called the space complexity of that
algorithm.
• For any algorithm, memory is required for the following
purposes...
– Memory required to store program instructions
– Memory required to store constant values
– Memory required to store variable values
– And for a few other things
• Generally, when a program is under execution it uses the
computer memory for THREE reasons:
– Instruction Space: the amount of memory used to store the compiled
version of the instructions.
– Environmental Stack: the amount of memory used to store
information about partially executed functions at the time of a function call.
– Data Space: the amount of memory used to store all the variables
and constants.
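The Environmental Stack is easiest to see with a recursive function. The sketch
below is not from the slides; it assumes the usual call-stack behaviour, in which
every call that has started but not yet finished keeps its own copy of its
parameter and a return address on the stack.

/* Illustrative sketch (not from the slides), assuming the usual call-stack
   behaviour: each partially executed call keeps its own copy of 'n' and a
   return address on the environmental stack. */
int factorial(int n) {
    if (n <= 1)
        return 1;                  /* base case: recursion stops here */
    return n * factorial(n - 1);   /* n - 1 pending calls pile up on the stack, */
}                                  /* so stack space grows in proportion to n */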
Space Complexity: Example
• Consider only Data Space and ignore Instruction Space as well as
Environmental Stack
int square(int a) {   // 'a' : 4 bytes
    return a * a;     // return value : 4 bytes
}
// These 8 bytes of memory are fixed for any input value of 'a'.
Assume:
4 bytes to store Integer value,
4 bytes to store Floating Point value,
1 byte to store Character value,
8 bytes to store double value
If an algorithm requires a fixed amount of space for all input values, then its
space complexity is said to be Constant Space Complexity.
Space Complexity: Example
• Consider only Data Space and ignore Instruction Space as well as
Environmental Stack

int sum(int A[], int n) {    // A[] : n*4 bytes, n : 4 bytes
    int sum = 0, i;          // sum : 4 bytes, i : 4 bytes
    for(i = 0; i < n; i++)
        sum = sum + A[i];
    return sum;              // return value : 4 bytes
}
// Total: 4n + 16 bytes

The amount of memory depends on the input value of 'n'. This space
complexity is said to be Linear Space Complexity.
Time Complexity
• The total amount of time required by an algorithm to complete its
execution is called the time complexity of that algorithm.
• If a program requires a fixed amount of time for all input values, then its
time complexity is said to be Constant Time Complexity.
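As a sketch of constant time, reusing the square function from the space
complexity example and assuming a cost model of one unit per elementary
operation (the cost model is an assumption, not stated on the slides):

int square(int a) {   // one multiplication + one return: a fixed number of units
    return a * a;     // the count does not depend on the value of 'a'
}
// The running time is the same for every input, i.e. Constant Time Complexity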
Time Complexity: Example
int sum(int A[], int n) {
    int sum = 0, i;
    for(i = 0; i < n; i++)
        sum = sum + A[i];
    return sum;
}
• Total 4n + 4 units of time to complete its execution
• Linear Time Complexity
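One plausible way to arrive at the 4n + 4 figure, assuming one unit of time per
elementary operation (the slides state only the total, so this breakdown is an
assumed cost model):

int sum(int A[], int n) {
    int sum = 0, i;           // sum = 0 : 1 unit
    for(i = 0; i < n; i++)    // i = 0 : 1 unit, i < n : n + 1 units, i++ : n units
        sum = sum + A[i];     // executed n times, 2 units each (add + assign) : 2n units
    return sum;               // return : 1 unit
}
// Total: 1 + 1 + (n + 1) + n + 2n + 1 = 4n + 4 units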
Asymptotic Notation
• Consider two algorithms for the same problem with running times:
– Algorithm 1 : 5n² + 2n + 1
– Algorithm 2 : 10n² + 8n + 3
• In both cases the most significant term is n², so the two algorithms grow
at the same rate; asymptotic notation expresses the complexity using only
this dominant term.
Asymptotic Notation: Big - Oh Notation (O)
• Big - Oh notation is used to define the upper bound of an algorithm in
terms of Time Complexity.
• Consider function f(n) as the time complexity of an algorithm and g(n) as
the most significant term. If f(n) <= C g(n) for all n >= n0, for some
constants C > 0 and n0 >= 1, then we can represent f(n) as O(g(n)).
f(n) = O(g(n))
Asymptotic Notation: Big - Oh Notation (O)
• Example
– Consider the following f(n) and g(n)...
f(n) = 3n + 2
g(n) = n
If we want to represent f(n) as O(g(n)) then we must find constants C > 0
and n0 >= 1 such that f(n) <= C g(n) for all n >= n0
– f(n) <= C g(n)
3n + 2 <= C n
– The above condition is TRUE for C = 4 and all n >= 2 (since 3n + 2 <= 4n
whenever n >= 2). By using Big - Oh notation we can represent the time
complexity as follows...
3n + 2 = O(n)
Asymptotic Notation: Big - Omega Notation (Ω)
• Big - Omega notation is used to define the lower bound of an algorithm
in terms of Time Complexity.
• Consider function f(n) as the time complexity of an algorithm and g(n) as
the most significant term. If f(n) >= C g(n) for all n >= n0, for some
constants C > 0 and n0 >= 1, then we can represent f(n) as Ω(g(n)).
f(n) = Ω(g(n))
Asymptotic Notation: Big - Omega Notation (Ω)
• Example
– Consider the following f(n) and g(n)...
f(n) = 3n + 2
g(n) = n
If we want to represent f(n) as Ω(g(n)) then we must find constants C > 0
and n0 >= 1 such that f(n) >= C g(n) for all n >= n0
– f(n) >= C g(n)
3n + 2 >= C n
• The above condition is TRUE for C = 1 and all n >= 1.
By using Big - Omega notation we can represent the time complexity as
follows...
3n + 2 = Ω(n)
Asymptotic Notation: Big - Theta Notation (Θ)
• Big - Theta notation is used to define the tight bound (both the lower
bound and the upper bound) of an algorithm in terms of Time Complexity.
• Consider function f(n) as the time complexity of an algorithm and g(n) as
the most significant term. If C1 g(n) <= f(n) <= C2 g(n) for all n >= n0, for
some constants C1, C2 > 0 and n0 >= 1, then we can represent f(n) as
Θ(g(n)).
f(n) = Θ(g(n))
Asymptotic Notation: Big - Theta Notation (Θ)
• Example
– Consider the following f(n) and g(n)...
f(n) = 3n + 2
g(n) = n
If we want to represent f(n) as Θ(g(n)) then we must find constants C1,
C2 > 0 and n0 >= 1 such that C1 g(n) <= f(n) <= C2 g(n) for all n >= n0
– C1 g(n) <= f(n) <= C2 g(n)
C1 n <= 3n + 2 <= C2 n
– The above condition is TRUE for C1 = 1, C2 = 4 and all n >= 2. By using
Big - Theta notation we can represent the time complexity as follows...
3n + 2 = Θ(n)
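As a quick sanity check of these constants (C1 = 1, C2 = 4, n0 = 2 is one valid
choice, not the only one), a small program that verifies C1 n <= 3n + 2 <= C2 n
over a range of n:

#include <stdio.h>

/* Verifies C1*g(n) <= f(n) <= C2*g(n) for f(n) = 3n + 2 and g(n) = n,
   with C1 = 1, C2 = 4 and n0 = 2 (one valid choice of constants). */
int main(void) {
    int c1 = 1, c2 = 4;
    for (int n = 2; n <= 20; n++) {
        int f = 3 * n + 2;
        printf("n = %2d : %2d <= %2d <= %2d  %s\n",
               n, c1 * n, f, c2 * n,
               (c1 * n <= f && f <= c2 * n) ? "holds" : "FAILS");
    }
    return 0;
}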