
CS 203

Data Structure & Algorithms

Performance Analysis
Properties of Algorithm
• Every algorithm must satisfy the following properties:
– Input: There should be 0 or more inputs supplied externally to the algorithm.
– Output: There should be at least 1 output.
– Definiteness: Every step of the algorithm should be clear and well defined.
– Finiteness: The algorithm should have a finite number of steps.
– Correctness: Every step of the algorithm must generate correct output.
• An algorithm is said to be efficient and fast if it takes less time to execute and consumes less memory space.
• The performance of an algorithm is measured on the basis of the following properties:
– Time Complexity
– Space Complexity
Performance Analysis
• Why?
– There are multiple algorithms to solve a problem.
– If there is more than one algorithm to solve a problem, we need to select the best one.
• Definition
– Performance analysis of an algorithm is the process of making an evaluative judgement about algorithms.
– In other words, analysing the performance of an algorithm means predicting the resources required by the algorithm to perform its task.
Performance Analysis
• Performance analysis of an algorithm is the process of calculating the space required by that algorithm and the time required by that algorithm.
– Space Complexity: the space required to complete the task of the algorithm
• includes program space
• and data space
– Time Complexity: the time required to complete the task of the algorithm
Space Complexity
• The total amount of computer memory required by an algorithm to complete its execution is called the space complexity of that algorithm.
• For any algorithm, memory is required for the following purposes...
– Memory to store program instructions
– Memory to store constant values
– Memory to store variable values
– And a few other things
• Generally, when a program is under execution it uses computer memory for THREE reasons:
– Instruction Space: the amount of memory used to store the compiled version of the instructions.
– Environmental Stack: the amount of memory used to store information about partially executed functions at the time of a function call.
– Data Space: the amount of memory used to store all the variables and constants.
Space Complexity: Example
• Consider only Data Space; ignore Instruction Space as well as the Environmental Stack.

int square(int a) {    // 'a': 4 bytes
    return a*a;        // return value: 4 bytes
}

These 8 bytes of memory are fixed for any input value of 'a'.

Assume:
4 bytes to store an integer value,
4 bytes to store a floating-point value,
1 byte to store a character value,
8 bytes to store a double value.

If an algorithm requires a fixed amount of space for all input values, then its space complexity is said to be Constant Space Complexity.
Space Complexity: Example
• Consider only Data Space; ignore Instruction Space as well as the Environmental Stack.

int sum(int A[], int n) {    // A[]: n*4 bytes, 'n': 4 bytes
    int sum = 0, i;          // 'sum': 4 bytes, 'i': 4 bytes
    for(i = 0; i < n; i++)
        sum = sum + A[i];
    return sum;              // return value: 4 bytes
}

Total: 4n + 16 bytes. The amount of memory depends on the input value of 'n'; this space complexity is said to be Linear Space Complexity.

Assume:
4 bytes to store an integer value,
4 bytes to store a floating-point value,
1 byte to store a character value,
8 bytes to store a double value.

If the amount of space required by an algorithm increases with the input value, then its space complexity is said to be Linear Space Complexity.
Time Complexity
• The time complexity of an algorithm is the total amount of time required by the algorithm to complete its execution.
• The running time of an algorithm depends upon the following...
– Whether it runs on a single-processor or a multi-processor machine.
– Whether it is a 32-bit or a 64-bit machine.
– The read and write speed of the machine.
– The time it takes to perform arithmetic operations, logical operations, return-value and assignment operations, etc.
– The input data.

When we calculate the time complexity of an algorithm, we consider only the input data and ignore the remaining factors, as they are machine dependent.
Time Complexity

for(i=0; i < N; i++) {
    statement;
}
• On a server machine, it requires 0.1 ns per statement.

for(i=0; i < N; i++) {
    for(j=0; j < N; j++) {
        statement;
    }
}
• On a desktop machine, it requires 0.2 ns per statement.

Which one is better?

When we calculate the time complexity of an algorithm, we consider only the input data and ignore the remaining factors, as they are machine dependent.
Time Complexity: Example

int sum(int a, int b) {
    return a+b;
}

• 1 unit of time to calculate a+b.
• 1 unit of time to return the value.
• Total: 2 units of time to complete its execution.

If a program requires a fixed amount of time for all input values, then its time complexity is said to be Constant Time Complexity.
Time Complexity: Example

int sum(int A[], int n) {
    int sum = 0, i;
    for(i = 0; i < n; i++)
        sum = sum + A[i];
    return sum;
}

• Total: 4n + 4 units of time to complete its execution.
• Linear Time Complexity.

If the amount of time required by an algorithm increases with the input value, then its time complexity is said to be Linear Time Complexity.
Time Complexity: Example

statement;

Constant Time Complexity

for(i=0; i < N; i++) {
    statement;
}

Linear Time Complexity

for(i=0; i < N; i++) {
    for(j=0; j < N; j++) {
        statement;
    }
}

Quadratic Time Complexity

while(low <= high) {
    mid = (low + high) / 2;
    if (target < list[mid])
        high = mid - 1;
    else if (target > list[mid])
        low = mid + 1;
    else break;
}

Logarithmic Time Complexity


Asymptotic Notation
• The asymptotic notation of an algorithm is a mathematical representation of its complexity.
– To represent the complexity of an algorithm, consider only the most significant term in the complexity and ignore the least significant terms.

• Algorithm 1: 5n² + 2n + 1
• Algorithm 2: 10n² + 8n + 3

In both cases the most significant term is the n² term, so both algorithms are represented by n².

• Mainly, we use THREE types of asymptotic notations, and those are as follows...
– Big - Oh (O)
– Big - Omega (Ω)
– Big - Theta (Θ)
Asymptotic Notation: Big - Oh Notation (O)
• Big - Oh notation is used to define the upper bound of an algorithm in terms of time complexity.
• Consider f(n) as the time complexity of an algorithm and g(n) as its most significant term. If f(n) <= C g(n) for all n >= n0, for some constants C > 0 and n0 >= 1, then we can represent f(n) as O(g(n)).

f(n) = O(g(n))
Asymptotic Notation: Big - Oh Notation (O)
• Example
– Consider the following f(n) and g(n)...
f(n) = 3n + 2
g(n) = n
If we want to represent f(n) as O(g(n)), then it must satisfy f(n) <= C g(n) for some C > 0 and n0 >= 1.
– f(n) <= C g(n)
3n + 2 <= C n

The above condition is TRUE for C = 4 and all n >= 2.

By using Big - Oh notation, we can represent the time complexity as follows...
3n + 2 = O(n)
Asymptotic Notation: Big - Omega Notation (Ω)
• Big - Omega notation is used to define the lower bound of an algorithm in terms of time complexity.
• Consider f(n) as the time complexity of an algorithm and g(n) as its most significant term. If f(n) >= C g(n) for all n >= n0, for some constants C > 0 and n0 >= 1, then we can represent f(n) as Ω(g(n)).

f(n) = Ω(g(n))
Asymptotic Notation: Big - Omega Notation (Ω)
• Example
– Consider the following f(n) and g(n)...
f(n) = 3n + 2
g(n) = n
If we want to represent f(n) as Ω(g(n)), then it must satisfy f(n) >= C g(n) for some C > 0 and n0 >= 1.
– f(n) >= C g(n)
3n + 2 >= C n
• The above condition is TRUE for C = 1 and all n >= 1.
By using Big - Omega notation, we can represent the time complexity as follows...
3n + 2 = Ω(n)
Asymptotic Notation: Big - Theta Notation (Θ)
• Big - Theta notation is used to define a tight bound (both the lower and the upper bound) of an algorithm in terms of time complexity.
• Consider f(n) as the time complexity of an algorithm and g(n) as its most significant term. If C1 g(n) <= f(n) <= C2 g(n) for all n >= n0, for some constants C1, C2 > 0 and n0 >= 1, then we can represent f(n) as Θ(g(n)).

f(n) = Θ(g(n))
Asymptotic Notation: Big - Theta Notation (Θ)
• Example
– Consider the following f(n) and g(n)...
f(n) = 3n + 2
g(n) = n
If we want to represent f(n) as Θ(g(n)), then it must satisfy C1 g(n) <= f(n) <= C2 g(n) for some C1, C2 > 0 and n0 >= 1.
– C1 g(n) <= f(n) <= C2 g(n)
C1 n <= 3n + 2 <= C2 n

• The above condition is TRUE for C1 = 1, C2 = 4 and all n >= 2.

By using Big - Theta notation, we can represent the time complexity as follows...
3n + 2 = Θ(n)
End of Lecture 2
Reference
• http://btechsmartclass.com
