Lecture 04 Calculating Time Complexity

The document discusses the concepts of time and space complexity in algorithms, emphasizing their importance in evaluating the efficiency of programs. It outlines different types of time complexities, such as constant, linear, exponential, and logarithmic, along with examples of calculating these complexities. Additionally, it explains space complexity and its components, providing a formula for its calculation.

Lecture 04

Calculating Time Complexity

Ms. Madiha Rehman


[email protected]

Institute of Computer Science


Khwaja Fareed UEIT
Complexity Analysis
• The primary motive for using data structures and algorithms (DSA) is to solve a problem effectively and efficiently.
• How can you decide whether a program you have written is efficient?
• Efficiency is measured in terms of complexity.
• Complexity is of two types:
• Time
• Space
Space Complexity
• The space complexity of an algorithm is the amount of memory the algorithm needs to run for a given input size.
• A program has certain space requirements for its execution; these include auxiliary space and input space.
• The space taken by an algorithm for a given input size is an important standard for comparing algorithms.
• Hence it needs to be optimized.
Time Complexity
• The time complexity of an algorithm represents the amount of time required by the algorithm to run to completion.
• Time requirements can be defined as a numerical function T(n), where T(n) can be measured as the number of steps.
• For example, adding two n-bit integers takes n steps.
• Thus, the total computational time is T(n) = c * n, where c is the time taken for the addition of two bits.
• Here, we observe that T(n) grows linearly as the input size increases.
Measuring Complexities
Below are some major orders of complexity:
• Constant: if the algorithm runs in the same amount of time every time, irrespective of the input size, it is said to exhibit constant time complexity.
• Linear: if the algorithm's runtime is linearly proportional to the input size, the algorithm is said to exhibit linear time complexity.
• Exponential: if the algorithm's runtime grows as a constant raised to the power of the input size, it is said to exhibit exponential time complexity.
• Logarithmic: when the algorithm's runtime increases very slowly compared to the increase in input size, i.e. as the logarithm of the input size, the algorithm is said to exhibit logarithmic time complexity.
Measurement of Complexity Analysis
Calculating Time Complexity
• Loop - Multiplication

for (i = 1; i <= n; i++)   // executes n times
{
    a = a + b;             // constant work c per iteration
}

Total: n x c = c * n = cn = O(n)
Calculating Time Complexity
• Nested Loop - Multiplication

for (i = 1; i <= n; i++)       // outer loop: n times
{
    for (j = 1; j <= n; j++)   // inner loop: n times per outer iteration
    {
        a = a + b;             // constant work c
    }
}

Total: n x n = n², so c * n * n = cn² = O(n²)
Calculating Time Complexity
• Consecutive Statements - Addition

a = a + b;                 // constant, c1

for (i = 1; i <= n; i++)   // n iterations
{
    x = x + y;             // constant, c2 -> contributes c2 * n
}

for (j = 1; j <= n; j++)   // n iterations
{
    c = c + d;             // constant, c3 -> contributes c3 * n
}

Total: c1 + c2n + c3n = O(n)
Calculating Time Complexity
• if-else Statements

if (condition)
{
    for (i = 1; i <= n; i++)          // this branch is O(n)
    {
        a = a + b;
    }
}
else
{
    for (i = 1; i <= n; i++)          // this branch is O(n²)
    {
        for (j = 1; j <= n; j++)
        {
            c = c + d;
        }
    }
}

Worst case: the larger of O(n) and O(n²), i.e. O(n²)
Calculating Time Complexity
• Logarithmic Complexity log(n):
• Represented in Big O notation as O(log n).
• When an algorithm has O(log n) running time, it means that:
• as the input size grows, the number of operations grows very slowly.
• Example: binary search.
Logarithmic Example
• Example 1: logₐ b
• Task: We have a number N with an initial value of 16, and the task is to reduce the number to 1 by repeated division by 2.
Approach:
• Initialize a variable number_of_operations with a value of 0.
• Run a for loop from N down to 1.
• In each iteration, reduce the value of N to half.
• Increment the number_of_operations variable by one.
• Return the number_of_operations variable.
// C++ code for reducing a number N to 1 by repeated halving;
// the loop runs floor(log2(N)) times.
#include <bits/stdc++.h>
using namespace std;

int main()
{
    int N = 16;
    int number_of_operations = 0;

    cout << "Logarithmic reduction of N: ";

    for (int i = N; i > 1; i = i / 2) {
        cout << i << " ";
        number_of_operations++;
    }
    cout << '\n'
         << "Algorithm Runtime for reducing N to 1: "
         << number_of_operations;
    return 0;
}
Logarithmic Example
• Binary search is the classic standard algorithm with logarithmic, O(log n), time complexity.
• Merge sort and heap sort are not purely logarithmic: they run in O(n log n) time, but the log n factor comes from the same repeated-halving idea.
Algorithm Complexity
• Suppose X is an algorithm and n is the size of the input data. The efficiency of X is usually measured by two factors:
• Time Factor − Time is measured by counting the number of key operations, such as comparisons in a sorting algorithm.
• Space Factor − Space is measured by counting the maximum memory space required by the algorithm.
Space Complexity
• Space complexity of an algorithm represents the amount of memory space required by the algorithm in its life cycle.
• The space required by an algorithm is equal to the sum of the following two components:
• A fixed part: the space required to store certain data and variables that are independent of the size of the problem. For example, simple variables and constants used, program size, etc.
• A variable part: the space required by variables whose size depends on the size of the problem. For example, dynamic memory allocation, recursion stack space, etc.
Space Complexity
• Space complexity S(P) of any algorithm P is
• S(P) = C + S(I),
• where C is the fixed part and
• S(I) is the variable part of the algorithm
Space Complexity
• Following is a simple example that explains the concept −

Algorithm: SUM(A, B)
Step 1 - START
Step 2 - C ← A + B + 10
Step 3 - Stop
• Here we have three variables (A, B, and C) and one constant (10).
• Hence S(P) = 3 + 1 = 4. The actual space depends on the data types of the variables and constants, and each unit is multiplied by its type's size accordingly.
