Time Complexity
By: Manoj
• Defined as the amount of time taken by an algorithm to run, as a function of the
length of the input.
• The idea behind time complexity is to measure an algorithm's running time in a way that
depends only on the algorithm and its input, not on the machine it runs on.
Common notations used to express time complexity include:
• Big-O (O) notation: denotes the worst case, i.e. an upper bound on an algorithm's running time.
Constant time – O(1)
• Irrespective of the input size n, the runtime always stays the same.
#include <iostream>
using namespace std;

int main() {
    int x = 42;
    // A single statement executes in constant time, regardless of any input
    cout << "The value of x is: " << x << endl;
    return 0;
}
Example 2
#include <iostream>

// square performs a single multiplication, so it runs in constant time
int square(int n) {
    return n * n;
}

int main() {
    int num = 5;
    int result = square(num);
    std::cout << "The square of " << num << " is: " << result << std::endl;
    return 0;
}
• In this example, the square function calculates the square of an integer n.
• The time it takes to execute this function is constant, regardless of the value of
n.
Linear time – O(n)
• When the algorithm involves checking every value in the input data, it runs in this order: O(n).
• Example: This is like making a sandwich for each person at a picnic. If you have
10 people, you make 10 sandwiches. If you have 100 people, you make 100
sandwiches. The time it takes grows directly with the number of people.
Example
#include <iostream>
using namespace std;

int main() {
    int n;
    cin >> n;
    int sum = 0;
    // The loop body runs n times, so the work grows linearly with n
    for (int i = 1; i <= n; i++) {
        sum += i;
    }
    cout << "The sum of numbers from 1 to " << n << " is: " << sum << endl;
    return 0;
}
• In this code, the program calculates the sum of numbers from 1 to the input value n
using a loop.
• The time it takes to execute the loop is directly proportional to the value of n, so the
time complexity is O(n).
Logarithmic time – O (log n)
• Logarithmic time complexity means the number of operations grows in proportion to the
logarithm of the input size: each step discards a large fraction (typically half) of the
remaining input, so even doubling the input adds only one extra step.
• Example: Imagine we have a phone book with a lot of names, and we're trying to find a
name. We can quickly narrow down our search by looking in the middle of the book first,
and then in the middle of the remaining half, and so on. It's faster than looking at every
page one by one.
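The phone-book idea above is essentially binary search on a sorted array. Below is a minimal
sketch (the array contents and target value are illustrative, not from the original slides);
each iteration halves the remaining search range, so the loop runs O(log n) times.
#include <iostream>
using namespace std;

// Binary search over a sorted array: each step halves the range, so O(log n)
int binarySearch(int arr[], int size, int target) {
    int low = 0, high = size - 1;
    while (low <= high) {
        int mid = low + (high - low) / 2;
        if (arr[mid] == target) return mid;        // found the target
        else if (arr[mid] < target) low = mid + 1; // discard the left half
        else high = mid - 1;                       // discard the right half
    }
    return -1; // target not present
}

int main() {
    int sorted[] = {3, 8, 12, 17, 25, 31, 42};
    cout << "Index of 25: " << binarySearch(sorted, 7, 25) << endl;
    return 0;
}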
Quadratic time – O (n^2)
• The execution time grows with the square of the input size.
• Example: Imagine comparing each item in a list to every other item. If you have 10
items, it's like doing 10 × 10 = 100 comparisons.
Example 1
#include <iostream>
using namespace std;

// Two nested loops compare every element with every other element: O(n^2)
int findMax(int arr[], int size) {
    int maxElement = arr[0];
    for (int i = 0; i < size; i++) {
        bool isLargest = true;                      // assume arr[i] is the maximum
        for (int j = 0; j < size; j++)
            if (arr[j] > arr[i]) isLargest = false; // some other element is larger
        if (isLargest) maxElement = arr[i];
    }
    return maxElement;
}

int main() {
    int myArray[] = {12, 5, 21, 8, 17, 6};
    int arraySize = 6;
    cout << "The maximum element is: " << findMax(myArray, arraySize) << endl;
    return 0;
}
• In this code, the findMax function uses two nested loops to compare every
element in the array with every other element to find the maximum.
• Since there are two nested loops, the time complexity of this algorithm is O(n^2).
Exponential Time - O(2^n)
• The execution time grows exponentially with the input size.
• As the size of the input data (n) increases, the number of operations performed
by the algorithm roughly doubles with each additional element, growing at a rate
proportional to 2^n.
• Example: computing the n-th Fibonacci number with naive recursion, as sketched below.
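A minimal sketch of that example, assuming a plain recursive implementation with no
memoization (the value of n used in main is just illustrative):
#include <iostream>
using namespace std;

// Naive recursive Fibonacci: each call spawns two more calls,
// so the number of operations grows roughly like 2^n
long long fib(int n) {
    if (n <= 1) return n;           // base cases: fib(0) = 0, fib(1) = 1
    return fib(n - 1) + fib(n - 2); // two recursive calls per step
}

int main() {
    int n = 10;
    cout << "fib(" << n << ") = " << fib(n) << endl;
    return 0;
}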
Mathematical Analysis of Recursive Algorithms
• Mathematical analysis of recursive algorithms involves determining their time
complexity using mathematical notation and techniques.
• The most common mathematical notation for expressing the resulting time complexity is
Big O notation (O).
Example: The algorithm calculates the sum of the elements in an array using recursion.
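The code itself is not shown in the slide text, so the following is a minimal sketch of such
a recursive array sum (function and variable names are illustrative). Its recurrence is
T(n) = T(n-1) + c, which solves to O(n).
#include <iostream>
using namespace std;

// Recursively sum the first n elements: T(n) = T(n-1) + c, which solves to O(n)
int recursiveSum(int arr[], int n) {
    if (n == 0) return 0;                         // base case: an empty prefix sums to 0
    return recursiveSum(arr, n - 1) + arr[n - 1]; // one recursive call on a smaller input
}

int main() {
    int values[] = {4, 9, 1, 7};
    cout << "Sum: " << recursiveSum(values, 4) << endl; // prints 21
    return 0;
}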