How To Find Time Complexity of An Algorithm
• Finding out the time complexity of your code can help you develop better programs
that run faster.
• In general, you can determine the time complexity by analyzing the program’s
statements (go line by line).
• However, you have to be mindful of how the statements are arranged: whether
they are inside a loop, make function calls, or even use recursion.
• All these factors affect the runtime of your code.
Big O Notation
• Big O notation is a framework to analyze and compare algorithms.
• It describes the amount of work the CPU has to do (time complexity) as the input
size grows (towards infinity).
• Big O = Big Order of a function. Drop constants and lower-order terms. E.g., O(3n^2 +
10n + 10) becomes O(n^2).
• Big O notation cares about the worst-case scenario. E.g., for some sorting
algorithms, an input array whose elements are in reverse order.
If you have a function that takes an array as input and performs the same operations no
matter how many elements the collection has, you have a constant runtime O(1).
If the CPU’s work grows proportionally to the size of the input array, you have a linear
runtime O(n).
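As a sketch, two small functions (both hypothetical examples, not from the text above) illustrate the difference:

```javascript
// Constant time O(1): one operation, regardless of how many elements the array has.
function getFirst(array) {
  return array[0];
}

// Linear time O(n): the work grows proportionally to the number of elements.
function sumAll(array) {
  let total = 0;
  for (const value of array) {
    total += value;
  }
  return total;
}
```

Doubling the array size doesn’t change the work `getFirst` does, but it doubles the work `sumAll` does.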
Sequential Statements
If we have statements with basic operations (comparisons, assignments, reading a
variable), we can assume each takes constant time, O(1).
statement1;
statement2;
...
statementN;
Let’s use T(n) as the total time as a function of the input size n, and t as the time
taken by a statement or group of statements.
T(n) = t(statement1) + t(statement2) + ... + t(statementN);
If each statement executes a basic operation, we can say it takes constant time O(1). As
long as you have a fixed number of operations, it will be constant time, even if we have 1
or 100 of these statements.
Example:
function squareSum(a, b, c) {
  const sa = a * a;
  const sb = b * b;
  const sc = c * c;
  return sa + sb + sc;
}
As you can see, each statement is a basic operation (math and assignment). Each line
takes constant time O(1).
If we add up all statements’ time it will still be O(1). It doesn’t matter if the numbers
are 0 or 9,007,199,254,740,991, it will perform the same number of operations.
Conditional Statements
Remember that we care about the worst-case with Big O so that we will take the
maximum possible runtime.
if (isValid) {
  statement1;
  statement2;
} else {
  statement3;
}
Example:
if (isValid) {
  array.sort();
  return true;
} else {
  return false;
}
The if block has a runtime of O(n log n) (a common runtime for efficient sorting
algorithms). The else block has a runtime of O(1).
Since n log n has a higher order than n, we can express the time complexity as O(n log n).
Loop Statements
Another prevalent scenario is loops, like for-loops and while-loops.
Linear Time Loops
For any loop, we find out the runtime of the block inside them and multiply it by the
number of times the program will repeat the loop.
for (let i = 0; i < array.length; i++) {
  statement1;
  statement2;
}
For this example, the loop executes array.length times. Assuming n is the length of the
array, we get the following: T(n) = n * [ t(statement1) + t(statement2) ] = n * O(1) = O(n).
All loops that grow proportionally to the input size have a linear time complexity O(n).
If you loop through only half of the array, that’s still O(n). Remember that we drop the
constants so 1/2 n => O(n).
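For example, a function like the following (a hypothetical sketch) iterates over only the first half of the array, yet is still O(n):

```javascript
// Iterates over half the array: about n/2 steps, which is still O(n)
// after dropping the 1/2 constant factor.
function sumFirstHalf(array) {
  let total = 0;
  for (let i = 0; i < array.length / 2; i++) {
    total += array[i];
  }
  return total;
}
```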
Constant-Time Loops
However, if a constant number bounds the loop, say 4 (or even 400), then the
runtime is constant: O(4) -> O(1).
See the following example.
for (let i = 0; i < 4; i++) {
  statement1;
  statement2;
}
That code is O(1) because it no longer depends on the input size. It will always run
statements 1 and 2 four times.
Consider the following code, where we divide an array in half on each iteration (binary
search):
function fn1(array, target, low = 0, high = array.length - 1) {
  let mid;
  while (low <= high) {
    mid = Math.floor((low + high) / 2);
    if (target < array[mid]) high = mid - 1;
    else if (target > array[mid]) low = mid + 1;
    else break;
  }
  return mid;
}
This function divides the array by its middle point on each iteration. The while loop will
execute the number of times that we can divide array.length in half. We can calculate this
using the log function.
E.g., if the array’s length is 8, the while loop will execute about 3 times because log2(8)
= 3. In general, loops that halve the input on each iteration have a logarithmic runtime, O(log n).
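To make the logarithmic bound concrete, here is a small helper (an illustration added here, not code from the text) that counts how many times the binary-search loop body runs:

```javascript
// Binary search over a sorted array, instrumented with a step counter.
// The counter grows at most logarithmically (plus a small constant) in array.length.
function countBinarySearchSteps(array, target) {
  let low = 0;
  let high = array.length - 1;
  let steps = 0;
  while (low <= high) {
    steps++;
    const mid = Math.floor((low + high) / 2);
    if (target < array[mid]) high = mid - 1;
    else if (target > array[mid]) low = mid + 1;
    else break; // found the target
  }
  return steps;
}
```

Searching an 8-element array never takes more than about log2(8) + 1 iterations, no matter which target you pick.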
Nested loops are another common pattern, e.g., an outer loop that runs n times with an
inner loop that runs m times:

for (let i = 0; i < n; i++) {
  statement1;
  for (let j = 0; j < m; j++) {
    statement2;
    statement3;
  }
}
Assuming the statements from 1 to 3 are O(1), we would have a runtime of O(n * m).
If instead of m, you had to iterate on n again, then it would be O(n^2). Another typical case
is having a function inside a loop.
Let’s see how to deal with that next.
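Here is one plausible sketch of that case: three loops over n nested inside each other, each level invoking a function (fn1, fn2, and fn3 are hypothetical stubs, instrumented with counters so we can see how often each one runs):

```javascript
// Hypothetical stubs for fn1, fn2, fn3; each counter records how often it was called.
const calls = { fn1: 0, fn2: 0, fn3: 0 };
const fn1 = () => calls.fn1++;
const fn2 = () => calls.fn2++;
const fn3 = () => calls.fn3++;

function nestedCalls(n) {
  for (let i = 0; i < n; i++) {
    fn1(); // executed n times
    for (let j = 0; j < n; j++) {
      fn2(); // executed n^2 times
      for (let k = 0; k < n; k++) {
        fn3(); // executed n^3 times
      }
    }
  }
}
```

Running `nestedCalls(3)` calls fn1 3 times, fn2 9 times, and fn3 27 times; the innermost call dominates the total work.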
Suppose three nested loops over n call fn1 at the first level, fn2 at the second, and fn3 at
the innermost. Depending on the runtime of fn1, fn2, and fn3, you would have different runtimes.
• If they all are constant O(1), then the final runtime would be O(n^3).
• However, if only fn1 and fn2 are constant and fn3 has a runtime of O(n^2), this
program will have a runtime of O(n^5).
• Another way to look at it: if fn3 contains two nested loops and you replace the
invocation with its actual implementation, you would have five nested loops.
Recursive functions are trickier to analyze. Consider this Fibonacci-style function:

function fn(n) {
  if (n < 0) return 0;
  if (n < 2) return n;
  return fn(n - 1) + fn(n - 2);
}
• When your n = 2, you have 3 function calls. First fn(2) which in turn
calls fn(1) and fn(0).
• For n = 3, you have 5 function calls. First fn(3), which in turn
calls fn(2) and fn(1) and so on.
• For n = 4, you have 9 function calls. First fn(4), which in turn
calls fn(3) and fn(2) and so on.
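A quick way to verify these counts is to instrument the recursive function with a call counter (countCalls is a helper written for this illustration, not part of the original code):

```javascript
// Counts the total number of calls the Fibonacci-style recursion makes for a given n.
function countCalls(n) {
  let calls = 0;
  function fn(m) {
    calls++; // every invocation, including recursive ones, bumps the counter
    if (m < 0) return 0;
    if (m < 2) return m;
    return fn(m - 1) + fn(m - 2);
  }
  fn(n);
  return calls;
}
```

`countCalls(2)` returns 3, `countCalls(3)` returns 5, and `countCalls(4)` returns 9, matching the counts above.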
Since the calls form a binary tree, every time n increases by one, we have to perform at
most double the number of operations.
If you take a look at the generated call tree, the leftmost nodes go down in descending
order: fn(4), fn(3), fn(2), fn(1), which means that the height of the tree (or the number of
levels) will be n.
The total number of calls, in a complete binary tree, is 2^n - 1.
As you can see in fn(4), the tree is not complete. The last level will only have two
nodes, fn(1) and fn(0), while a complete tree would have 8 nodes.
But still, we can say the runtime would be exponential O(2^n). It won’t get any worse
because 2^n is the upper bound.