Different Complexities With Suitable Examples
Big O Notation    Name           Example(s)
O(2^n)            Exponential    Find all subsets
In algorithm analysis, complexities describe the relationship between the size of the
input and the resources (such as time or space) required by an algorithm. Here are
some common complexities and examples:
It's important to note that complexities like O(1), O(log n), O(n log n), and O(n) are
generally considered efficient, while higher complexities like O(n^2), O(2^n), and
O(n!) are less efficient and can become impractical for large inputs. The choice of
algorithm and its complexity class depends on the specific requirements and
constraints of the problem at hand.
Odd or Even
Find if a number is odd or even.
function isEvenOrOdd(n) {
  return n % 2 ? 'Odd' : 'Even';
}

console.log(isEvenOrOdd(10)); // => Even
console.log(isEvenOrOdd(10001)); // => Odd
Do not be fooled by one-liners. They don't always translate to constant time. You have to be aware of how they are implemented.
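For instance (an illustrative example, not from the original text), a call to `Array.prototype.includes` reads like a one-liner, but it scans the array element by element, so it runs in O(n), not O(1):

```javascript
// Looks like a one-liner, but includes() checks every element
// in the worst case, so this is O(n), not O(1).
function hasValue(array, value) {
  return array.includes(value);
}

console.log(hasValue([1, 2, 3], 3)); // => true
console.log(hasValue([1, 2, 3], 5)); // => false
```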
Look-up table
Given a string, find its word frequency data.
const dictionary = {the: 22038615, be: 12545825, and: 10741073, of: 10343885, a: 10144200, in: 6996437, to: 6332195 /* ... */};

function getWordFrequency(dictionary, word) {
  return dictionary[word];
}

console.log(getWordFrequency(dictionary, 'the'));
console.log(getWordFrequency(dictionary, 'in'));
Only a hash table with a perfect hash function will have a worst-case runtime of O(1). The ideal hash function is not practical, so there will be some collisions and workarounds that lead to a worst-case runtime of O(n). Still, on average, the lookup time is O(1).
The largest item
Find the maximum value in an unsorted array:

function findMax(n) {
  let max;
  let counter = 0;

  for (let i = 0; i < n.length; i++) {
    counter++;
    if (max === undefined || max < n[i]) {
      max = n[i];
    }
  }

  console.log(`n: ${n.length}, counter: ${counter}`);
  return max;
}
findMax([3, 1, 2]);
// n: 3, counter: 3

or if n has 9 elements:

findMax([4, 5, 6, 1, 9, 2, 8, 3, 7]);
// n: 9, counter: 9
Has duplicates
You want to find duplicate words in an array. A naïve solution would be the following:
function hasDuplicates(n) {
  const duplicates = [];
  let counter = 0; // debug

  for (let outer = 0; outer < n.length; outer++) {
    for (let inner = 0; inner < n.length; inner++) {
      counter++; // debug

      if (outer === inner) continue;

      if (n[outer] === n[inner]) {
        return true;
      }
    }
  }

  console.log(`n: ${n.length}, counter: ${counter}`); // debug
  return false;
}
Counting the operations, we get 3n^2 + 2. In asymptotic analysis, we drop the constants and lower-order terms and keep only the most significant one: n^2. So, in big O notation, this is O(n^2).
hasDuplicates([1, 2, 3, 4]);
// n: 4, counter: 16

hasDuplicates([1, 2, 3, 4, 5, 6, 7, 8, 9]);
// n: 9, counter: 81
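As a side note (not part of the original text), you can avoid the quadratic runtime here by trading some memory for speed: track the values already seen in a `Set`, which has O(1) average lookups, giving an overall O(n) runtime. A minimal sketch:

```javascript
// O(n) runtime, O(n) extra space: each element is visited once,
// and Set lookups/insertions are O(1) on average.
function hasDuplicatesFast(n) {
  const seen = new Set();
  for (const value of n) {
    if (seen.has(value)) return true;
    seen.add(value);
  }
  return false;
}

console.log(hasDuplicatesFast([1, 2, 3, 4])); // => false
console.log(hasDuplicatesFast([1, 2, 2, 4])); // => true
```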
Bubble sort
We want to sort the elements in an array. One way to do this
is using bubble sort as follows:
function sort(n) {
  for (let outer = 0; outer < n.length; outer++) {
    let outerElement = n[outer];

    for (let inner = outer + 1; inner < n.length; inner++) {
      let innerElement = n[inner];

      if (outerElement > innerElement) {
        // swap
        n[outer] = innerElement;
        n[inner] = outerElement;
        // update references
        outerElement = n[outer];
        innerElement = n[inner];
      }
    }
  }
  return n;
}
Also, you might notice that for a very big n, the time it takes to solve the problem grows quickly. Can you spot the relationship between nested loops and the running time? When a function has a single loop, it usually translates into a running time complexity of O(n). This function has two nested loops, giving quadratic running time: O(n^2).
3x + 9y + 8z = 79

This naïve program will give you all the solutions that satisfy the equation where x, y, and z < n. With three nested loops that each run n times, it takes cubic time: O(n^3).
function findXYZ(n) {
  const solutions = [];

  for (let x = 0; x < n; x++) {
    for (let y = 0; y < n; y++) {
      for (let z = 0; z < n; z++) {
        if (3*x + 9*y + 8*z === 79) {
          solutions.push({x, y, z});
        }
      }
    }
  }

  return solutions;
}

console.log(findXYZ(10)); // => [{x: 0, y: 7, z: 2}, ...]
Say you have to find a word in a physical dictionary (a book). You could do it in two ways:

Algorithm A:
1. Start on the first page and go word by word until you find the word you are looking for.

Algorithm B:
1. Open the book in the middle and check the first word on it.
2. If the word that you are looking for comes later alphabetically, then look in the right half. Otherwise, look in the left half.
3. Divide the remainder in half again, and repeat step #2 until you find the word you are looking for.
Which one is faster? Algorithm A goes word by word, O(n), while algorithm B splits the problem in half on each iteration, O(log n). This second algorithm is a binary search.
Binary search
Find the index of an element in a sorted array.
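The `indexOf` implementation referenced below is not reproduced in this copy, so here is a minimal recursive sketch of what such a function could look like, halving the search range on each call (the `offset` parameter is a helper introduced here to track the index shift caused by slicing):

```javascript
// Binary search: O(log n). Assumes `array` is sorted in ascending order.
function indexOf(array, element, offset = 0) {
  // base case: element not found
  if (array.length === 0) return -1;

  const half = parseInt(array.length / 2, 10);
  const current = array[half];

  if (current === element) {
    return offset + half; // found it!
  } else if (element > current) {
    // search the right half
    return indexOf(array.slice(half + 1), element, offset + half + 1);
  } else {
    // search the left half
    return indexOf(array.slice(0, half), element, offset);
  }
}

const directory = ['Adrian', 'Bella', 'Charlotte', 'Daniel', 'Emma'];
console.log(indexOf(directory, 'Daniel')); // => 3
```

Note that `slice` copies part of the array on each call; an index-based variant would avoid that overhead, but the halving behavior that matters for the analysis is the same.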
Binary search fits the Master Method recurrence T(n) = a·T(n/b) + f(n), with a = 1 (one recursive call per step), b = 2 (the problem is halved each time), and f(n) = O(1), where:

n^(log_b a) = n^(log_2 1) = n^0 = 1

The Master Method gives the running time by comparing f(n) against n^(log_b a):

1. If f(n) grows more slowly than n^(log_b a), the running time is O(n^(log_b a)).
2. If f(n) grows at the same rate as n^(log_b a), the running time is O(n^(log_b a) log(n)).
3. If f(n) grows faster than n^(log_b a), the running time is O(f(n)).

Now, let's combine everything we learned here to get the running time of our binary search function indexOf. Since f(n) = O(1) grows at the same rate as n^(log_b a) = n^0 = 1, case 2 applies. Thus:

O(n^(log_b a) log(n)) = O(n^0 log(n)) = O(log(n))
Mergesort
What's the best way to sort an array? Before, we proposed a solution using bubble sort that has a time complexity of O(n^2). Can we do better?
/**
 * Sort array in asc order using merge-sort
 * @example
 *    sort([3, 2, 1]) => [1, 2, 3]
 *    sort([3]) => [3]
 *    sort([3, 2]) => [2, 3]
 * @param {array} array
 */
function sort(array = []) {
  const size = array.length;
  // base case
  if (size < 2) {
    return array;
  }
  if (size === 2) {
    return array[0] > array[1] ? [array[1], array[0]] : array;
  }
  // split and merge
  const mid = parseInt(size / 2, 10);
  return merge(sort(array.slice(0, mid)), sort(array.slice(mid)));
}

/**
 * Merge two arrays in asc order
 * @example
 *    merge([2,5,9], [1,6,7]) => [1, 2, 5, 6, 7, 9]
 * @param {array} array1
 * @param {array} array2
 * @returns {array} merged arrays in asc order
 */
function merge(array1 = [], array2 = []) {
  const merged = [];
  let array1Index = 0;
  let array2Index = 0;
  // merge elements of array1 and array2 in asc order. Runtime O(a + b)
  while (array1Index < array1.length || array2Index < array2.length) {
    if (array1Index >= array1.length || array1[array1Index] > array2[array2Index]) {
      merged.push(array2[array2Index]);
      array2Index += 1;
    } else {
      merged.push(array1[array1Index]);
      array1Index += 1;
    }
  }
  return merged;
}
As you can see, it has two functions, sort and merge. merge is an auxiliary function that runs once through both collections, so its running time is O(n). Let's apply the Master Method to find the running time.
For merge sort, a = 2 (two recursive calls), b = 2 (the array is split in half), and f(n) = O(n), so:

n^(log_b a) = n^(log_2 2) = n^1 = n

Since f(n) = O(n) grows at the same rate as n^(log_b a), case 2 applies:

O(n^(log_2 2) log(n)) = O(n^1 log(n)) = O(n log(n)) 👈 this is the running time of merge sort
Power Set
To understand the power set, let’s imaging you are buying
pizza. The store has many toppings that you can choose
from like pepperoni, mushrooms, bacon, and pinapple. Let’s
call each topping A, B, C, D. What are your choices? You can
select no topping (you are on a diet ;), you can choose one
topping, or two or three or all of them, and so on. The power
set gives you all the possibilities (BTW, there 16 with 4
toppings as you will see later)
As you noticed, every time the input gets longer, the output
is twice as long as the previous one. Let’s code it up:
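The implementation itself did not survive in this copy; one minimal way to sketch it is to start with the empty set and, for each character, duplicate every subset found so far while appending the new character (which is exactly why the output doubles each time):

```javascript
// O(2^n): each new character doubles the number of subsets.
function powerset(input = '') {
  let result = ['']; // start with the empty set
  for (const char of input) {
    // copy every existing subset and append the new character to the copy
    result = result.concat(result.map((subset) => subset + char));
  }
  return result;
}

console.log(powerset('ab')); // => ['', 'a', 'b', 'ab']
```

This produces subsets in the same order as the calls shown below.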
powerset('') // ...
// n = 0, f(n) = 1;
powerset('a') // , a...
// n = 1, f(n) = 2;
powerset('ab') // , a, b, ab...
// n = 2, f(n) = 4;
powerset('abc') // , a, b, ab, c, ac, bc, abc...
// n = 3, f(n) = 8;
powerset('abcd') // , a, b, ab, c, ac, bc, abc, d, ad, bd, abd, cd, acd, bcd...
// n = 4, f(n) = 16;
powerset('abcde') // , a, b, ab, c, ac, bc, abc, d, ad, bd, abd, cd, acd, bcd...
// n = 5, f(n) = 32;
As expected, if you plot n and f(n), you will notice that it
would be exactly like the function 2^n. This algorithm has a
running time of O(2^n).
Factorial
The factorial of n is the product of all positive integers less than or equal to n, and it grows astronomically fast:

5! = 5 x 4 x 3 x 2 x 1 = 120
20! = 2,432,902,008,176,640,000

Examples of O(n!) algorithms:

Permutations of a string.
Solving the traveling salesman problem with a brute-force search.
Permutations
Write a function that computes all the different "words" (permutations) that can be formed with the letters of a given string.
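The original examples and implementation are missing from this copy; a recursive sketch that fixes each character in turn and permutes the remainder looks like this:

```javascript
// O(n!): n choices for the first letter, n-1 for the second, and so on.
function getPermutations(string, prefix = '') {
  if (string.length <= 1) {
    return [prefix + string];
  }
  return Array.from(string).reduce((result, char, index) => {
    // remove the chosen character and permute what's left
    const remainder = string.slice(0, index) + string.slice(index + 1);
    return result.concat(getPermutations(remainder, prefix + char));
  }, []);
}

console.log(getPermutations('art')); // => ['art', 'atr', 'rat', 'rta', 'tar', 'tra']
```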
Can you try it with a string of 11 characters? ;) Comment below what happened to your computer!
1) O(1): The time complexity of a loop is considered O(1) if it runs a constant number of times, regardless of the input size. For example, the following loop has O(1) time complexity.

// Here c is a constant
for (int i = 1; i <= c; i++) {
  // some O(1) expressions
}
2) O(n): The time complexity of a loop is considered O(n) if the loop variable is incremented or decremented by a constant amount. For example, the following loop has O(n) time complexity.
// Here c is a positive integer constant
for (int i = 1; i <= n; i += c) {
// some O(1) expressions
}