Python DSA Notes-6
All of these are important, but we are mostly concerned with CPU time. Be careful to
differentiate between:
1. Performance: How much time/memory/disk/etc. is used when a program is run.
This depends on the machine, compiler, etc. as well as the code we write.
2. Complexity: How the resource requirements of a program or algorithm scale,
i.e. what happens as the size of the problem being solved by the code gets larger.
Note: Complexity affects performance but not vice-versa.
Algorithm Analysis
Algorithm analysis is an important part of computational complexity theory, which
provides a theoretical estimate of the resources an algorithm requires to solve
a specific computational problem. Analysis of algorithms is the determination of the
amount of time and space resources required to execute them.
Types of Analysis
To analyze a given algorithm, we need to know with which inputs the algorithm
takes less time (i.e. performs well) and with which inputs it takes a long time.
Three types of analysis are generally performed:
• Worst-case analysis: The worst case consists of the input for which the
algorithm takes the longest time to complete its execution.
• Best-case analysis: The best case consists of the input for which the algorithm
takes the least time to complete its execution.
• Average-case analysis: The average case gives an idea of the running time of the
algorithm averaged over all possible inputs.
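As a small illustrative sketch (the list and the helper function below are made up for this note), consider scanning a list for a value: if the value sits at the front, only one comparison is needed (best case); if it sits at the end or is absent, every element is examined (worst case); averaged over all positions, roughly half the list is examined (average case).

def count_comparisons(arr, target):
    # count how many elements are examined before the scan stops
    comparisons = 0
    for x in arr:
        comparisons += 1
        if x == target:
            break
    return comparisons

data = [5, 8, 2, 9, 1]
print(count_comparisons(data, 5))   # best case: 1 comparison
print(count_comparisons(data, 1))   # worst case: 5 comparisons (target at the end)
print(count_comparisons(data, 7))   # worst case: 5 comparisons (target absent)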
Big-O notation
We can express algorithmic complexity using big-O notation, which describes how the
running time grows as the problem size N grows.
Definition: Let g and f be functions from the set of natural numbers to itself. The
function f is said to be O(g) (read "big-oh of g") if there are a constant c and a natural
number n0 such that f(n) ≤ c·g(n) for all n > n0.
Note: O(g) is a set!
Abuse of notation: writing f = O(g) really means f ∈ O(g).
Example: Although we could include constants within big-O notation, there is no reason to
do so. Thus, we can write O(5n + 4) = O(n); in the definition above, take c = 6 and
n0 = 4, since 5n + 4 ≤ 6n for all n ≥ 4.
Note: The big-O expressions do not have constants or low-order terms. This is
because, when N gets large enough, constants and low-order terms don't matter (a
constant-time function/method will be faster than a linear-time function/method,
which will be faster than a quadratic-time function/method).
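As a quick sanity check of this idea (a small sketch; the cost expression 5n + 4 is reused from the example above), the ratio (5n + 4)/n settles near the constant 5 as n grows, while n*n pulls away from both:

for n in [10, 100, 1000, 10000]:
    cost = 5 * n + 4                 # a "linear with constants" cost
    print(n, cost, cost / n, n * n)  # cost/n approaches 5; n*n grows much faster than n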
The following rules help estimate the running time of common code constructs.
1. Sequence of statements
statement 1
statement 2
...
statement k
The total time is found by adding the times for all statements:
total time = time(statement 1) + time(statement 2) + ... + time(statement k)
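A minimal sketch (the variable names are made up): three constant-time statements executed one after another still take constant time overall.

a = 10           # O(1)
b = a * 2        # O(1)
c = a + b        # O(1)
print(c)         # total: O(1) + O(1) + O(1) = O(1)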
2. if-else statements
if condition:
    # sequence of statements 1
else:
    # sequence of statements 2
Here, either sequence 1 or sequence 2 will execute. Therefore, the
worst-case time is the slower of the two possibilities:
max(time(sequence 1), time(sequence 2))
For example, if sequence 1 is O(N) and sequence 2 is O(1), the worst-case time for
the whole if-else statement would be O(N).
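A minimal sketch, assuming a list data of N elements and a boolean flag (both made up for illustration): one branch visits every element (O(N)) while the other does constant work (O(1)), so the worst case for the whole statement is O(N).

data = [3, 1, 4, 1, 5]   # stands in for a list of N elements
flag = True

if flag:                  # sequence 1 is O(N): it visits every element
    for x in data:
        print(x)
else:                     # sequence 2 is O(1)
    print("skipped")
# worst case for the if-else: max(O(N), O(1)) = O(N)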
3. for loops
for i in range(N):
    # sequence of statements
Here, the loop executes N times, so the sequence of statements also executes N
times. Assuming each statement is O(1), the total time for the for loop is
N * O(1), which is O(N) overall.
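A minimal sketch, assuming a list data of N elements: the loop body does O(1) work on each of the N iterations, so the whole loop is O(N).

data = [3, 1, 4, 1, 5]        # stands in for a list of N elements
total = 0
for i in range(len(data)):    # the body runs N times
    total += data[i]          # O(1) work per iteration
print(total)                  # overall: N * O(1) = O(N)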
4. Nested loops
for i in range(N):
    for j in range(M):
        # statements
The outer loop executes N times. Every time the outer loop executes, the inner loop
executes M times. As a result, the statements in the inner loop execute a total of N
* M times. Assuming the complexity of the statement inside the inner loop to be
O(1), the overall complexity will be O(N * M).
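A minimal sketch, assuming two lists of sizes N and M (made up for illustration): the O(1) statement in the inner loop runs N * M times in total.

rows = [1, 2, 3]                  # N elements
cols = [10, 20]                   # M elements
count = 0
for i in range(len(rows)):        # outer loop: N iterations
    for j in range(len(cols)):    # inner loop: M iterations per outer iteration
        count += 1                # O(1) work, executed N * M times in total
print(count)                      # prints N * M (here 6); overall O(N * M)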
Sample Problem:
What will be the time complexity of the following while loop in terms of N?
while N > 0:
    N = N // 8
Iterations completed (i)    Value of N after i iterations
0                           N
1                           N//8
2                           N//64
...                         ...
k                           N//8^k
We know that the value of N drops to (about) 1 after k iterations, and one more
division brings it to 0, ending the loop. Thus, we can write:
N//8^k = 1
=> N = 8^k
=> log(N) = log(8^k)
=> k * log(8) = log(N)
=> k = log(N) / log(8)
=> k = log8(N)
Clearly, the number of iterations in this example comes out to be of the
order of log8(N). Thus, the time complexity of the above while loop is
O(log8(N)), which is the same as O(log N), since logarithms with different bases
differ only by a constant factor.
Qualitatively, after every iteration we divide the given number by 8,
and we keep dividing as long as the number remains greater than 0. This gives
the number of iterations as O(log8(N)).
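As a quick empirical check (a sketch; count_divisions is a made-up helper, and math.log is used only to compute the predicted value), the number of loop iterations tracks log8(N). It is in fact floor(log8(N)) + 1, which is O(log8(N)):

import math

def count_divisions(N):
    # count how many times the loop body runs for a given N
    iterations = 0
    while N > 0:
        N = N // 8
        iterations += 1
    return iterations

for N in [7, 8, 64, 512, 10**6]:
    print(N, count_divisions(N), round(math.log(N, 8), 2))   # actual count vs log8(N)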
Linear Search
● The element being searched for may be present at the last position or may not be
present in the array at all.
● In the former case, the search terminates in success with N comparisons.
● In the latter case, the search terminates in failure with N comparisons.
● Thus in the worst case, the linear search algorithm takes O(N) operations.
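A minimal sketch of linear search matching the analysis above (the function name is our own):

def linear_search(arr, target):
    # return the index of target in arr, or -1 if it is not present
    for i in range(len(arr)):    # up to N comparisons
        if arr[i] == target:
            return i             # found (best case: first position, 1 comparison)
    return -1                    # not found (worst case: N comparisons)

print(linear_search([4, 2, 7, 1], 7))   # 2
print(linear_search([4, 2, 7, 1], 9))   # -1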
Binary Search
The time complexity analysis of binary search is as follows:
● In each iteration or each recursive call, the search space gets reduced to half of the
array.
● So for N elements in the array, there are log2(N) iterations or recursive calls.
Thus, we have: time complexity of binary search = O(log2(N)) = O(log N).
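A minimal iterative sketch of binary search on a sorted list (the function name is our own); each step halves the remaining search range, giving the O(log2(N)) behaviour described above:

def binary_search(arr, target):
    # arr must be sorted; return the index of target, or -1 if absent
    low, high = 0, len(arr) - 1
    while low <= high:
        mid = (low + high) // 2      # middle of the current range
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            low = mid + 1            # discard the left half
        else:
            high = mid - 1           # discard the right half
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))   # 3
print(binary_search([1, 3, 5, 7, 9, 11], 4))   # -1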