Running Time Calculation
3) Generally, the total number of steps is roughly proportional to the number of basic
operations.
4) Thus, we are concerned mainly with the basic operations: how many times they have to be
performed, depending on the size of the input.
The work done by an algorithm, i.e. its complexity, is determined by the number of basic
(primitive) operations necessary to solve the problem.
Note: The complexity of a program that implements an algorithm is roughly, but not exactly,
the same as the complexity of the algorithm.
Here we are talking about algorithms independent of any particular implementation -
programming language or computer.
2. Size of input
Some algorithms do not depend on the size of the input - the number of operations they
perform is fixed. Other algorithms depend on the size of the input, and these are the algorithms that
might cause problems. Before implementing such an algorithm, we have to be sure that the algorithm
will finish the job in reasonable time.
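As an illustration (a sketch with assumed names), the first function below performs a fixed
number of operations no matter how large the array is, while the second performs a number of
operations proportional to its size:
/* Fixed amount of work - does not depend on the size of the input */
int first_element(const int a[])
{
    return a[0];
}
/* The amount of work grows with the size of the input */
long sum_all(const int a[], int n)
{
    long sum = 0;
    for (int i = 0; i < n; i++)    /* executed n times */
        sum = sum + a[i];
    return sum;
}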
What is the size of input? We need to choose some reasonable measure of size; for searching and
sorting problems, for example, a natural measure is the number of elements to be processed.
Rule 1: for loops
The running time of a for loop is at most the running time of the statements inside the loop
times the number of iterations.
Example:
sum = 0;
for( i = 0; i < n; i++)
    sum = sum + i;
The loop heading (test and increment, done n times) plus the loop body give: O(n) + O(n) = O(n).
The running time of the loop is O(n).
Rule 2: Nested loops
The running time of a nested loop is the running time of the statements inside the loops times
the product of the sizes of all the loops.
Example:
sum = 0;
for( i = 0; i < n; i++)
    for( j = 0; j < n; j++)
        sum++;
Applying Rule 1 to the nested loop (the 'j' loop) we get O(n) for the body of the outer loop.
The outer loop (the 'i' loop) runs n times, so the running time is O(n * n) = O(n^2).
The rule still gives a correct bound when the number of times the inner loop is executed
depends on the value of i, as in the sketch below.
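A minimal sketch of such a loop (the particular inner bound j < i is an assumed example):
sum = 0;
for( i = 0; i < n; i++)
    for( j = 0; j < i; j++)    /* the inner bound depends on i */
        sum++;
/* the body runs 0 + 1 + ... + (n-1) = n(n-1)/2 times, which is still O(n^2) */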
Example:
sum = 0;
for( i = 0; i < n; i++)
for( j = 0; j < 2*n; j++)
sum++;
We have one operation inside the loops, and the product of the sizes is 2n^2.
Hence the running time is O(2n^2) = O(n^2).
Rule 3: Consecutive program fragments
The total running time is the maximum of the running times of the individual fragments.
sum = 0;
for( i = 0; i < n; i++)
sum = sum + i;
sum = 0;
for( i = 0; i < n; i++)
    for( j = 0; j < n; j++)
        sum++;
The first loop runs in O(n) time, the second in O(n^2) time; the maximum is O(n^2).
Rule 4: If statement
if Condition
S1;
else
S2;
The running time is the maximum of the running times of S1 and S2.
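For example (a sketch with assumed variables reset, a and n), if one branch is a single
assignment and the other is a loop over n elements, the running time of the whole statement
is taken to be O(n):
if (reset)                     /* condition: O(1) */
    sum = 0;                   /* S1: O(1) */
else
    for (i = 0; i < n; i++)    /* S2: O(n) */
        sum = sum + a[i];
/* max( O(1), O(n) ) = O(n) */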
Examples
0. Search in an unordered array of elements.
The basic operation in this problem is comparison, so we are interested in how the number of
comparisons depends on n.
In the worst case (element not there, or located at the end), we have n comparisons to make.
Hence the number of operations is O(n).
To find the average case, we have to consider all possible cases: the element may be found
after 1 comparison, after 2 comparisons, …, or after n comparisons.
We compute the sum of the comparisons for the possible cases and then divide by the number
of the cases, assuming that all cases are equally likely to happen:
(1 + 2 + … + n) / n = (n(n+1)/2) / n = (n+1)/2
Thus in the average case the number of comparisons is (n+1)/2 = O(n).
Generally, the algorithm for finding an element in an unsorted array needs O(n) operations.
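A minimal sketch of such a search (the function name find is illustrative):
/* Return the index of key in a[0..n-1], or -1 if it is not present. */
int find(const int a[], int n, int key)
{
    for (int i = 0; i < n; i++)
        if (a[i] == key)       /* the basic operation: one comparison */
            return i;          /* found after i + 1 comparisons */
    return -1;                 /* not found: n comparisons were made */
}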
1. Search in a table n x m
Here the inner loop runs at most m times and it is nested inside another loop that runs at
most n times, so in total there are at most nm operations. Strictly speaking, nm is less
than n^2 when m < n. However, for very large values of n and m the difference is negligible -
the amount of work done is roughly the same.
Thus we can say that the complexity here is O(n^2).
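A possible sketch of the table search (assuming the table is passed as an n x m array; the
function name is illustrative):
/* Return 1 if key occurs in the n x m table t, 0 otherwise. */
int find_in_table(int n, int m, int t[n][m], int key)
{
    for (int i = 0; i < n; i++)        /* outer loop: at most n iterations */
        for (int j = 0; j < m; j++)    /* inner loop: at most m iterations */
            if (t[i][j] == key)        /* basic operation: one comparison */
                return 1;
    return 0;                          /* at most n*m comparisons in total */
}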
2. Finding the maximal element in an array
amax = a[0];
for (i = 1; i < n; i++)
if (a[i] > amax)
amax = a[i];
Here the number of operations (comparisons) is always n - 1. The amount of work depends on
the size of the input, but does not depend on the particular values.
The running time is O(n); we disregard the "-1" because the difference is negligible for
large n.
▪ Problem 1
sum = 0;
for( i = 0; i < n; i++)
sum++;
Sol: The running time for the operation sum++ is a constant. The loop runs n times, hence
the complexity of the loop would be O(n)
▪ Problem 2
sum = 0;
for( i = 0; i < n; i++)
▪ Problem 3
sum = 0;
for( i = 0; i < n; i++)
▪ Problem 4
sum = 0;
for( i = 0; i < n; i++)
▪ Problem 5
sum = 0;
for( i = 0; i < n; i++)
▪ Problem 6
sum = 0;
for( i = 0; i < n; i++)
Sol: Compare this problem with Problem 5. Obviously the innermost loop will run fewer
times than in Problem 5. Although a more refined analysis is possible, we are usually content
to neglect the if statement and consider its running time to be O(n^2), yielding an overall
running time of O(n^5).
▪ Problem 7
sum = 0;
for( i = 0; i < n; i++)
sum++;
val = 1;
for( j = 0; j < n*n; j++)
val = val * j;
Sol: First, we assume that the running time to compute an arithmetic expression without
function calls is negligible. Then, we have two consecutive loops with running times O(n)
and O(n^2). We take the maximum complexity, hence the overall running time would be O(n^2).
▪ Problem 8
sum = 0;
for( i = 0; i < n; i++)
sum++;
for( j = 0; j < n*n; j++)
compute_val(sum,j);
The complexity of the function compute_val(x,y) is given to be O(n log n).
Sol: The difference between this problem and Problem 7 is the presence of a function call in
the loop body, whose complexity is not a constant - it depends on n and is given
to be O(n log n).
The second loop runs n*n times, so its complexity would be O(n^2 * n log n) = O(n^3 log n).
The first loop has a smaller running time, O(n); we take the maximum and conclude that the
overall running time would be O(n^3 log n).