Chapter 2
Algorithm analysis refers to the process of
determining how much computing time and
storage an algorithm will require.
For a given problem, there are often many
possible algorithms.
The main resources are:
• Running Time
• Memory Usage
• Communication Bandwidth
Note:
Running time is the most important since
computational time is the most precious
resource in most problem domains.
There are two approaches to measure the
efficiency of algorithms:
1. Empirical
Based on the total running time of the
program.
Uses the actual system clock time.
Example:
t1 (read the clock)
for(int i=0; i<=10; i++)
    cout<<i;
t2 (read the clock again)
Running time taken by the above code
(TotalTime) = t2 - t1;
It is difficult to determine the efficiency of
algorithms using this approach, because clock
time can vary based on many factors.
For example:
a) Processor speed of the computer
   1.78 GHz: 10 s    2.12 GHz: < 10 s
b) Current processor load
   Only the program running: 10 s
   With printing: 15 s
   With printing & browsing the internet: > 15 s
c) Specific data for a particular run of the
program
   Input size
   Input properties
Example:
t1 (read the clock)
for(int i=0; i<=n; i++)
    cout<<i;
t2 (read the clock again)
T = t2 - t1;
For n=100, T >= 0.5 s; for n=1000, T > 0.5 s
d) Operating System
   Multitasking vs. single tasking
   Internal structure
2. Theoretical
Based on mathematical analysis of the
algorithm, independent of any particular machine.
We use the theoretical approach to determine
the efficiency of an algorithm because the
empirical approach depends on the processor
speed, the processor load, the specific input
data, and the operating system, so its results
are hard to reproduce and compare.
Complexity Analysis is the systematic study
of the cost of computation, measured either in:
• Time units
• Operations performed, or
• The amount of storage space required.
Two important ways to characterize the
effectiveness of an algorithm are its Space
Complexity and Time Complexity.
Complexity analysis involves two distinct phases:
• Algorithm Analysis: Analysis of the algorithm or
data structure to produce a function T(n) that
describes the algorithm in terms of the operations
performed, in order to measure its complexity.
• Order-of-Magnitude Analysis: Analysis of the
function T(n) to determine the general complexity
category to which it belongs.
Example: Suppose we have hardware capable of
executing 10^6 instructions per second. How long would
it take to execute an algorithm whose complexity
function is T(n) = 2n^2 on an input of size n = 10^8?
Solution: T(10^8) = 2(10^8)^2 = 2×10^16 instructions
Running time = T(10^8)/10^6 = 2×10^16/10^6
             = 2×10^10 seconds
There is no generally accepted set of rules
for algorithm analysis.
However, an exact count of operations is
commonly used.
To count the number of operations we can
use the following Analysis Rule.
Analysis Rules:
1. Assume an arbitrary time unit.
2. Execution of one of the following operations
takes time 1 unit:
Assignment Operation
Example: i=0;
Single Input/Output Operation
Example: cin>>a;
cout<<“hello”;
Single Boolean Operations
Example: i>=10
Single Arithmetic Operations
Example: a+b;
Function Return
Example: return sum;
3. Running time of a selection statement (if,
switch) is the time for the condition
evaluation plus the maximum of the running
times for the individual clauses in the
selection.
Example: int x;
int sum=0;
if(a>b)
{
sum= a+b;
cout<<sum;
}
else
{
cout<<b;
}
T(n) = 1 (sum=0) + 1 (a>b) + max(3, 1)
     = 5
4. Loop statements:
• The running time for the statements inside
the loop * number of iterations + time for
setup(1) + time for checking (number of
iteration + 1) + time for update (number of
iteration)
• The total running time of statements inside a
group of nested loops is the running time of
the statements * the product of the sizes of
all the loops.
• For nested loops, analyze inside out.
• Always assume that the loop executes the
maximum number of iterations possible.
(Why?)
Because we are interested in the worst case
complexity.
5. Function call:
• 1 for setup + the time for any parameter
calculations + the time required for the
execution of the function body.
Examples:
1)
int k=0,n;
cout<<"Enter an integer";
cin>>n;
for(int i=0; i<n; i++)
    k++;
T(n) = 3+1+(n+1)+n+n = 3n+5
2)
int i=0;
while(i<n)
{
cout<<i;
i++;
}
int j=1;
while(j<=10)
{
cout<<j;
j++;
}
T(n) = 1+(n+1)+n+n + 1+11+2(10)
     = 3n+34
3)
int k=0;
for(int i=1 ; i<=n; i++)
for( int j=1; j<=n; j++)
k++;
T(n) = 1+1+(n+1)+n + n(1+(n+1)+n+n)
     = 2n+3 + n(3n+2)
     = 2n+3 + 3n^2+2n
     = 3n^2+4n+3
4)
int sum=0;
for(i=1; i<=n; i++)
    sum=sum+i;
T(n) = 1+1+(n+1)+n+(1+1)n
     = 3+4n = O(n)
6)
void func( ){
    int x=0; int i=0; int j=1; int n;
    cout<<"Enter a number";
    cin>>n;
    while(i<n){
        i=i+1;
    }
    while(j<n){
        j=j+1;
    }
}
T(n) = 1+1+1+1+1 + (n+1)+2n + n+2(n-1)
     = 6+4n + 2n-2
     = 4+6n = O(n)
7). int sum(int n){
int s=0;
for(int i=1;i<=n;i++)
s=s+(i*i*i*i);
return s;
}
T(n) = 1 + (1+(n+1)+n+5n) + 1
     = 7n+4 = O(n)
8). int sum=0;
for(i=0;i<n;i++)
for(j=0;j<n;j++)
sum++;
T(n) = 1+1+(n+1)+n + n(1+(n+1)+n+n)
     = 2n+3 + n(3n+2)
     = 3n^2+4n+3 = O(n^2)
The examples above show that exact operation
counts for loop statements quickly become
tedious.
The index and bounds of the summation are
the same as the index and bounds of the for
loop.
Suppose we count the number of additions:
for(int i=1; i<=N; i++){
    sum = sum + i;
}
Number of additions = Σ (i=1 to N) 1 = N
Nested Loops: Formally
Nested for loops translate into multiple
summations, one for each loop:
for(int i=1; i<=N; i++){
    for(int j=1; j<=M; j++){
        sum = sum + i + j;
    }
}
Operations = Σ (i=1 to N) Σ (j=1 to M) 2 = Σ (i=1 to N) 2M = 2MN
Consecutive Statements: Formally
Add the running times of the separate
blocks of your code.
For an if/else, take the condition plus the
larger of the two branches:
if (test == 1) {
    for (int i=1; i<=N; i++) {
        sum = sum + i;
    }
}
else {
    for (int i=1; i<=N; i++) {
        for (int j=1; j<=N; j++) {
            sum = sum + i + j;
        }
    }
}
max( Σ (i=1 to N) 1, Σ (i=1 to N) Σ (j=1 to N) 2 ) = max(N, 2N^2) = 2N^2
Categories of Algorithm Analysis
Algorithms may be examined under different
situations to correctly determine their
efficiency for accurate comparison.
Best Case Analysis:
Assumes the input data are arranged in the
most advantageous order for the algorithm.
Takes the best possible set of inputs.
Causes execution of the smallest number of
statements.
Computes the lower bound of T(n), where
T(n) is the complexity function.
Examples:
For sorting algorithm
If the list is already sorted (data are arranged in
the required order).
For searching algorithm
If the desired item is located at first accessed
position.
Worst Case Analysis:
Assumes the input data are arranged in the most
disadvantageous order for the algorithm.
Takes the worst possible set of inputs.
Causes execution of the largest number of
statements.
Computes the upper bound of T(n) where T(n) is
the complexity function.
Examples:
For sorting algorithms
If the list is in opposite order.
For searching algorithms
If the desired item is located at the last
position or is missing.
Worst Case Analysis:
Worst case analysis is the most common analysis
because:
• It provides the upper bound for all inputs (even
bad ones).
• Average case analysis is often difficult to
determine and define.
• If inputs are in their best case, there is nothing
to guard against, since the data are already in the
most favorable arrangement.
• Best case analysis cannot be used to estimate
complexity.
We are interested in the worst case time since it
provides a bound for all inputs; this is called the
"Big-Oh" estimate.
Average Case Analysis:
Determines the average of the running time over all
permutations of the input data.
Takes an average set of inputs.
Assumes random input of a given size.
Causes execution of an average number of
statements.
Computes the optimal bound of T(n), where T(n)
is the complexity function.
Sometimes average cases are as bad as worst
cases and as good as best cases.
Examples:
For sorting algorithms
Consider any arrangement (order) of the input data.
For searching algorithms
The desired item may be located at any
position, or may be missing.
The study of algorithms includes:
◦ How to Design algorithms (Describing
algorithms)
◦ How to Analyze algorithms (In terms of
time and memory space)
◦ How to validate algorithms (for any input)
◦ How to express algorithms (Using
programming language)
◦ How to test a program (debugging and
maintaining)
Order of Magnitude
Refers to the rate at which the storage or time
requirement grows as a function of the problem size.
1. Big-Oh Notation (Upper bound)
Definition: We say f(n)=O(g(n)), if there are
positive constants c and k such that to the right of
k, the value of f(n) always lies on or below c·g(n),
i.e. f(n) ≤ c·g(n) for all n ≥ k.
2. Big-Omega (Ω)-Notation (Lower bound)
Definition: We say f(n)=Ω(g(n)), if there are
positive constants c and k such that f(n) ≥ c·g(n)
for all n ≥ k.
Example:
Find g(n) such that f(n) = Ω(g(n)) for f(n)=3n+5
g(n) = √n, with c=1, k=1:
f(n) = 3n+5 = Ω(√n)
3. Theta Notation (Θ-Notation) (Optimal bound)
Definition: We say f(n)=Θ(g(n)), if f(n) is both
O(g(n)) and Ω(g(n)); g(n) then bounds f(n) from
above and below.
4. Little-oh (small-oh) Notation
Definition: We say f(n)=o(g(n)), if for every
positive constant c there exists a constant n0 > 0
such that f(n) < c·g(n) for all n ≥ n0.
As n increases, g(n) grows strictly faster than f(n).
Denotes an upper bound that is not
asymptotically tight.
(Big-O notation denotes an upper bound that may or
may not be asymptotically tight.)
Example:
Find g(n) such that f(n) = o(g(n)) for f(n) = n^2
5. Little-Omega (ω)-Notation
Definition: We say f(n)=ω(g(n)), if for every
positive constant c there exists a constant n0 > 0
such that f(n) > c·g(n) for all n ≥ n0.
As n increases, f(n) grows strictly faster than g(n).
Rule 1:
If T1(n)=O(f(n)) and T2(n)=O(g(n)), then
a) T1(n)+T2(n) = max(O(f(n)), O(g(n))),
b) T1(n)*T2(n) = O(f(n)*g(n))
Rule 2:
If T(n) is a polynomial of degree k, then T(n)=Θ(n^k).
Rule 3:
log^k n = O(n) for any constant k. This tells us that
logarithms grow very slowly.
To compare the growth rates of f(n) and g(n), we can
examine the limit of f(n)/g(n) as n approaches infinity:
◦ The limit is 0: This means that f(n)=o(g(n)).
◦ The limit is c≠0: This means that f(n)=Θ(g(n)).
◦ The limit is infinity: This means that g(n)=o(f(n)).
◦ The limit oscillates: This means that there is no
relation between f(n) and g(n).
Example:
n^3 grows faster than n^2, so we can say that
n^2=O(n^3) or n^3=Ω(n^2).
f(n)=n^2 and g(n)=2n^2 grow at the same
rate, so both f(n)=O(g(n)) and f(n)=Ω(g(n))
are true.
If f(n)=2n^2, then f(n)=O(n^4), f(n)=O(n^3), and
f(n)=O(n^2) are all correct, but the last
one is the tightest bound and therefore the best answer.
T(n)                  Complexity category F(n)   Big-O
c (c is a constant)   1                          c = O(1)
7n! + 2^n + n^2 + 1   n!                         T(n) = O(n!)