ANALYSIS OF ALGORITHMS
What is an algorithm?
• An algorithm is a finite set of instructions that, if followed, accomplishes a
specific task.
• An algorithm must satisfy the following criteria:
1. Input - takes zero or more inputs.
2. Output - produces one or more outputs.
3. Definiteness - each instruction is clear and unambiguous.
4. Finiteness - the algorithm terminates after a finite number of steps.
5. Effectiveness - every instruction must be basic, i.e., simple enough to be carried out exactly.
What is an algorithm?
• An algorithm is a sequence of unambiguous instructions for solving a problem, i.e., for obtaining a required output for any legitimate input in a finite amount of time.
Problem, Algorithm, Program
• For each problem or class of problems, there may be many different
algorithms.
• For each algorithm, there may be many different implementations
(programs).
Efficiency of Algorithms
“Analysis of algorithms” means investigating an algorithm’s efficiency with respect to two resources: running time and memory space.
Analysis can be done at two different stages:
A priori analysis (theoretical, before implementation)
A posteriori analysis (empirical, after implementation)
• Techniques are:
1) Aggregate method
2) Accounting method
3) Potential method
TIME COMPLEXITY OF SIMPLE ALGORITHMS
Algorithm Sum(a, n):
Statement            s/e   Frequency   Total steps
{                    0     -           0
  s := 0             1     1           1
  for i := 1 to n    1     n + 1       n + 1
    s := s + a[i]    1     n           n
  return s           1     1           1
}                    0     -           0
Total                                  2n + 3
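The 2n + 3 count can be reproduced by instrumenting the algorithm with an explicit step counter; a minimal Python sketch (the function name and counter placement are illustrative, not from the slides):

```python
def sum_array(a):
    """Iterative sum with a step counter matching the tabular method:
    1 (s := 0) + (n + 1) loop tests + n body steps + 1 return = 2n + 3."""
    n = len(a)
    steps = 0
    s = 0
    steps += 1              # s := 0 executes once
    for i in range(n):
        s += a[i]
        steps += 2          # one successful loop test + one body execution
    steps += 1              # final, failing loop test
    steps += 1              # return s
    return s, steps
```

For example, sum_array([5, 2, 7]) returns (14, 9), and 2n + 3 = 9 for n = 3.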
Eg: Tabular method for the recursive sum Rsum(a, n):
Statement                        s/e    Freq (n=0)  Freq (n>0)  Total (n=0)  Total (n>0)
{                                0      0           0           0            0
  if (n <= 0)                    1      1           1           1            1
    return 0                     1      1           0           1            0
  else
    return Rsum(a, n-1) + a[n]   1+x    0           1           0            1+x
}                                0      0           0           0            0
Total                                                           2            2 + x
where x = T_Rsum(n - 1); hence T(0) = 2 and T(n) = 2 + T(n-1) for n > 0.
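The recurrence T(0) = 2, T(n) = 2 + T(n-1) can likewise be checked by instrumenting the recursive sum; a Python sketch (names and counter placement are illustrative; the slide's 1-indexed a[n] becomes a[n-1] in Python):

```python
def rsum(a, n):
    """Recursive sum of a[0..n-1], returning (sum, step count).
    Steps follow the table: T(0) = 2 and T(n) = 2 + T(n-1) = 2n + 2."""
    if n <= 0:
        return 0, 2                 # one test + one return
    s, t = rsum(a, n - 1)
    return s + a[n - 1], 2 + t      # one test + one return, plus T(n-1)
```

For example, rsum([5, 2, 7], 3) returns (14, 8), matching T(3) = 2*3 + 2 = 8.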
TIME COMPLEXITY OF ITERATIVE ALGORITHMS
Matrix addition of two m x n matrices:
Statement                      s/e   Frequency   Total
for i := 1 to m                1     m + 1       m + 1
  for j := 1 to n              1     m(n + 1)    m(n + 1)
    c[i,j] := a[i,j] + b[i,j]  1     mn          mn
Total                                            2mn + 2m + 1
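The total 2mn + 2m + 1 for adding two m x n matrices can be reproduced with a step counter; a minimal Python sketch (the counter placement mirrors the usual loop-test accounting and is an assumption, not from the slides):

```python
def mat_add(A, B):
    """Element-wise sum of two m x n matrices with a step counter:
    (m + 1) outer tests + m(n + 1) inner tests + mn assignments = 2mn + 2m + 1."""
    m, n = len(A), len(A[0])
    C = [[0] * n for _ in range(m)]
    steps = 0
    for i in range(m):
        steps += 1          # successful outer-loop test
        for j in range(n):
            steps += 1      # successful inner-loop test
            C[i][j] = A[i][j] + B[i][j]
            steps += 1      # one assignment
        steps += 1          # failing inner-loop test
    steps += 1              # failing outer-loop test
    return C, steps
```

For m = 2 and n = 3 this yields 17 steps, equal to 2mn + 2m + 1.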
TIME COMPLEXITY OF ITERATIVE ALGORITHMS
Matrix multiplication of two n x n matrices:
Statement                                 s/e   Frequency    Total
for i := 1 to n                           1     n + 1        n + 1
  for j := 1 to n                         1     n(n + 1)     n(n + 1)
    c[i,j] := 0                           1     n*n          n*n
    for k := 1 to n                       1     n*n*(n + 1)  n*n*(n + 1)
      c[i,j] := c[i,j] + a[i,k] * b[k,j]  1     n*n*n        n*n*n
Total                                                        2n³ + 3n² + 2n + 1
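The frequencies n(n+1) and n*n*n in the table above point to the standard triple-loop multiplication of two n x n matrices; assuming that reading, a Python sketch:

```python
def mat_mul(A, B):
    """Classic triple-loop product of two n x n matrices.
    Tabular step counts: (n+1) outer tests, n(n+1) middle tests,
    n*n zero-assignments, n*n*(n+1) inner tests, and n*n*n
    multiply-adds, for a total of 2n^3 + 3n^2 + 2n + 1."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            C[i][j] = 0                       # executed n*n times
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]  # executed n*n*n times
    return C
```

The dominant n*n*n term is what makes this algorithm cubic in n.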
Analysis of Algorithms - Asymptotic Analysis
• In asymptotic analysis, we evaluate the performance of an algorithm in terms of input size.
Eg: f(n) = 3n + 2
g(n) = n²
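To relate the two example functions, one can exhibit witness constants showing f(n) = O(g(n)): with c = 1 and n0 = 4 (constants chosen here for illustration, not stated on the slide), 3n + 2 <= n² for all n >= 4. A quick Python check:

```python
def f(n):
    return 3 * n + 2   # f(n) = 3n + 2

def g(n):
    return n * n       # g(n) = n^2

# Illustrative witness constants: c = 1, n0 = 4.
# f(n) <= c * g(n) for every n >= n0, so f(n) = O(g(n)).
assert all(f(n) <= 1 * g(n) for n in range(4, 10_000))
assert f(3) > g(3)     # the bound fails at n = 3, hence n0 = 4
```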
Little omega (ω) notation:
f(n) = ω(g(n)) if for every constant c > 0 there exists n₀ > 0 such that f(n) > c·g(n) for all n ≥ n₀, i.e., f grows strictly faster than g.
PROPERTIES OF ASYMPTOTIC NOTATIONS
2. Reflexivity
f(n) = O(f(n))
f(n) = Ω(f(n))
f(n) = θ(f(n))
Eg: if f(n) = 3n² + 2, then f(n) = O(3n² + 2), f(n) = Ω(3n² + 2), and f(n) = θ(3n² + 2).
3. Transitivity
f(n)= O(g(n)) and g(n)= O(h(n)) imply f(n)=O(h(n))
f(n)= Ω (g(n)) and g(n)= Ω (h(n)) imply f(n)= Ω (h(n))
f(n)=θ(g(n)) and g(n)=θ(h(n)) imply f(n)=θ(h(n))
f(n)= o(g(n)) and g(n)= o(h(n)) imply f(n)=o(h(n))
f(n)= ω(g(n)) and g(n)= ω(h(n)) imply f(n)= ω(h(n))
Transpose symmetry
f(n) = O(g(n)) if and only if g(n) = Ω(f(n))
f(n) = o(g(n)) if and only if g(n) = ω(f(n))
log log n - double logarithmic growth; increases very slowly (e.g., interpolation search)
Between two algorithms, the one with the smaller order of growth is considered more efficient (this holds only for sufficiently large input sizes).
1 < log n < √n < n < n log n < n² < n³ < 2^n
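Evaluating each function at a single moderately large input makes the ordering concrete; a small Python check (n = 1024 chosen here for illustration):

```python
import math

# Growth functions from the ordering, evaluated at n = 1024:
# 1, log n, sqrt(n), n, n log n, n^2, n^3, 2^n
n = 1024
values = [1, math.log2(n), math.sqrt(n), n,
          n * math.log2(n), n**2, n**3, 2**n]
assert values == sorted(values)   # already in strictly increasing order
```

At n = 1024 the values run from 1 and log₂ n = 10 up to the astronomically large 2^1024, which is why exponential-time algorithms are impractical even for modest inputs.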
TIME EFFICIENCY OF RECURSIVE ALGORITHMS.
1. Iteration Method
Example: T(n) = 3T(n/4) + cn².
Assume n = 4^k, so that k = log₄ n. Expanding the recurrence level by level:
Total cost = cn² + 3c(n/4)² + 9c(n/16)² + 27c(n/64)² + ... + 3^k c(n/4^k)²
= cn² ((3/16)^0 + (3/16)^1 + (3/16)^2 + (3/16)^3 + ... + (3/16)^k)
= cn² { (1 - (3/16)^(k+1)) / (1 - 3/16) }
= cn² { (1 - 0.1875^(k+1)) / 0.8125 }
0.1875^(k+1) tends to zero as n → ∞, so the factor in braces approaches the constant 16/13.
T(n) = O(n²)
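The expansion above solves the recurrence T(n) = 3T(n/4) + cn²; it can also be checked numerically (assuming a base case T(1) = c, which the slides do not state):

```python
def T(n, c=1.0):
    """Exact value of T(n) = 3*T(n/4) + c*n^2 with assumed base case
    T(1) = c, for n a power of 4."""
    if n == 1:
        return c
    return 3 * T(n // 4, c) + c * n * n

# T(n)/n^2 stays bounded and approaches 16/13 (about 1.2308) as n grows,
# consistent with T(n) = O(n^2).
ratios = [T(4**k) / (4**k) ** 2 for k in range(1, 11)]
```

For k = 1 the ratio is 19/16 = 1.1875, and it climbs toward the geometric-series limit 16/13 derived above.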