DAA - UNIT-1 Modified
UNIT – I
ALGORITHM
Informal Definition:
An algorithm is any well-defined computational procedure that takes
some value, or set of values, as input and produces some value, or set of values, as
output. Thus an algorithm is a sequence of computational steps that transforms the input
into the output.
Formal Definition:
An Algorithm is a finite set of instructions that, if followed,
accomplishes a particular task.
All algorithms should satisfy the following criteria:
1. Input: zero or more quantities are externally supplied.
2. Output: at least one quantity is produced.
3. Definiteness: each instruction is clear and unambiguous.
4. Finiteness: the algorithm terminates after a finite number of steps for all cases.
5. Effectiveness: every instruction must be basic enough that it can, in principle, be carried out by a person using only pencil and paper.
The study of Algorithms includes many important and active areas of research.
There are four distinct areas of study one can identify
1. How to devise algorithms:
Creating an algorithm is an art which may never be fully automated. A major goal
is to study various design techniques that have proven to be useful. By mastering
these design strategies, it will become easier for you to devise new and useful
algorithms. Some of the techniques may already be familiar, and some have been found
to be particularly useful; dynamic programming is one such technique. Some of the
techniques are also useful in fields other than computer science, such as operations
research and electrical engineering.
2. How to validate algorithms:
Once an algorithm is devised, it is necessary to show that it computes the correct
answer for all possible legal inputs. We refer to this process as algorithm
validation. The algorithm need not as yet be expressed as a program. The purpose
of validation is to assure us that the algorithm will work correctly, independently of
the issues concerning the programming language in which it will later be written.
Once the validity of the method has been shown, a program can be written and a
second phase begins. This phase is referred to as program proving or sometimes as
program verification.
A proof of correctness requires that the solution be stated in two forms. One form
is usually as a program which is annotated by a set of assertions about the input
and output variables of the program. These assertions are often expressed in the
predicate calculus. The second form is called a specification, and this may also be
expressed in the predicate calculus. A complete proof of program correctness
requires that each statement of a programming language be precisely defined and
all basic operations be proved correct.
3. How to analyze algorithms:
As an algorithm is executed, it uses the computer's central processing unit
(CPU) to perform operations and its memory to hold the program and data.
Analysis of algorithms, or performance analysis, refers to the task of determining
how much computing time and storage an algorithm requires. We analyze an
algorithm based on its time complexity and space complexity: the amount of time
needed to run the algorithm and the amount of memory it needs, respectively.
4. How to test a program:
Testing a program consists of two phases: debugging and profiling (performance
measurement). Debugging is the process of executing a program on sample data sets
to determine whether faulty results occur and, if so, to correct them. Profiling is the
process of executing a correct program on data sets and measuring the time and
space it takes to compute the results.
Algorithm Specification:
An algorithm can be described in several ways:
1. Natural language, such as English: in this method we must make sure that the
resulting instructions are definite.
2. Graphic representation, called a flowchart: this method will work well when the
algorithm is small and simple.
3. Pseudo-code Method:
In this method we specify the algorithm using a pseudo-code that resembles a
programming language.
3. An identifier begins with a letter. The data types of variables are not
explicitly declared.
4. Compound data types can be formed with records. Here link is a pointer to the
record type node. Individual data items of a record can be accessed with '->' and
a period.
While Loop:
while <condition> do
{
<statement-1>
.
.
.
<statement-n>
}
For Loop:
for variable := value1 to value2 step step do
{
<statement-1>
.
.
.
<statement-n>
}
repeat-until:
repeat
<statement-1>
.
.
.
<statement-n>
until<condition>
Case statement:
Case
{
: <condition-1> : <statement-1>
.
.
.
: <condition-n> : <statement-n>
: else : <statement-n+1>
}
9. Input and output are done using the instructions read & write.
Examples:
Algorithm Max(A,n)
// A is an array of size n
{
Result := A[1];
for I:= 2 to n do
if A[I] > Result then
Result :=A[I];
return Result;
}
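A runnable Python equivalent of the pseudocode above; the function name find_max is illustrative and not part of the notes.

def find_max(a):
    """Return the largest element of a non-empty list, scanning left to right."""
    result = a[0]
    for x in a[1:]:
        if x > result:
            result = x
    return result

print(find_max([3, 7, 2, 9, 4]))   # prints 9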
Algorithm for Selection Sort:
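The selection-sort pseudocode itself is not present in this extract; a minimal Python sketch of the standard algorithm (repeatedly select the smallest remaining element and move it to the front) is given below. The names are illustrative.

def selection_sort(a):
    """Sort the list a in place in non-decreasing order."""
    n = len(a)
    for i in range(n - 1):
        # find the index of the smallest element in a[i..n-1]
        min_index = i
        for j in range(i + 1, n):
            if a[j] < a[min_index]:
                min_index = j
        a[i], a[min_index] = a[min_index], a[i]   # place it at position i
    return a

print(selection_sort([5, 2, 9, 1, 6]))   # [1, 2, 5, 6, 9]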
Recursive Algorithms:
1. Towers of Hanoi:
Consider three towers A, B and C, with three disks initially on tower A.
Consider C as the destination tower and B as the intermediate tower. The
steps involved in moving the disks from A to C are:
Step 1: Move the smallest disk, which is at the top of tower A, to C.
Step 2: Move the next smallest disk at the top of tower A to B.
Step 3: Move the smallest disk from tower C to tower B.
Step 4: Move the largest disk from tower A to tower C.
Step 5: Move the smallest disk at the top of tower B to tower A.
Step 6: Move the disk remaining on tower B to tower C.
Step 7: Move the smallest disk from tower A to tower C.
In this way disks are moved from source tower to destination tower.
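The recursive algorithm itself is not reproduced in this extract; a minimal Python sketch of the usual recursive solution (function and peg names are illustrative) is:

def towers_of_hanoi(n, source, destination, intermediate):
    """Move n disks from source to destination using intermediate as a spare peg."""
    if n == 0:
        return
    towers_of_hanoi(n - 1, source, intermediate, destination)   # move n-1 disks out of the way
    print("move disk", n, "from", source, "to", destination)    # move the largest disk
    towers_of_hanoi(n - 1, intermediate, destination, source)   # move the n-1 disks back on top

towers_of_hanoi(3, 'A', 'C', 'B')   # prints the seven moves listed above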
t(n) = 1                 if n = 0
t(n) = 2t(n-1) + 2       if n >= 1
Solving the above recurrence relation, the time complexity of Towers of Hanoi is
O(2^n).
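One way to solve this recurrence is by repeated substitution:
t(n) = 2t(n-1) + 2
     = 4t(n-2) + 6
     = 8t(n-3) + 14
     ...
     = 2^n t(0) + 2(2^n - 1)
     = 3(2^n) - 2
so t(n) = O(2^n).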
Performance Analysis:
1. Space Complexity:
The space complexity of an algorithm is the amount of memory it
needs to run to completion.
2. Time Complexity:
The time complexity of an algorithm is the amount of computer
time it needs to run to completion.
Space Complexity:
The space needed by an algorithm is the sum of the following components:
1. A fixed part that is independent of the instance characteristics, e.g., the
instruction space and the space for simple variables and constants.
2. A variable part that depends on the instance characteristics, e.g., the space
for referenced variables and the recursion stack.
The space requirement S(P) of an algorithm P can therefore be written as
S(P) = c + Sp(instance characteristics), where c is a constant.
Example 1:
Algorithm abc(a,b,c)
{
return a+b+b*c+(a+b-c)/(a+b)+4.0;
}
In this algorithm Sp = 0 (the space does not depend on any instance
characteristics). Assuming each variable occupies one word, the variables a, b and c
need one word each, so the space occupied by the above algorithm is
S(P) >= 3.
Example 2:
Algorithm sum(a,n)
{
s := 0.0;
for I := 1 to n do
s := s + a[I];
return s;
}
In the above algorithm n, s and I occupy one word each, and the array 'a'
occupies n words, so S(P) >= n + 3.
Example 3:
Consider the recursive sum algorithm RSum given under Time Complexity below.
Each recursive call needs space for the value of n, the return address and a pointer
to the array. The depth of recursion is n + 1, so the total space occupied by the
algorithm is S(P) >= 3(n + 1).
Time Complexity:
The time T(P) taken by a program P is the sum of the compile time
and the run time (execution time).
The compile time does not depend on the instance characteristics. Also,
we may assume that a compiled program will be run several times without
recompilation. This run time is denoted by tp(instance characteristics).
Example 1:
One way to determine the step count of an algorithm is to introduce a global
variable count, initially zero, and increment it each time a statement with a
non-zero step count is executed.
Algorithm:
Algorithm Sum(a,n)
{
s := 0.0;
count := count + 1;    // for the assignment to s
for I := 1 to n do
{
count := count + 1;    // for the for statement
s := s + a[I];
count := count + 1;    // for the assignment inside the loop
}
count := count + 1;    // for the last test of the for loop
count := count + 1;    // for the return
return s;
}
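A small runnable Python sketch of the same instrumentation (the function name and structure are illustrative, not from the notes):

def sum_with_count(a):
    """Return (s, count), where count is the step count of the iterative sum."""
    n = len(a)
    count = 0
    s = 0.0
    count += 1                  # for the assignment s := 0.0
    for i in range(n):
        count += 1              # for each test of the for loop
        s += a[i]
        count += 1              # for the assignment inside the loop
    count += 1                  # for the final (failing) test of the for loop
    count += 1                  # for the return statement
    return s, count

# For an array of size n this yields count = 2n + 3.
print(sum_with_count([1.0, 2.0, 3.0]))   # (6.0, 9), and 2*3 + 3 = 9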
Example 2:
Algorithm RSum(a,n)
{
if (n <= 0) then
return 0.0;
else
return RSum(a, n-1) + a[n];
}
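A minimal Python sketch of the same recursive sum (names are illustrative); the recursion depth of n + 1 calls is what gives the S(P) >= 3(n + 1) space bound discussed earlier:

def rsum(a, n):
    """Recursively sum the first n elements of a (n counted as in the pseudocode)."""
    if n <= 0:
        return 0.0
    return rsum(a, n - 1) + a[n - 1]   # a[n-1] because Python arrays are 0-based

print(rsum([1.0, 2.0, 3.0], 3))        # 6.0, using a recursion depth of n + 1 = 4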
Example 3:
Algorithm Add(a, b, c, m, n)
{
for i := 1 to m do
for j := 1 to n do
c[i,j] := a[i,j] + b[i,j];
}
The step count can also be obtained by building a table. First determine the number
of steps per execution (s/e) of each statement and the total number of times
(i.e., frequency) each statement is executed. By combining these two quantities, the
total contribution of each statement is obtained, and adding the contributions of all
statements gives the step count for the entire algorithm.
Example 1:
Statement                     s/e   frequency   total steps
1. Algorithm Sum(a,n)          0        -            0
2. {                           0        -            0
3.   s := 0.0;                 1        1            1
4.   for I := 1 to n do        1       n+1          n+1
5.     s := s + a[I];          1        n            n
6.   return s;                 1        1            1
7. }                           0        -            0
     Total                                          2n+3
Example 2:
Statement                      s/e   frequency      total steps
                                     n=0   n>0      n=0   n>0
1. Algorithm RSum(a,n)          0     -     -        0     0
2. {                            0     -     -        0     0
3.   if (n <= 0) then           1     1     1        1     1
4.     return 0.0;              1     1     0        1     0
5.   else return
6.     RSum(a,n-1) + a[n];     1+x    0     1        0    1+x
7. }                            0     -     -        0     0
     Total                                           2    2+x
(where x = tRSum(n-1))
Example 3:
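The step table for this example is not present in the extract; a worked step count for the Add algorithm given above, following the same s/e and frequency convention, would be:

Statement                          s/e   frequency   total steps
1. Algorithm Add(a,b,c,m,n)         0       -            0
2. {                                0       -            0
3.   for i := 1 to m do             1      m+1          m+1
4.     for j := 1 to n do           1     m(n+1)       mn+m
5.       c[i,j] := a[i,j]+b[i,j];   1      mn           mn
6. }                                0       -            0
     Total                                            2mn+2m+1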
Example 4:
Algorithm Fibonacci(n)
// Compute and print the n-th Fibonacci number.
{
if (n <= 1) then
write (n);
else
{
fnm2 := 0; fnm1 := 1;
for i := 2 to n do
{
fn := fnm1 + fnm2;
fnm2 := fnm1;
fnm1 := fn;
}
write (fn);
}
}
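A runnable Python sketch of the same iterative computation (the function name is illustrative):

def fibonacci(n):
    """Print the n-th Fibonacci number, iteratively as in the pseudocode above."""
    if n <= 1:
        print(n)
        return
    fnm2, fnm1 = 0, 1
    for _ in range(2, n + 1):
        fn = fnm1 + fnm2
        fnm2, fnm1 = fnm1, fn
    print(fn)

fibonacci(7)   # prints 13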
Asymptotic Notations:
1. BIG-OH (O) NOTATION:
Definition: Let f(n) and g(n) be two non-negative functions. If there exist two
positive constants c and n0 such that f(n) <= c*g(n) for all n >= n0, then we say
that f(n) = O(g(n)).
(Figure 1: graph for Big-O notation)
Example: consider f(n) = 2n+3 and g(n) = n^2.
We need f(n) <= c*g(n). Let us assume c = 1; then we need f(n) <= g(n), i.e., 2n+3 <= n^2.
If n = 1: 2(1)+3 <= 1^2 => 5 <= 1 (false)
If n = 2: 2(2)+3 <= 2^2 => 7 <= 4 (false)
If n = 3: 2(3)+3 <= 3^2 => 9 <= 9 (true)
If n = 4: 2(4)+3 <= 4^2 => 11 <= 16 (true)
If n = 5: 2(5)+3 <= 5^2 => 13 <= 25 (true)
If n = 6: 2(6)+3 <= 6^2 => 15 <= 36 (true)
Thus for all n >= 3, f(n) <= 1*g(n), so f(n) = O(n^2), i.e., f(n) = O(g(n)) with c = 1 and n0 = 3.
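A quick Python check of the same inequality (purely illustrative):

# Verify 2n + 3 <= 1 * n^2 for a range of n; it first holds at n = 3 and stays true.
f = lambda n: 2 * n + 3
g = lambda n: n * n
for n in range(1, 11):
    print(n, f(n) <= g(n))   # False for n = 1, 2 and True for n >= 3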
2. OMEGA (Ω) NOTATION:
Definition: Let f(n) and g(n) be two non-negative functions. If there exist two
positive constants c and n0 such that f(n) >= c*g(n) for all n >= n0, then we say
that f(n) = Ω(g(n)).
THE GRAPH FOR OMEGA NOTATION:
3. THETA (Θ) NOTATION:
Definition: Let f(n) and g(n) be two non-negative functions. If there exist three
positive constants c1, c2 and n0 such that c1*g(n) <= f(n) <= c2*g(n) for all n >= n0,
then we say that f(n) = Θ(g(n)).
Example: consider f(n) = 2n+5 and g(n) = n.
Lower bound: take c1 = 2, so c1*g(n) = 2n. Then 2n <= 2n+5 holds for every n >= 1
(for example, n = 3: 6 <= 11, true).
Upper bound: take c2 = 4, so c2*g(n) = 4n. Then we need 2n+5 <= 4n:
If n = 1: 2(1)+5 <= 4(1) => 7 <= 4 (false)
If n = 2: 2(2)+5 <= 4(2) => 9 <= 8 (false)
If n = 3: 2(3)+5 <= 4(3) => 11 <= 12 (true)
If n = 4: 2(4)+5 <= 4(4) => 13 <= 16 (true)
Thus for all n >= 3, 2n <= f(n) <= 4n, so f(n) = Θ(n), i.e., f(n) = Θ(g(n)).
4. LITTLE-OH (o) NOTATION:
Definition: Let f(n) and g(n) be two non-negative functions.
If lim (n -> ∞) f(n)/g(n) = 0 then we say that f(n) = o(g(n)).
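For example, taking f(n) = 2n+5 and g(n) = n^2:
lim (n -> ∞) f(n)/g(n) = lim (n -> ∞) (2n+5)/(n^2) = lim (n -> ∞) (2 + 5/n)/n = 0,
so 2n+5 = o(n^2).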
5. LITTLE-OMEGA (ω) NOTATION:
Definition: Let f(n) and g(n) be two non-negative functions.
If lim (n -> ∞) g(n)/f(n) = 0 then we say that f(n) = ω(g(n)).
Example: consider f(n) = n^2 and g(n) = 2n+5.
lim (n -> ∞) g(n)/f(n) = lim (n -> ∞) (2n+5)/(n^2)
= lim (n -> ∞) n(2 + 5/n)/(n^2)
= lim (n -> ∞) (2 + 5/n)/n = 2/∞ = 0
∴ f(n) = ω(g(n)).
Amortized analysis:
Amortized analysis means finding the average running time per operation over a
worst-case sequence of operations.
Suppose a sequence I1,I2,D1,I3,I4,I5,I6,D2,I7 of insert and delete operations
is performed on a set.
Assume that the actual cost of each of the seven inserts is 1, and that the
delete operations D1 and D2 have actual costs of 8 and 10 respectively, so the
total cost of the sequence of operations is 25.
In an amortized scheme we charge some of the actual cost of an operation to
other operations. This reduces the charged cost of some operations and
increases the cost of others. The amortized cost of an operation is the total
cost charged to it.
The only requirement is that the sum of the amortized complexities of all
operations in any sequence of operations be greater than or equal to the sum of
their actual complexities, i.e.,
Σ (1 <= i <= n) amortized(i) >= Σ (1 <= i <= n) actual(i)        ...(1)
where amortized(i) and actual(i) denote the amortized and actual
complexities of the i-th operation in a sequence of n operations.
We define the potential function P(i) as
P(i) = P(i-1) + amortized(i) - actual(i), with P(0) = 0        ...(2)
Equivalently, amortized(i) = actual(i) + P(i) - P(i-1). Summing equation (2) over
all n operations gives
P(n) - P(0) = Σ (1 <= i <= n) (amortized(i) - actual(i))
From equation (1) we see that
P(n) - P(0) >= 0        ...(3)
Under the assumption P(0) = 0, P(i) is the amount by which the first i operations
have been overcharged (i.e., they have been charged more than their actual
cost).
The methods to find amortized cost for operations are:
1. Aggregate method.
2. Accounting method.
3. Potential method.
1. Aggregate method:
The amortized cost of each operation is set equal to (an upper bound on the
sum of the actual costs of the n operations) / n.
2. Accounting method:
In this method we assign amortized costs to the operations (possibly by
guessing what assignment will work), compute the P(i) using equation (2), and
show that P(n) - P(0) >= 0.
3. Potential method:
Here we start with a potential function that satisfies equation (3) and
compute the amortized complexities using equation (2).
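As a small illustration (not part of the original notes), the following Python sketch applies these quantities to the insert/delete sequence described earlier; the per-operation amortized charge of 3 is an assumed value chosen only so that P(n) - P(0) >= 0 holds:

# Sequence I1,I2,D1,I3,I4,I5,I6,D2,I7: inserts cost 1, D1 costs 8, D2 costs 10.
actual = [1, 1, 8, 1, 1, 1, 1, 10, 1]
amortized = [3] * len(actual)               # assumed amortized cost charged to every operation

# Potential after the i-th operation: P(i) = P(i-1) + amortized(i) - actual(i), P(0) = 0.
P = [0]
for a_cost, am_cost in zip(actual, amortized):
    P.append(P[-1] + am_cost - a_cost)

print("total actual    =", sum(actual))     # 25
print("total amortized =", sum(amortized))  # 27
print("P(n) - P(0)     =", P[-1] - P[0])    # 2 >= 0, so the charging scheme is valid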
Example:
Assume we pay $50 for each month other than March, June, September and
December; $100 for every March, June and September; and $200 for every December.
Calculate the amortized cost per month using the aggregate, accounting and
potential methods.
(Table for months 1 to 16: actual cost, amortized cost and potential P(n) per month.)
Accounting method:
From the above table we see that using any amortized cost less than $75 results
in P(n) - P(0) < 0 for some n (in particular, whenever n is a multiple of 12).
So the amortized cost must be >= $75; with an amortized cost of $75 per month the
condition P(n) - P(0) >= 0 holds for every n.
Potential method:
For the given problem we start with a potential function such as:
P(n) = 0    if n mod 12 = 0
P(n) = 25   if n mod 12 = 1 or 3
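A small Python sketch (not from the notes) that tabulates the quantities behind this example, assuming the payment schedule stated above and an amortized cost of $75 per month:

def actual_cost(month):
    """Actual payment in a given month: $200 in December, $100 in March/June/September, $50 otherwise."""
    m = month % 12
    if m == 0:
        return 200      # December
    if m in (3, 6, 9):
        return 100
    return 50

amortized = 75          # aggregate method: one year costs 8*50 + 3*100 + 200 = 900, and 900 / 12 = 75
P = 0                   # potential, P(0) = 0
for n in range(1, 17):  # months 1..16, as in the table above
    P += amortized - actual_cost(n)
    print(n, actual_cost(n), P)
# P(12) = 0 and P(n) >= 0 for every month, so $75 per month is a valid amortized cost.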
Probabilistic analysis: