
Unit III: Greedy Method


3.1 Introduction
The greedy method is the simplest design technique. A problem has n inputs and requires
us to obtain a subset that satisfies some constraints. Any subset that satisfies these
constraints is called a feasible solution. We need to find a feasible solution that either
maximizes or minimizes a given objective function; such a solution is called an optimal solution.

The greedy method is a simple strategy of progressively building up a solution, one
element at a time, by choosing the best possible element at each stage. At each stage, a
decision is made regarding whether or not a particular input belongs to an optimal solution.
This is done by considering the inputs in an order determined by some selection
procedure. If including the next input in the partially constructed solution would
result in an infeasible solution, that input is not added to the partial
solution. The selection procedure is based on some optimization measure. Most such
measures, however, result in algorithms that generate sub-optimal solutions. This
version of the greedy technique is called the subset paradigm.

Some problems, like knapsack, job sequencing with deadlines and minimum cost
spanning trees, are based on the subset paradigm. For problems that make decisions by
considering the inputs in some order, each decision is made using an optimization
criterion that can be computed from the decisions already made. This version of the greedy
method is the ordering paradigm. Some problems, like optimal storage on tapes, optimal
merge patterns and single source shortest paths, are based on the ordering paradigm.

The greedy method is used for solving optimization problems. Every greedy-method
based problem is given a set of inputs and constraints. The objective is to find a
solution vector that satisfies the constraints. Some of the important concepts of
the greedy method are:
Feasible solution: Any subset of the original input that satisfies a given set of
constraints is called a feasible solution.

Objective function: This is the function of the inputs that a feasible solution is
required to maximize or minimize.

Optimal solution: An optimal solution is a feasible solution that maximizes or
minimizes the given objective function.

3.2 Control Abstraction of Greedy method


Algorithm Greedy(a, n)
// a[1 : n] is the array of n inputs
{
    solution := 0            // initialize the solution
    for i := 1 to n do
    {
        x := select(a)
        if Feasible(solution, x) then
            solution := union(solution, x);
    }
    return solution;
}
The function select selects an input from a[ ] and removes it. The selected input’s
value is assigned to x.
Feasible is a Boolean-valued function that determines whether x can be included into
the solution vector.
The function union combines x with the solution and updates the objective function.

Note: The greedy technique works by making the decision that seems most promising at
any moment. This technique will never reconsider that decision, irrespective of the
situation later.
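The control abstraction above can also be expressed in a real language. The following Python sketch is illustrative only; the callables select, feasible and combine stand in for the problem-specific routines described above and are assumed to be supplied by the caller.

def greedy(inputs, select, feasible, combine):
    # A minimal sketch of the greedy control abstraction; not tied to any
    # particular problem. select, feasible and combine are supplied by the caller.
    solution = []                     # the partially constructed solution
    candidates = list(inputs)         # work on a copy of the inputs
    while candidates:
        x = select(candidates)        # pick the most promising remaining input
        candidates.remove(x)          # and remove it from further consideration
        if feasible(solution, x):     # keep x only if the solution stays feasible
            solution = combine(solution, x)
    return solution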

3.3 Huffman Codes
File compression is another application of the greedy method. Suppose we have a file containing
only the characters a, e, i, s, t, spaces and newlines, with frequencies A = 10,
E = 15, I = 12, S = 3, T = 4, Space = 13 and Newline = 1. Using a standard coding scheme
of 3 bits per character, the 58 characters of the file require 174 bits to represent,
as shown below.

Using a binary tree representation, the binary codes for the characters are as follows:

The representation of each character can be found by starting at the root and recording
the path, using a 0 to indicate the left branch and a 1 to indicate the right branch. If the
character ci is at depth di and occurs fi times, the cost of the code is equal to ∑ fi di.
With this representation the total number of bits is 3x10 + 3x15 + 3x12 + 3x3 +
3x4 + 3x13 + 3x1 = 174. A better code can be obtained with the following
representation.

The basic problem is to find the full binary tree of minimal total cost. This can be
done by using Huffman coding (1952).
Huffman's Algorithm:
In this algorithm a forest of trees is maintained. The weight of a tree is equal to the
sum of the frequencies of its leaves. If the number of characters is c, then c - 1 times we
select the two trees T1 and T2 of smallest weight and form a new tree with sub-trees
T1 and T2. Repeating the process, we obtain an optimal Huffman coding tree.
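As an illustration of these c - 1 merges, the forest can be kept in a min-heap keyed by tree weight. The Python sketch below is one possible implementation (the names huffman_codes, freq and walk are chosen for illustration, not taken from the text); it returns a code table with the same total cost as the trees built by hand in the examples that follow.

import heapq
from itertools import count

def huffman_codes(freq):
    # freq: dict mapping symbol -> frequency. Returns dict symbol -> bit string.
    tick = count()                                 # tie-breaker for equal weights
    heap = [(f, next(tick), sym) for sym, f in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:                           # c - 1 merges of the two lightest trees
        w1, _, left = heapq.heappop(heap)
        w2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, next(tick), (left, right)))
    codes = {}
    def walk(node, prefix):                        # 0 = left branch, 1 = right branch
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix or "0"
    walk(heap[0][2], "")
    return codes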

Example:
The initial forest with the weight of each tree is as follows:

Step 1: The two trees with the lowest weight are merged together. The forest after the
first merge, with new root T1, is as follows:

The total weight of the new tree is the sum of the weights of the old trees: S = 3 and
NL = 1, so the total for T1 = 4.
Step 2: Next select the two trees of smallest weight, T1 and t, which are merged into a
new tree with root T2 and weight 8.

Step 3: In the next step we merge T2 and a, creating T3 with weight 10 + 8 = 18.


Step 4: Now the two trees of lowest weight are the single-node trees representing i and
the blank space. These two trees are merged into a new tree with root T4 and weight 25.

Step 5: Again merge the two trees of lowest weight, those with roots e and T3, resulting in a
tree T5 with total weight 33.

Step 6: Lastly the optimal tree is obtained by merging the two remaining trees, T4 and
T5. The optimal tree, with root T6, is:


Now traverse the tree from the root and find the code representation of each
character. For example, the code for the character 'a' is 001, for space it is 11, for
newline it is 00001, and so on. The full binary tree of minimal total cost, in which all
characters appear as leaves, uses only 146 bits, as shown below.

Example 2:
Construct a Huffman tree by using the below frequency table.

Value A B C D E F
Frequency 5 25 7 15 4 12

Step 1: According to Huffman coding, we arrange the values in ascending order of their
frequencies.

Value E A C F D B
Frequency 4 5 7 12 15 25

Step 2: Merge the first two elements, which have the smallest frequencies: character E with
frequency 4 and character A with frequency 5.


Value C EA F D B
Frequency 7 9 12 15 25

Step 3: Merge the next two smallest trees, C (7) and EA (9), and insert the result at the correct place.

Value F D CEA B
Frequency 12 15 16 25

Step 4: Next elements are F and D so we construct another subtree for F and D.


Value CEA B FD
Frequency 16 25 27

Step 5: Merge the next two smallest trees, CEA (16) and B (25), into CEAB (41) and insert it at
the correct place.

Value FD CEAB
Frequency 27 41

Step 6: Only two trees remain, so they are combined to form the final tree of weight 68.


Now traverse the above Huffman tree from the root and find the code representation of each
character and the total number of bits required

Value A B C D E F
Frequency 5 25 7 15 4 12
Code 1011 11 100 01 1010 00
Total bits 20 50 21 30 16 24
Total bits required = 161
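Assuming the huffman_codes sketch given earlier in this section, the total can be checked programmatically. Individual codes may differ from the table above when ties are broken differently, but any Huffman tree for these frequencies gives the same minimal total of 161 bits.

freq = {'A': 5, 'B': 25, 'C': 7, 'D': 15, 'E': 4, 'F': 12}
codes = huffman_codes(freq)
total_bits = sum(freq[s] * len(codes[s]) for s in freq)
print(total_bits)    # 161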

3.4 Knapsack Problem


This problem is commonly known as the knapsack or rucksack problem. There are
different kinds of items, and each item has a weight and a profit value associated with
it. The problem is to pick items, subject to the limitation of the maximum weight the
knapsack can hold, so as to maximize the profit.
There are different kind of knapsack problems:

1. 0-1 Knapsack Problem → In this type of knapsack problem, there is only one
item of each kind (or we can pick only one). So we have only two
options for each item: either pick it (1) or leave it (0), i.e., xi ∈ {0, 1}.
2. Bounded Knapsack Problem (BKP) → In this case, the quantity of each item
can exceed 1 but can't be infinite, i.e., there is an upper bound on it.
So 0 ≤ xi ≤ c, where c is the maximum quantity of item i we can take.
3. Unbounded Knapsack Problem (UKP) → Here, there is no limitation on the
quantity of a specific item we can take, i.e., xi ≥ 0.
4. Integer Knapsack Problem → When we can't pick just a part
of an item, i.e., we either take the entire item or not and can't break the item
and take some fraction of it, then it is called the integer knapsack problem.


5. Fractional Knapsack Problem → Here, we can take even a fraction of any item.

In this problem we are given a knapsack (a bag or container) of capacity m and n
objects of weights w1, w2, ..., wn with profits p1, p2, ..., pn.
Objective: Fill the knapsack so as to maximize the profit earned; the output is a
vector xi where 1 <= i <= n. It is a maximization problem.
Constraint: The total weight placed in the knapsack cannot exceed m at any point of time.

The problem can be stated as:

Maximize    ∑ pi xi,   1 <= i <= n

subject to  ∑ wi xi <= m,   1 <= i <= n

The main objective is to place objects into the knapsack so that maximum profit is
obtained while the total weight of the objects does not exceed the capacity of the knapsack.
The knapsack problem is explained in two ways here:
1. Continuous or fractional
2. Discrete or 0/1

3.4.1 Fractional Knapsack Problem


This version allows an object to be placed into the knapsack in fraction if there is no space
for the entire object.
Algorithm: GreedyKnapsackFraction(M, n, w, P, x)
Assume the objects are arranged in decreasing order of pi / wi. Let M be the
knapsack capacity, n the number of objects, w[1 : n] the array of weights, P[1 : n]
the array of profits, and x a solution vector.

Step 1: Remaining capacity RC = M

Step 2: Repeat for i = 1 to n do
            x[i] = 0
        end for
Step 3: Repeat for i = 1 to n do
            if (w[i] <= RC) then
                x[i] = 1
                RC = RC - w[i]
Step 4:     else
                x[i] = RC / w[i]
                break
            end if
        End for
Step 5: For i = 1 to n do
            sum = sum + P[i] * x[i]
        End for
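The steps above translate directly into code. The following Python sketch is illustrative (the names are not from the text); unlike the pseudocode, it performs the sorting by pi/wi itself instead of assuming the objects are already ordered.

def greedy_knapsack_fraction(M, weights, profits):
    # Greedy fractional knapsack: returns (solution vector x, total profit).
    n = len(weights)
    # consider objects in decreasing order of profit/weight ratio
    order = sorted(range(n), key=lambda i: profits[i] / weights[i], reverse=True)
    x = [0.0] * n
    rc = M                              # remaining capacity
    for i in order:
        if weights[i] <= rc:            # the whole object fits
            x[i] = 1.0
            rc -= weights[i]
        else:                           # take only the fraction that still fits
            x[i] = rc / weights[i]
            break
    profit = sum(p * xi for p, xi in zip(profits, x))
    return x, profit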


Example1:
For the knapsack problem n=3,m=20 (p1,p2,p3) = (25,24,15) and (w1,w2,w3) =
(18,15,10).Find the feasible solutions and optimal solution using greedy method.

Solution:
Strategy 1: Arrange the objects in the increasing order of weights.

Weights 10 15 18
Profits 15 24 25
Objects 3 2 1

RC Objects selected weight Fraction of x added to knapsack

20 3 10 1
20 -10 =10 2 15 10/15=2/3
0 1 18 Cannot add as the knapsack is
full

The solution vector = (x1,x2, x3) = (0,2/3,1)

Profit earned = x1p1 + x2p2 + x3p3


= 0*25 + 2/3*24 + 1*15
= 31
Strategy 2: Arrange the objects in decreasing order of profits

Profits 25 24 15
Weights 18 15 10
Objects 1 2 3

RC Objects selected weight Fraction of x added to knapsack

20 1 18 1
20 -18 =2 2 15 2/15
0 3 10 Cannot add as the knapsack is
full

The solution vector = (x1, x2, x3) = (1,2/15,0)

Profit earned = x1p1 + x2p2 + x3p3


= 1*25 + 2/15*24 + 0*15
= 28.2


Strategy 3: Arrange the objects in the decreasing order of pi / wi


Object 2: 24/15=1.6
Object 3: 15/10=1.5
Object 1: 25/18=1.38

RC Objects selected weight Fraction of x added to knapsack

20 2 15 1
20 -15 =5 3 10 5/10=1/2
0 1 18 Cannot add as the knapsack is
full

The solution vector = (x1, x2, x3) = (0, 1, 1/2)


Profit earned = x1p1 + x2p2 + x3p3
= 0*25+1*24+1/2*15
= 31.5

Strategy 4: Arrange the objects in the increasing order of pi / wi.


Object 1: 25/18=1.38
Object 3: 15/10=1.5
Object 2: 24/15=1.6

RC Objects selected weight Fraction of x added to knapsack

20 1 18 1
20 -18 =2 3 10 2/10=1/5
0 2 15 Cannot add as the knapsack is
full

The solution vector = (x1, x2, x3) = (1, 0, 1/5)

Profit earned = x1p1 + x2p2 + x3p3

= 1*25 + 0*24 + 1/5*15
= 28

The Feasible solutions are 31, 28.2, 31.5, 28

The Optimal solution is 31.5
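As a check, running the greedy_knapsack_fraction sketch given earlier in this section on the data of Example 1 reproduces strategy 3, which is the ordering that is optimal for the fractional problem.

x, profit = greedy_knapsack_fraction(20, [18, 15, 10], [25, 24, 15])
print(x, profit)    # [0.0, 1.0, 0.5] 31.5, matching strategy 3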

Example 2:
Solve the greedy knapsack problem using the fractional approach to earn the maximum
profit.
N=4, Knapsack capacity M=40
Object A B C D
Weight 20 10 15 12
Profit 2 8 4 6

Strategy 1: Arrange the objects in the increasing order of weights.

Weights 10 12 15 20
Profits 8 6 4 2
Objects B D C A

RC Objects selected weight Fraction of x added to knapsack

40 B 10 1
40 -10 =30 D 12 1
30-12=18 C 15 1
18-15=3 A 20 3/20 (the full object cannot be added)

The solution vector = (x1,x2, x3,x4) = (3/20,1,1,1)

Profit earned = x1p1 + x2p2 + x3p3+x4p4


= 3/20*2+1*8+1*4+1*6
=0.3+8+4+6=18.3

Strategy 2: Arrange the objects in decreasing order of profits


Weights 10 12 15 20
Profits 8 6 4 2
Objects B D C A

RC Objects selected weight Fraction of x added to knapsack

40 B 10 1
40 -10 =30 D 12 1
30-12=18 C 15 1
18-15=3 A 20 3/20 (the full object cannot be added)

The solution vector = (x1,x2, x3,x4) = (3/20,1,1,1)

Profit earned = x1p1 + x2p2 + x3p3+x4p4


= 3/20*2+1*8+1*4+1*6
=0.3+8+4+6=18.3


Strategy 3: Arrange the objects in the decreasing order of pi / wi


Calculate pi/wi
Object B: 8/10=0.8
Object D: 6/12=0.5
Object C: 4/15=0.26
Object A: 2/20=0.1

RC Objects selected weight Fraction of x added to knapsack

40 B 10 1
40 -10 =30 D 12 1
30-12=18 C 15 1
18-15=3 A 20 3/20 (the full object cannot be added)

The solution vector = (x1,x2, x3,x4) = (3/20,1,1,1)

Profit earned = x1p1 + x2p2 + x3p3+x4p4


= 3/20*2+1*8+1*4+1*6
=0.3+8+4+6=18.3
The Feasible solutions are 18.3,18.3,18.3
The Optimal solution is 18.3

3.4.2 0/1 or Discrete Knapsack Problem


This version allows an object to be placed into the knapsack only in full; it does not allow
fractions. Therefore the solution vector contains only 0s and 1s.
Example1:
For the knapsack problem n=3,m=20 (p1,p2,p3) = (25,24,15) and (w1,w2,w3) =
(18,15,10).Find the feasible solutions and optimal solution using greedy method.
Solution:
Strategy 1: Arrange the objects in the increasing order of weights.
Weights 10 15 18
Profits 15 24 25
Objects 3 2 1

RC Objects selected Weight Fraction of x added to knapsack

20 3 10 1
20 -10 =10 2 15 Cannot add as the knapsack
doesn’t have space for the full
object
10 1 18 Cannot add as the knapsack
doesn’t have space for the full
object


The solution vector = (x1, x2, x3) = (0, 0, 1)

Profit earned = x1p1 + x2p2 + x3p3


= 0*25 + 0*24 + 1*15
= 15
Strategy 2: Arrange the objects in decreasing order of profits

Profits 25 24 15
Weights 18 15 10
Objects 1 2 3

RC Objects selected Weight Fraction of x added to knapsack

20 1 18 1
20 -18 =2 2 15 Cannot add as the knapsack
doesn’t have space for the full
object
2 3 10 Cannot add as the knapsack
doesn’t have space for the full
object

The solution vector = (x1, x2, x3) = (1, 0, 0)

Profit earned = x1p1 + x2p2 + x3p3


= 1*25 + 0*24 + 0*15
= 25.

Strategy 3: Arrange the objects in the decreasing order of pi / wi


Object 2: 24/15=1.6
Object 3: 15/10=1.5
Object 1: 25/18=1.38

RC Objects selected Weight Fraction of x added to knapsack

20 2 15 1
20 -15 =5 3 10 Cannot add as the knapsack
doesn’t have space for the full
object
5 1 18 Cannot add as the knapsack
doesn’t have space for the full
object

The solution vector = (x1, x2, x3) = (0, 1, 0)


Profit earned = x1p1 + x2p2 + x3p3
= 0*25 + 1*24 + 0*15
= 24


Strategy 4: Arrange the objects in the increasing order of pi / wi.


Object 1: 25/18=1.38
Object 3: 15/10=1.5
Object 2: 24/15=1.6

RC Objects selected Weight Fraction of x added to knapsack

20 1 18 1
20 -18 =2 3 10 Cannot add as the knapsack
doesn’t have space for the full
object
2 2 15 Cannot add as the knapsack
doesn’t have space for the full
object

The solution vector = (x1, x2, x3) = (1, 0, 0)

Profit earned = x1p1 + x2p2 + x3p3


=1*25 + 0*15 + 0*24
= 25

The Feasible solutions are 15, 25, 24, 25

Therefore the Optimal solution is 25

Example 2:
Solve the greedy knapsack problem using the 0/1 (discrete) approach to earn the
maximum profit.
N=4, Knapsack capacity M=40
Object A B C D
Weight 20 10 15 12

Profit 2 8 4 6

Strategy 1: Arrange the objects in the increasing order of weights.

Weights 10 12 15 20
Profits 8 6 4 2
Objects B D C A


RC Objects selected weight Fraction of x added to knapsack

40 B 10 1
40 -10 =30 D 12 1
30-12=18 C 15 1
18-15=3 A 20 0 (the full object cannot be added)

The solution vector = (x1,x2, x3,x4) = (0,1,1,1)

Profit earned = x1p1 + x2p2 + x3p3+x4p4


= 0*2+1*8+1*4+1*6
=0+8+4+6=18

Strategy 2: Arrange the objects in decreasing order of profits


Weights 10 12 15 20
Profits 8 6 4 2
Objects B D C A

RC Objects selected weight Fraction of x added to knapsack

40 B 10 1
40 -10 =30 D 12 1
30-12=18 C 15 1
18-15=3 A 20 0 (the full object cannot be added)

The solution vector = (x1,x2, x3,x4) = (0,1,1,1)

Profit earned = x1p1 + x2p2 + x3p3+x4p4


= 0*2+1*8+1*4+1*6
=0+8+4+6=18
Strategy 3: Arrange the objects in the decreasing order of pi / wi
Calculate pi/wi
Object B: 8/10=0.8
Object D: 6/12=0.5
Object C: 4/15=0.26
Object A: 2/20=0.1


RC Objects selected weight Fraction of x added to knapsack

40 B 10 1
40 -10 =30 D 12 1
30-12=18 C 15 1
18-15=3 A 20 0 (the full object cannot be added)

The solution vector = (x1,x2, x3,x4) = (0,1,1,1)

Profit earned = x1p1 + x2p2 + x3p3+x4p4


= 0*2+1*8+1*4+1*6
=0+8+4+6=18
The Feasible solutions are 18,18,18
The Optimal solution is 18

Algorithm: GreedyKnapsack01(M, n, w, P, x)
Assume the objects are arranged in decreasing order of pi / wi. Let M be the
knapsack capacity, n the number of objects, w[1 : n] the array of weights, P[1 : n]
the array of profits, and x a solution vector.

Step 1: Remaining capacity RC = M

Step 2: Repeat for i = 1 to n do
            x[i] = 0
        end for
Step 3: Repeat for i = 1 to n do
            if (w[i] <= RC) then
                x[i] = 1
                RC = RC - w[i]
Step 4:     else
                x[i] = 0    // object skipped; consider the next object
            end if
        End for
Step 5: For i = 1 to n do
            sum = sum + P[i] * x[i]
        End for
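A Python sketch of the same greedy 0/1 procedure is given below (illustrative only; the names are not from the text). Like the tables in the examples above, it keeps scanning the remaining objects rather than stopping at the first one that does not fit. Note that this greedy rule is not guaranteed to find the optimal 0/1 solution: in Example 1 above the ratio ordering earns 24 while ordering by profit earns 25.

def greedy_knapsack_01(M, weights, profits):
    # Greedy 0/1 knapsack by decreasing profit/weight ratio.
    n = len(weights)
    order = sorted(range(n), key=lambda i: profits[i] / weights[i], reverse=True)
    x = [0] * n
    rc = M                              # remaining capacity
    for i in order:
        if weights[i] <= rc:            # include the whole object or skip it
            x[i] = 1
            rc -= weights[i]
    profit = sum(p * xi for p, xi in zip(profits, x))
    return x, profit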

3.5 Minimum Spanning Trees (MST)


A spanning tree of a connected undirected graph G = (V, E) is a subgraph of G that is an
undirected tree and contains all the vertices of G. A spanning tree therefore contains all
the vertices and a subset of the edges. In a weighted graph G = (V, E) the weight W of a
subgraph is the sum of the weights of the edges in the subgraph. A minimum spanning tree of a
weighted graph is a spanning tree with minimum weight.
In the minimum spanning tree problem we are given a connected undirected
graph G = (V, E) and a length function W such that W(e) is the length of edge
e, and we are asked to find the subset of edges that connects all vertices together and
has minimum total length. The two popular algorithms to find the minimum
cost spanning tree are:
 Kruskal’s Algorithm
 Prim’s Algorithm

3.5.1 Kruskal’s Algorithm


Let G = (V, E) be a connected undirected graph. This algorithm selects n-1 edges one at a time.
The algorithm selects the least cost edge (u, v) from E and deletes the edge (u, v) from E.
Then it checks whether the selected edge (u, v) results in a cycle when added to the list
of already selected edges. If no cycle is formed, the selected edge is
added to the list of edges selected to form the MST. A selected edge that results in a
cycle is discarded. The procedure is repeated until n-1 edges are selected or there
are no more edges to consider. If n-1 edges are selected, the selected edges form the
MST; otherwise a spanning tree does not exist.

Algorithm to find then minimum spanning tree using Kruskal’s Algorithm


Early form of Kruskal’s algorithm
Algorithm: Kruskal(E, n)
// Let E be the list of edges
// n is the number of vertices in the given graph
Sort E in increasing order of edge weight
Initially T = 0
while (T does not contain n - 1 edges) do
    find the minimum cost edge not yet considered in E and call it <u, v>
    if (<u, v> does not form a cycle) then
        T = T + <u, v>
    else
        delete <u, v>
end while
return (T)

Algorithm: Kruskal(n, m, E) - implementation procedure

//Input: n-number of vertices in the graph


// m-number of edges in the graph
// E-an edge list consisting of set of edges along with their corresponding
weights
//Output: The MST along with the cost if exists; otherwise minimum spanning
tree does not exist.


1. [Initialization]
    count = 0     // initial number of edges in the MST
    k = 0         // index of the next selected edge of the MST
    sum = 0       // initial cost of the MST

    // create a forest with n vertices
    for i = 0 to n do
        parent[i] = i
    endfor
2. Find the spanning tree if exists, along with the cost

While (count ≠ n-1) and (E ≠ 0) do
    Select the edge (u, v) with the least cost and remove it from E
    i ← find(u, parent)      // Find the roots of the trees containing u and v
    j ← find(v, parent)

    if (i != j) then         // if the roots are equal, the edge would form a cycle
        t[k][0] = u
        t[k][1] = v
        k++
        count++
        sum ← sum + cost(u, v)
        union(i, j, parent)
    endif
endwhile
3. [Output]
    if (count = n-1) then
        write("spanning tree exists")
        for i = 0 to n-2 do
            write(t[i][0], t[i][1])
        endfor
        write("cost of spanning tree is ", sum)
    else
        write("spanning tree does not exist")
    endif

4.[Finished]
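An illustrative Python sketch of the same procedure is given below (the names are not from the text). It uses a simple union-find structure for the find and union operations referred to above.

def kruskal(n, edges):
    # edges: list of (weight, u, v) tuples with vertices numbered 0..n-1.
    # Returns (list of MST edges, total cost), or None if no spanning tree exists.
    parent = list(range(n))                 # each vertex starts as its own tree

    def find(i):                            # root of the tree containing i
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    mst, total = [], 0
    for w, u, v in sorted(edges):           # consider edges in increasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                        # different trees, so no cycle is formed
            parent[ru] = rv                 # union the two trees
            mst.append((u, v, w))
            total += w
            if len(mst) == n - 1:
                break
    return (mst, total) if len(mst) == n - 1 else None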


Example: Find the minimum spanning tree using Kruskal’s Algorithm

[Figure: weighted undirected graph with vertices 1-7 and edges 1-2 (28), 1-6 (10),
2-7 (14), 2-3 (16), 3-4 (12), 4-7 (18), 4-5 (22), 5-7 (24), 5-6 (25)]

Solution: The minimum spanning tree can be obtained as shown below:

Arrange the edges in the increasing order of their weights


Edge Weight Edge Weight
1-6 10 4-5 22
3-4 12 5-7 24
2-7 14 5-6 25
2-3 16 1-2 28
4-7 18

Step 1: The cost of the edge (6, 1) is least with weight 10 and hence it is selected.

[Figure: partial tree with edge 1-6 (10)]
Step 2: The next least cost edge (3, 4) with weight 12 is selected.

[Figure: partial tree with edges 1-6 (10) and 3-4 (12)]
Step 3: The next least cost edge (2, 7) with weight 14 is selected.

[Figure: partial tree with edges 1-6 (10), 2-7 (14) and 3-4 (12)]

Step 4: Edge (2, 3) with weight 16 is the next least cost edge and is selected.

[Figure: partial tree with edges 1-6 (10), 2-7 (14), 2-3 (16) and 3-4 (12)]
Step 5: Edge (7, 4) with weight 18 is the next least cost edge, but including it would result in a
cycle. Hence it is rejected.
Step 6: Edge (4, 5) with weight 22 is the next least cost edge and is selected.

[Figure: partial tree with edges 1-6 (10), 2-7 (14), 2-3 (16), 3-4 (12) and 4-5 (22)]

Step 7: The next least cost edge is (5, 7) with weight 24, but it cannot be
selected because it would result in a cycle.


Step 8: Edge (5, 6) with weight 25 is the next least cost edge and is selected. All n-1 = 6 edges
have now been chosen.

[Figure: minimum spanning tree with edges 1-6 (10), 2-7 (14), 2-3 (16), 3-4 (12), 4-5 (22)
and 5-6 (25)]
Result: The cost of minimum spanning tree is given by the sum of weights of the
selected edges.
i.e., 10 + 12 + 14 + 16 + 22 + 25 = 99
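As a quick check, feeding this graph's edge list into the kruskal sketch given earlier in this section (with vertices renumbered 0-6) reproduces the cost of 99.

edges = [(28, 0, 1), (10, 0, 5), (14, 1, 6), (16, 1, 2), (12, 2, 3),
         (18, 3, 6), (22, 3, 4), (24, 4, 6), (25, 4, 5)]
mst, total = kruskal(7, edges)
print(total)    # 99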

Example 2:
Find the minimum spanning tree for the below graph using Kruskal’s Algorithm.

Arrange the edges in the increasing order of their weights


Edge Weight
b-f 1
c-e 1
c-f 2
c-d 3
b-c 5
d-f 7
b-e 9
d-e 14

Step 1: Consider the edge b-f


Step 2: Include the edge c-e

Step 3: Include the edge c-f

Step 4 : Include the edge c-d


Step 5: Including the edge b-c would result in a cycle, so it is not included.

Step 6: Inclusion of d-f also results in a cycle, so it is rejected.

Step 7: Considering the edge b-e is also not possible because of cycle formation.


Step 8: We cannot include the edge d-e either, because of cycle formation.

Therefore the minimum spanning tree is shown below and the minimum cost is 7

3.5.2 Prim’s Algorithm


Prim’s algorithm finds the minimum spanning tree of a given graph through
a sequence of expanding subtrees. The initial subtree consists of a single,
arbitrarily selected vertex from the set V of graph vertices. On each iteration the
current tree is expanded by attaching to it the nearest vertex not yet in the tree (the
vertex joined by the minimum cost edge). The algorithm stops after all the vertices have
been included in the spanning tree being constructed.

Algorithm :Prim’s(n, c)
//Assume G is connected, undirected and weighted graph.
//Input : The cost adjacency matrix C and number of vertices n.
//Output: Minimum weight spanning tree.
for i=1 to n do
visited [ i ]=0
u=1
visited [ u ] = 1
while( there is still an unchosen vertex ) do
Let <u, v> be the lightest edge between any chosen vertex u and unchosen vertex
v


visited [ v ] =1
T=union (T, <u,v>)
end while
return T

Algorithm Prim’s(n, cost) - implementation procedure

//Input: n - number of vertices in the graph
//       cost - cost adjacency matrix with values ≥ 0

//Output: d[i] - weight of the cheapest edge connecting vertex i to the tree built so far
//        p[i] - the tree vertex to which vertex i is attached in the spanning tree
//        s - marks the nodes that have been visited so far and the nodes that have not

Obtain a source vertex which has the least cost edge going out of it and call it
source
1.[Initialization]
for i←0 to n-1 do
s[i] = 0
p[i]=source
d[i]=cost[source,i]
endfor
2.[Add source to s]
s[source]←1
3. [Find the nearest vertex and the edge that attaches it to the tree]
    for i ← 1 to n-1 do
        find u and d[u] such that d[u] is minimum and u Є V-S
        add u to s

        for every v Є V-S do
            if (cost[u,v] < d[v]) then
                d[v] = cost[u,v]
                p[v] = u
            endif
        endfor
    endfor
4.[Finished]
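An illustrative Python sketch of this procedure is shown below (the names are not from the text). It works on a cost adjacency matrix in which None marks a missing edge, and it assumes the graph is connected.

import math

def prim(cost):
    # cost: n x n adjacency matrix, cost[u][v] = edge weight or None.
    # Returns (list of MST edges, total weight), assuming a connected graph.
    n = len(cost)
    in_tree = [False] * n
    d = [math.inf] * n                  # cheapest known edge weight into the tree
    p = [None] * n                      # tree vertex that cheapest edge comes from
    d[0] = 0                            # start from vertex 0 (arbitrary choice)
    mst, total = [], 0
    for _ in range(n):
        u = min((v for v in range(n) if not in_tree[v]), key=lambda v: d[v])
        in_tree[u] = True
        if p[u] is not None:            # record the edge that attached u to the tree
            mst.append((p[u], u, d[u]))
            total += d[u]
        for v in range(n):              # update distances of vertices outside the tree
            w = cost[u][v]
            if w is not None and not in_tree[v] and w < d[v]:
                d[v] = w
                p[v] = u
    return mst, total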


Example: Find the minimum spanning tree using Prim’s Algorithm

[Figure: the same weighted graph as in the Kruskal's example above, with vertices 1-7 and
edges 1-2 (28), 1-6 (10), 2-7 (14), 2-3 (16), 3-4 (12), 4-7 (18), 4-5 (22), 5-7 (24), 5-6 (25)]

Solution: The minimum spanning tree can be obtained as shown below:


Construct the adjacency cost matrix
C 1 2 3 4 5 6 7
1 - 28 - - - 10 -
2 28 - 16 - - - 14
3 - 16 - 12 - - -
4 - - 12 - 22 - 18
5 - - - 22 - 25 24
6 10 - - - 25 - -
7 - 14 - 18 24 - -

Step 1:
Initially S = 0 and G = [1, 2, 3, 4, 5, 6, 7].
Now assume the starting node is 1. S = [1] and G = [2, 3, 4, 5, 6, 7]. Looking into the cost
adjacency matrix, the minimum cost edge is determined (the vertex from 1 to which the
cost is minimum):
Min{<1,2>, <1,3>, <1,4>, <1,5>, <1,6>, <1,7>}
There is an edge from vertex 1 to vertices 2 and 6.
= min{28, 10} = 10
So draw an edge from vertex 1 to vertex 6.
[Figure: partial tree with edge 1-6 (10)]
Step 2: Now S=[1,6] and G=[ 2, 3, 4, 5, 7]
Find Min{<1,2>, <1,3>, <1,4>, <1,5>, <1,7>,<6,2>, <6,3>,<6,4>,<6,5>, <6,7>}
There is an edge from vertex 1 to vertex 2 and an edge from vertex 6 to 5.
=min{28,25}=25


So draw an edge from vertex 6 to vertex 5.

[Figure: partial tree with edges 1-6 (10) and 6-5 (25)]

Step 3: Now S=[1,6,5] and G=[ 2, 3, 4, 7]


Find Min{<1,2>, <1,3>, <1,4>, <1,7>,<6,2>, <6,3>,<6,4>,
<6,7>,<5,2>,<5,3>,<5,4>,<5,7>}
There is an edge from vertex 1 to vertex 2 and an edge from vertex 5 to 4 and 7.
=min{28,22,24}=22
So draw an edge from vertex 5 to vertex 4.
[Figure: partial tree with edges 1-6 (10), 6-5 (25) and 5-4 (22)]

Step 4: Now S=[1,6,5,4] and G=[ 2, 3, 7]
Find Min{<1,2>, <1,3>, <1,7>,<6,2>, <6,3>,
<6,7>,<5,2>,<5,3>,<5,7>,<4,2>,<4,3>,<4,7>}
There is an edge from vertex 1 to vertex 2, an edge from vertex 5 to vertex 7, and edges from
vertex 4 to vertices 3 and 7.
= min{28, 24, 12, 18} = 12


So draw an edge from vertex 4 to vertex 3.

[Figure: partial tree with edges 1-6 (10), 6-5 (25), 5-4 (22) and 4-3 (12)]

Step 5: Now S=[1,6,5,4,3] and G=[ 2, 7]


Find Min{<1,2>, <1,7>,<6,2>, <6,7>,<5,2>,<5,7>,<4,2>,<4,7>,<3,2>, <3,7>}
There is an edge from vertex 1 to vertex 2, an edge from vertex 5 to 7 and an edge
from vertex 4 to 7 and an edge from vertex 3 to 2
=min{28,24,18,16}=16
So draw an edge from vertex 3 to vertex 2.

[Figure: partial tree with edges 1-6 (10), 6-5 (25), 5-4 (22), 4-3 (12) and 3-2 (16)]

Step 6: Now S=[1,6,5,4,3,2] and G=[ 7 ]


Find Min{ <1,7>, <6,7>,<5,7>,<4,7>, <3,7>,<2,7>}
There is an edge from vertex 5 to vertex 7, an edge from vertex 4 to 7 and an edge
from vertex 2 to 7.


=min{24,18,14}=14
So draw an edge from vertex 2 to vertex 7.

[Figure: minimum spanning tree with edges 1-6 (10), 6-5 (25), 5-4 (22), 4-3 (12), 3-2 (16)
and 2-7 (14)]

Result: The cost of minimum spanning tree is given by the sum of weights of the
selected edges.
i.e., 10 + 25 + 22 + 12 + 16 + 14 = 99
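Running the prim sketch given earlier in this section on the same cost matrix (vertices renumbered 0-6, None for missing edges) again gives a total of 99.

N = None
C = [[N, 28, N, N, N, 10, N],
     [28, N, 16, N, N, N, 14],
     [N, 16, N, 12, N, N, N],
     [N, N, 12, N, 22, N, 18],
     [N, N, N, 22, N, 25, 24],
     [10, N, N, N, 25, N, N],
     [N, 14, N, 18, 24, N, N]]
mst, total = prim(C)
print(total)    # 99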

Example 2:
Find the minimum spanning tree for the below graph using Prim’s Algorithm.

Solution: The minimum spanning tree can be obtained as shown below:


Construct the adjacency cost matrix

C a b c d e f
b - - 5 - 9 1
c - 5 - 3 1 2
d - - 3 - 14 7
e - 9 1 14 - -
f - 1 2 7 - -


Step 1:
Initially S = 0 and G = [b, c, d, e, f].
Now assume the starting node is b. S = [b] and G = [c, d, e, f]. Looking into the cost
adjacency matrix, the minimum cost edge is determined (the vertex from b to which the
cost is minimum):
Min{<b,c>, <b,d>, <b,e>, <b,f>} = Min{5, -, 9, 1} = 1
The edge from vertex b to vertex f has the minimum cost of 1.
So draw an edge from vertex b to vertex f.

Step 2: Now S = [b, f] and G = [c, d, e]

Find Min{<b,c>, <b,d>, <b,e>, <f,c>, <f,d>, <f,e>} = Min{5, -, 9, 2, -, -} = 2
The edge from vertex f to vertex c is chosen, as it gives the minimum cost of 2.
So draw an edge from vertex f to vertex c.

Step 3: Now S = [b, f, c] and G = [d, e]

Find Min{<b,d>, <b,e>, <f,d>, <f,e>, <c,d>, <c,e>} = Min{-, 9, 7, -, 3, 1} = 1
The edge from vertex c to vertex e is chosen, as it gives the minimum cost of 1.
So draw an edge from vertex c to vertex e.


Step 4: Now S = [b, f, c, e] and G = [d]

Find Min{<b,d>, <f,d>, <c,d>, <e,d>} = Min{-, 7, 3, 14} = 3
The edge from vertex c to vertex d is chosen, as it gives the minimum cost of 3.
So draw an edge from vertex c to vertex d.

Result: The cost of the minimum spanning tree is given by the sum of the weights of the
selected edges, i.e., 1 + 1 + 2 + 3 = 7.

**********************
