Greedy Method
General characteristics:
1. A problem has n inputs.
2. From the input set, find a subset that satisfies some constraints; such a subset is called a feasible solution.
3. A feasible solution that maximizes or minimizes a given objective function is called an optimal solution.
4. A greedy algorithm works in stages, considering one input at a time.
5. At each stage, a decision is made about whether the input under consideration can extend the partial solution toward an optimal one. If so, the input is included; otherwise, it is rejected.
6. Inputs are considered in an order given by a selection procedure, which is based on some optimization measure. This measure may be the objective function itself.
7. If including the next input in the partially constructed solution would make it infeasible, the input is not added to the partial solution; otherwise, it is added.
Examples:
Knapsack problem
MST
Optimal merge patterns
Single source shortest path
Tree vertex splitting
Job sequencing with dead lines
Optimal storage on tapes
General algorithm (control abstraction):

Algorithm Greedy(a, n)
{
    solution := ø;
    for i := 1 to n do
    {
        x := Select(a);
        if Feasible(solution, x) then
            solution := Union(solution, x);
    }
    return solution;
}
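The control abstraction above can be sketched in Python. The `select` and `feasible` arguments are problem-specific callables supplied by the caller; the names and the toy usage below are illustrative, not part of the notes:

```python
def greedy(candidates, select, feasible):
    """Generic greedy control abstraction: repeatedly pick the
    locally best remaining candidate and keep it only if the
    partial solution stays feasible."""
    solution = []
    remaining = list(candidates)
    while remaining:
        x = select(remaining)       # pick the locally best candidate
        remaining.remove(x)
        if feasible(solution, x):   # keep it only if still feasible
            solution.append(x)
    return solution

# Toy instance: greedily pick the largest numbers whose sum stays <= 10
picked = greedy([7, 5, 3, 2], max, lambda s, x: sum(s) + x <= 10)
```

Each concrete greedy algorithm in these notes (knapsack, Prim, Kruskal, Dijkstra, optimal merge) instantiates `select` and `feasible` differently.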
Knapsack problem
Statement:
Assume there are n objects and a knapsack (bag). Object i has weight Wi. Let m be the capacity of the knapsack, and let Pi be the profit associated with object i. Let Xi be the fraction of object i placed in the knapsack; if a fraction Xi of object i is placed in the bag, a profit of PiXi is earned. The objective is to fill the knapsack so as to maximize the total profit.
Maximize    Σ (i = 1 to n) Pi Xi ------------------- (1)
Subject to  Σ (i = 1 to n) Wi Xi <= m -------------- (2)
            0 <= Xi <= 1
where the profits and weights are positive numbers.
A feasible solution is any vector (X1, ..., Xn) satisfying constraint (2).
An optimal solution is a feasible solution for which (1) is maximized.
Knapsack algorithm

// m = capacity of the bag, n = number of objects
// W[1..n] = weights of the objects, P[1..n] = profits of the objects
// Selection procedure: objects are ordered such that
// P[i]/W[i] >= P[i+1]/W[i+1]

Algorithm GreedyKnapsack(m, n)
{
    for i := 1 to n do
        X[i] := 0.0;
    U := m;
    for i := 1 to n do
    {
        if (W[i] > U) then break;
        X[i] := 1.0;
        U := U - W[i];
    }
    if (i <= n) then X[i] := U / W[i];
}
Knapsack problem example:
n = 7, m = 15
(P1, ..., P7) = (12, 5, 15, 7, 6, 18, 3)
(W1, ..., W7) = (2, 3, 5, 7, 1, 4, 1)
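A runnable sketch of the greedy fractional knapsack applied to this instance (the function name and list-based layout are mine; the data is from the notes):

```python
def greedy_knapsack(m, profits, weights):
    """Fractional knapsack: consider objects in decreasing
    profit/weight order, taking each whole object while it fits
    and a fraction of the first one that does not."""
    order = sorted(range(len(profits)),
                   key=lambda i: profits[i] / weights[i], reverse=True)
    x = [0.0] * len(profits)        # fraction of each object taken
    u = m                           # remaining capacity
    for i in order:
        if weights[i] > u:
            x[i] = u / weights[i]   # split the last object that fits partially
            break
        x[i] = 1.0
        u -= weights[i]
    return x, sum(p * xi for p, xi in zip(profits, x))

# Instance from the notes: n = 7, m = 15
p = [12, 5, 15, 7, 6, 18, 3]
w = [2, 3, 5, 7, 1, 4, 1]
x, profit = greedy_knapsack(15, p, w)
```

On this instance the knapsack is filled exactly to capacity 15, with object 2 taken fractionally.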
Minimum spanning tree

Given a connected, undirected, weighted graph G, a spanning tree G′ of G is a subgraph which is a tree and connects all the vertices together: G′ ⊆ G with V(G′) = V(G). If weights are assigned, a minimum spanning tree (MST) is a spanning tree with the lowest total cost. A graph can have many different spanning trees.

Applications of MST:
- Electric networks
- Link/communication problems, e.g. connecting all cities at minimum cost
PRIM’s Algorithm
Algorithm:
Procedure Prim(G: weighted connected graph with n vertices)
{
    T := a minimum-weight edge of G;
    for i := 1 to n - 2 do
    {
        e := an edge of minimum weight incident to a vertex in T
             and not forming a cycle in T if added to T;
        T := T with e added;
    }
    return (T);
}
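A minimal Python sketch of Prim’s algorithm, using a heap to pick the minimum-weight edge into the growing tree (the edge-list input format and function name are my assumptions):

```python
import heapq

def prim(n, edges):
    """Prim's MST on a connected undirected weighted graph.
    edges is a list of (u, v, weight); vertices are 0..n-1.
    Returns the total cost of the minimum spanning tree."""
    adj = [[] for _ in range(n)]
    for u, v, w in edges:
        adj[u].append((w, v))
        adj[v].append((w, u))
    in_tree = [False] * n
    heap = [(0, 0)]             # grow the tree from vertex 0
    total = 0
    while heap:
        w, u = heapq.heappop(heap)
        if in_tree[u]:
            continue            # would form a cycle, skip
        in_tree[u] = True
        total += w
        for wv, v in adj[u]:
            if not in_tree[v]:
                heapq.heappush(heap, (wv, v))
    return total
```

The heap plays the role of “an edge of minimum weight incident to a vertex in T” in the pseudocode above.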
Kruskal’s Algorithm
Algorithm Kruskal(G: weighted connected graph with n vertices)
{
    T := ø;
    while (T has less than n - 1 edges) and (E ≠ ø) do
    {
        choose an edge (v, w) from E of lowest cost;
        delete (v, w) from E;
        if (v, w) does not create a cycle in T then
            add (v, w) to T;
        else
            discard (v, w);
    }
    return T;
}
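A Python sketch of Kruskal’s algorithm; the cycle test is done with a simple union-find structure (the input format and helper names are my assumptions):

```python
def kruskal(n, edges):
    """Kruskal's MST: examine edges in increasing cost order and
    keep an edge iff it joins two different components (no cycle).
    edges is a list of (u, v, weight); vertices are 0..n-1."""
    parent = list(range(n))

    def find(x):                # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst = []
    for u, v, w in sorted(edges, key=lambda e: e[2]):
        ru, rv = find(u), find(v)
        if ru != rv:            # different components: no cycle, accept
            parent[ru] = rv
            mst.append((u, v, w))
        if len(mst) == n - 1:   # tree is complete
            break
    return mst
```

Union-find replaces the pseudocode’s “does not create a cycle in T” check: an edge whose endpoints are already in the same component would close a cycle.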
Graph Algorithms
• Single source shortest path
  – Dijkstra’s algorithm
    • Graphs with non-negative edge weights
    • Greedy approach
  – Bellman-Ford algorithm
    • Graphs with both positive and negative edge weights
    • Dynamic programming approach
Dijkstra’s Algorithm
Algorithm ShortestPaths(v, cost, dist, n)
// v = source vertex, cost = cost adjacency matrix,
// n = number of vertices, dist = output parameter
{
    for i := 1 to n do
    {
        S[i] := false;
        dist[i] := cost[v, i];
    }
    S[v] := true;
    dist[v] := 0.0;
    for num := 2 to n - 1 do
    {
        choose u from among the vertices not in S such that dist[u] is minimum;
        S[u] := true;
        for (each w adjacent to u with S[w] = false) do
            if (dist[w] > dist[u] + cost[u, w]) then
                dist[w] := dist[u] + cost[u, w];
    }
}
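A heap-based Python sketch of Dijkstra’s algorithm; it uses adjacency lists instead of the cost matrix above, and the function name and input format are my assumptions:

```python
import heapq

def dijkstra(n, adj, src):
    """Single-source shortest paths on a graph with non-negative
    edge costs. adj[u] is a list of (v, cost) pairs; returns the
    array of shortest distances from src."""
    dist = [float('inf')] * n
    dist[src] = 0
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue                    # stale heap entry, skip
        for v, c in adj[u]:
            if d + c < dist[v]:         # relax edge (u, v)
                dist[v] = d + c
                heapq.heappush(heap, (dist[v], v))
    return dist
```

Popping the smallest tentative distance from the heap corresponds to the pseudocode’s “choose u not in S such that dist[u] is minimum”.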
Dijkstra’s Algorithm: worked example (figure: graph with source vertex 5)
Drawbacks of Dijkstra’s algorithm:
Dijkstra’s algorithm assumes non-negative edge weights; on graphs with negative edges it can report incorrect distances. The Bellman-Ford algorithm handles graphs with negative edge weights.
Optimal merge pattern
(Figure: merging the sorted files x1, x2, x3, x4)
• Different pairings require different computing times
• The optimal pattern uses the fewest record moves to merge the files
• Merge patterns can be represented as binary trees, called two-way merge patterns
• Greedy approach:
  – Merge the two smallest files together
Example: x1, x2, x3 = {30, 20, 10}
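The greedy merge pattern can be sketched with a min-heap (the function name is mine; merging two files of sizes a and b is counted as costing a + b record moves):

```python
import heapq

def optimal_merge_cost(sizes):
    """Two-way optimal merge pattern: repeatedly merge the two
    smallest files and return the total merge cost."""
    heap = list(sizes)
    heapq.heapify(heap)
    total = 0
    while len(heap) > 1:
        a = heapq.heappop(heap)     # smallest file
        b = heapq.heappop(heap)     # second smallest file
        total += a + b              # merging them costs a + b moves
        heapq.heappush(heap, a + b) # the merged file goes back in
    return total
```

On the example {30, 20, 10}: merging 10 and 20 costs 30, then merging that 30 with the remaining 30 costs 60, for a total cost of 90.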