
Greedy Method

Dr. Md. Ali Hossain


Professor
Department of CSE
Email: [email protected]
Greedy Method
■ The greedy method is a design technique for problems that have n inputs and require us to obtain a subset satisfying some constraints.

■ Any subset that satisfies these constraints is called a feasible solution.

■ The goal is to find a feasible solution that minimizes or maximizes a given objective function; such a solution is called an optimal solution.
Greedy Algorithm
// a[1:n] contains the n inputs
Algorithm Greedy(a, n)
{
    solution = Ø;                      // initialize the solution
    for i = 1 to n do
    {
        x = Select(a);                 // choose the next candidate input
        if Feasible(solution, x) then  // do the constraints still hold?
            solution = Union(solution, x);
    }
    return solution;
}
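The pseudocode above is only a template: Select, Feasible, and Union are placeholders that each concrete problem fills in. Below is a minimal Python sketch of the same skeleton, with the problem-specific pieces passed in as functions; the names greedy, select, feasible, and combine are illustrative and not from the slides.

def greedy(inputs, select, feasible, combine):
    """Generic greedy skeleton mirroring Algorithm Greedy(a, n):
    repeatedly pick a candidate and keep it only if it stays feasible."""
    solution = []                       # the empty solution
    remaining = list(inputs)
    while remaining:
        x = select(remaining)           # greedy choice among the remaining inputs
        remaining.remove(x)
        if feasible(solution, x):       # do the constraints still hold?
            solution = combine(solution, x)
    return solution

# Toy use: keep the largest numbers whose total stays at most 10.
best = greedy([7, 5, 4, 2], select=max,
              feasible=lambda s, x: sum(s) + x <= 10,
              combine=lambda s, x: s + [x])
print(best)   # -> [7, 2]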
Knapsack

■ A 1998 study of the Stony Brook University Algorithm Repository showed that, out of 75 algorithmic problems, the knapsack problem was the 18th most popular and the 4th most needed, after kd-trees, suffix trees, and the bin packing problem.
Knapsack problem

■ Given a set of items, each with a weight and a value, determine the number of each item to include in a collection so that the total weight is less than or equal to a given limit and the total value is as large as possible.

■ The problem derives its name from the situation faced by someone who is constrained by a fixed-size knapsack and must fill it with the most valuable items.
Fill Your Shopping Bag: Optimize Money!!!

■ Which boxes should be chosen to maximize the amount of money while keeping the overall weight under or equal to 15 kg?

Answer: 3 yellow boxes and 3 grey boxes.
Knapsack Problem
■ Given n objects and a knapsack (bag) of capacity m.
■ Object i has a weight wi; if object i is placed into the knapsack, a profit pi is earned.
■ Goal: obtain the maximum total profit such that the total weight in the knapsack does not exceed its capacity.
■ Since the knapsack capacity is m, we require the total weight of all chosen objects to be at most m.
Knapsack Problem Contd..

■ ∑ wi xi > m : problem! The bag is already over capacity.

■ ∑ wi xi ≤ m : items are placed so that the capacity of the bag is not exceeded.
Knapsack Problem Contd..

■ The knapsack problem can be summarized as follows:
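The summary on this slide is an image and did not survive extraction. A standard statement of the problem, consistent with the notation above, is:

\[
\text{maximize } \sum_{i=1}^{n} p_i x_i
\quad\text{subject to}\quad \sum_{i=1}^{n} w_i x_i \le m ,
\]

with 0 ≤ xi ≤ 1 in the fractional version and xi ∈ {0, 1} in the 0-1 version.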
Types of Knapsack Problem

There are two versions of the problem:

▪ "0-1 knapsack problem"
  ▪ Items are indivisible; you either take an item or you do not. Some special instances can be solved with dynamic programming.

▪ "Fractional knapsack problem"
  ▪ Items are divisible; you can take any fraction of an item.
Example #1
Item #   Weight wi   Benefit bi
  1          2           3
  2          4           5
  3          5           8
  4          3           4
  5          9          10

S = { item 1, item 2, item 3, item 5 }
    (w1 = 2, w2 = 4, w3 = 5, w5 = 9;  b1 = 3, b2 = 5, b3 = 8, b5 = 10)

For S:
  Total weight: 20
  Maximum benefit: 26
Example #2
Max weight: W = 20

Item #   Weight wi   Benefit bi
  1          2           3
  2          4           5
  3          5           8
  4          3           4
  5          9          10

S4 = { item 1, item 2, item 3, item 4 }
     (w1 = 2, w2 = 4, w3 = 5, w4 = 3;  b1 = 3, b2 = 5, b3 = 8, b4 = 4)
For S4:
  Total weight: 14
  Maximum benefit: 20

S5 = { item 1, item 2, item 3, item 5 }
     (w1 = 2, w2 = 4, w3 = 5, w5 = 9;  b1 = 3, b2 = 5, b3 = 8, b5 = 10)
For S5:
  Total weight: 20
  Maximum benefit: 26

Note: the solution for S4 is not part of the solution for S5!
Analyze the Algorithm in Depth

Solution 1
Solution 2
Solution 3
Solution 4

Q. Find the optimal solution, i.e., maximize the profit within a capacity of 20.
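The profit/weight table for this instance appears only as an image on the slide and was not extracted. The solution vectors and values quoted on the following slides are consistent with (and in fact determine) the classic textbook instance, which is assumed wherever concrete numbers are needed below:

n = 3, capacity m = 20, profits (p1, p2, p3) = (25, 24, 15), weights (w1, w2, w3) = (18, 15, 10).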
Solution 1

■ Objective: maximize the total profit.

■ Strategy 1: take the objects with maximum profit first.

■ Solution vector: X = [1, 2/15, 0]
  Total weight used: 20    Total profit: 28.2
Solution 2
■ Analytical result: considering objects in decreasing order of profit does not yield the optimal solution.

■ Strategy 2: take the objects with minimum weight first.

■ Solution vector: X = [0, 2/3, 1]
  Total weight used: 20    Total profit: 31
Solution 3
■ Analytical result: considering objects in increasing order of weight also yields only a suboptimal solution.

■ Strategy 3: take the objects with maximum profit per unit of capacity used, i.e., order the objects by nonincreasing ratio pi/wi.

■ Solution vector: X = [0, 1, 1/2]
  Total weight used: 20    Total profit: 31.5
  Bravo: the optimal solution is reached.
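Using the instance assumed above ((p1, p2, p3) = (25, 24, 15), (w1, w2, w3) = (18, 15, 10), m = 20), the greedy choice can be checked directly: the ratios are p1/w1 = 25/18 ≈ 1.39, p2/w2 = 24/15 = 1.6, p3/w3 = 15/10 = 1.5, so object 2 is taken whole (weight 15), the remaining capacity of 5 is filled with half of object 3, and the total profit is 24 + (1/2)·15 = 31.5.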
Greedy Knapsack algorithm
Algorithm GreedyKnapsack(m, n)
// p[1:n] and w[1:n] contain the profits and weights respectively,
// in nonincreasing order of p[i]/w[i]. m = knapsack capacity.
{
    for i = 1 to n do x[i] = 0.0;      // initialize the solution vector
    U = m;                             // U = remaining capacity
    for i = 1 to n do
    {
        if (w[i] > U) then break;      // object i no longer fits entirely
        x[i] = 1.0; U = U - w[i];      // take all of object i
    }
    if (i <= n) then x[i] = U/w[i];    // fill the rest with a fraction of object i
}
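A runnable Python sketch of the same greedy procedure (illustrative; the function name and argument layout are not from the slides). Objects are sorted by profit/weight ratio, taken whole while they fit, and the first object that no longer fits is taken fractionally:

def greedy_knapsack(profits, weights, capacity):
    """Fractional knapsack: returns (solution vector x, total profit)."""
    n = len(profits)
    # order objects by nonincreasing profit-per-unit-weight
    order = sorted(range(n), key=lambda i: profits[i] / weights[i], reverse=True)
    x = [0.0] * n
    remaining = capacity
    total = 0.0
    for i in order:
        if weights[i] > remaining:
            # take only the fraction of object i that still fits, then stop
            x[i] = remaining / weights[i]
            total += profits[i] * x[i]
            break
        x[i] = 1.0                      # take object i completely
        remaining -= weights[i]
        total += profits[i]
    return x, total

# The instance assumed for Solutions 1-3: profits (25, 24, 15), weights (18, 15, 10), m = 20
print(greedy_knapsack([25, 24, 15], [18, 15, 10], 20))   # -> ([0.0, 1.0, 0.5], 31.5)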
Job Sequencing with Deadlines

Job Sequencing with Deadlines
■ Example:
  n = 4 jobs with profits (p1, p2, p3, p4) = (100, 10, 15, 27)
  and deadlines (d1, d2, d3, d4) = (2, 1, 2, 1)

  Feasible       Processing
  solution       sequence        Value
  1. (1, 2)      2, 1            110
  2. (1, 3)      1, 3 or 3, 1    115
  3. (1, 4)      4, 1            127
  4. (2, 3)      2, 3             25
  5. (3, 4)      4, 3             42
  6. (1)         1               100
  7. (2)         2                10
  8. (3)         3                15
  9. (4)         4                27
Job Sequencing with Deadlines Contd..
■ Greedy strategy: use the total profit as the optimization function and consider jobs in nonincreasing order of profit.
■ Applying it to the previous example:
  ■ Begin with J = Ø
  ■ Job 1 is considered and added to J → J = {1}
  ■ Job 4 is considered and added to J → J = {1, 4}
  ■ Job 3 is considered but discarded because it is not feasible → J = {1, 4}
  ■ Job 2 is considered but discarded because it is not feasible → J = {1, 4}
  ■ The final solution is J = {1, 4} with total profit 127, which is optimal.

■ How do we determine the feasibility of J?
  ■ Trying out all permutations leads to a computational explosion, since there are n! of them.
  ■ It suffices to check only one permutation, by Theorem 4.3.

■ Theorem 4.3: Let J be a set of k jobs and σ = i1, i2, …, ik a permutation of the jobs in J such that d_i1 ≤ d_i2 ≤ … ≤ d_ik. Then J is a feasible solution iff the jobs in J can be processed in the order σ without violating any deadline.
Job Sequencing with Deadlines Contd..
■ The greedy method described above always obtains an optimal solution to the job sequencing problem.
■ High-level description of the job sequencing algorithm, assuming the jobs are ordered such that p[1] ≥ p[2] ≥ … ≥ p[n]:

GreedyJob(int d[], set J, int n)
// d[1:n] holds the deadlines; J is the set of jobs that can be
// completed by their deadlines.
{
    J = {1};                           // the most profitable job is always taken
    for (int i = 2; i <= n; i++) {
        if (all jobs in J ∪ {i} can be completed by their deadlines)
            J = J ∪ {i};
    }
}
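A concrete Python sketch of the same greedy procedure (illustrative, not from the slides). Feasibility of J ∪ {i} is tested, in the spirit of Theorem 4.3, by sorting the tentative set by deadline and checking that each job meets its deadline:

def job_sequencing(profits, deadlines):
    """Greedy job sequencing with deadlines (unit processing time per job).
    Jobs are considered in nonincreasing order of profit; a job is kept only
    if the chosen set can still be scheduled without missing a deadline."""
    n = len(profits)
    order = sorted(range(n), key=lambda i: -profits[i])   # jobs by decreasing profit

    def feasible(jobs):
        # Theorem 4.3: schedule in nondecreasing deadline order and check slots.
        for slot, j in enumerate(sorted(jobs, key=lambda j: deadlines[j]), start=1):
            if slot > deadlines[j]:
                return False
        return True

    J = []
    for i in order:
        if feasible(J + [i]):
            J.append(i)
    return J, sum(profits[j] for j in J)

# Example from the slides (jobs numbered from 0 here):
# profits (100, 10, 15, 27), deadlines (2, 1, 2, 1)  ->  jobs {1, 4}, profit 127
print(job_sequencing([100, 10, 15, 27], [2, 1, 2, 1]))   # -> ([0, 3], 127)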
Spanning Trees

Minimum Spanning Trees

■ A connected graph with n vertices and n-1 edges is a tree.
■ Let G = (V, E) be an undirected connected graph.
■ A subgraph t = (V, E') of G is a spanning tree of G iff t is a tree.
■ Equivalently, a spanning tree is a minimal subgraph G' of G such that V(G') = V(G) and G' is connected.
■ Minimum spanning tree (MST): a spanning tree with the smallest total weight/cost.
An Example of MST

■ A graph and one of its minimum-cost spanning trees (figure).
Prim’s Algorithm

■ In Prim’s algorithm, grow the tree one edge at a time


• The edge is chosen based on some optimization
criterion
• Select an edge that increases the overall weight of the
tree by a minimum amount.
■ Set of edges A selected thus far forms a tree
■ The next edge (u, v) to be included in A is a
minimum cost edge not in A with the property that
AU(u, v) is also a tree

3 -25
Growing MST
■ Maintain a set A that is always a subset of some minimum spanning tree.
■ At each step, an edge (u, v) is determined such that A ∪ {(u, v)} is also a subset of a minimum spanning tree.
■ Such an edge (u, v) is called a safe edge.
■ Two popular algorithms implement this idea:
  • Prim's algorithm
  • Kruskal's algorithm
Prim’s Algorithm
■ Start with an arbitrarily selected vertex; one way to choose the starting point is to begin with the two vertices of the minimum-cost edge.
■ The set A of edges selected so far forms a tree.
■ The next edge (u, v) to be included in A is a minimum-cost edge not in A such that A ∪ {(u, v)} is also a tree.
■ With each vertex j not yet in the tree, associate a value near[j]: the vertex already in the tree for which cost(j, near[j]) is minimum over all choices of near[j].
Prim’s Algorithm Contd..

■ Set near[j] to 0 (a sentinel meaning "already in the tree") for every vertex j that has been added to the tree.
■ At each step, add the vertex closest to the current tree, then update near[] for the vertices still outside it.
■ The time complexity of Prim's algorithm is O(n²), where n is the number of vertices.
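A minimal O(n²) Python sketch of Prim's algorithm following the near[] formulation above (illustrative; the graph is assumed connected and given as an adjacency matrix of edge costs, with math.inf for missing edges):

import math

def prim(cost):
    """Prim's MST on an adjacency matrix `cost` (cost[i][j] = edge weight,
    math.inf if no edge). Returns the list of tree edges and the total cost."""
    n = len(cost)
    in_tree = [False] * n
    near = [0] * n                     # nearest tree vertex for each outside vertex
    in_tree[0] = True                  # start from vertex 0 (arbitrary choice)
    edges, total = [], 0

    for _ in range(n - 1):
        # pick the outside vertex j closest to the current tree
        j = min((v for v in range(n) if not in_tree[v]),
                key=lambda v: cost[v][near[v]])
        edges.append((near[j], j))
        total += cost[j][near[j]]
        in_tree[j] = True
        # update near[] for the vertices still outside the tree
        for v in range(n):
            if not in_tree[v] and cost[v][j] < cost[v][near[v]]:
                near[v] = j
    return edges, total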
Example #1 for Prim's Algorithm (figure)
Example #2 for Prim's Algorithm (figure)
Kruskal’s Algorithm (MST)
Step 1: Sort all edges into nondecreasing order of cost.
Step 2: Add the next smallest-cost edge to the forest if it does not create a cycle; otherwise discard it.
Step 3: Stop once n-1 edges have been accepted. Otherwise, go to Step 2.
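A short Python sketch of Kruskal's algorithm (illustrative, not from the slides), using a simple union-find structure for the cycle test; edges are given as (cost, u, v) tuples:

def kruskal(n, edges):
    """Kruskal's MST for a graph with vertices 0..n-1.
    `edges` is a list of (cost, u, v) tuples. Returns (tree_edges, total_cost)."""
    parent = list(range(n))            # union-find forest

    def find(x):                       # find the root of x's component
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    tree, total = [], 0
    for cost, u, v in sorted(edges):   # Step 1: edges in nondecreasing cost order
        ru, rv = find(u), find(v)
        if ru != rv:                   # Step 2: accept only if no cycle is formed
            parent[ru] = rv
            tree.append((u, v, cost))
            total += cost
        if len(tree) == n - 1:         # Step 3: stop after n-1 edges
            break
    return tree, total

# The slide's Example #1: edges (1,6)=10, (3,4)=12, (2,7)=14, (2,3)=16, (4,7)=18,
# (4,5)=22, (5,7)=24, (5,6)=25, (1,2)=28 give an MST of total cost 99.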
Example #1 of Kruskal's Algorithm

Edges in nondecreasing order of cost (X = rejected, would create a cycle):

  Edge      Cost
  (1, 6)     10
  (3, 4)     12
  (2, 7)     14
  (2, 3)     16
  (4, 7)     18   X
  (4, 5)     22
  (5, 7)     24   X
  (5, 6)     25
  (1, 2)     28   X

Total cost: 99
Example #2 for Kruskal's Algorithm (figure)
Time Complexity

■ Time complexity of Kruskal's algorithm: O(|E| log |E|), where |E| is the number of edges; the sorting step dominates.
Thanks for your attention
