Greedy Method
Knapsack Problem
Knapsack Problem Contd..
Types of Knapsack Problem
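Both variants can be stated formally. A sketch of the standard formulation, assuming n items with weights \(w_i\), benefits \(b_i\), and knapsack capacity \(W\) (the notation used in the examples below):

\[
\text{maximize } \sum_{i=1}^{n} b_i x_i \quad \text{subject to} \quad \sum_{i=1}^{n} w_i x_i \le W
\]

In the 0-1 knapsack problem each \(x_i \in \{0, 1\}\) (an item is taken whole or not at all); in the fractional knapsack problem \(0 \le x_i \le 1\) (part of an item may be taken). The examples below use the 0-1 variant, while Solutions 1-3 later in this section consider the fractional variant.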
Example #1
Item #   Weight wi   Benefit bi
  1          2            3
  2          4            5
  3          5            8
  4          3            4
  5          9           10

Max weight: W = 20

Solution S: items 1, 2, 3, 5 (w1=2, w2=4, w3=5, w5=9; b1=3, b2=5, b3=8, b5=10)
For S:
Total weight: 20
Maximum benefit: 26
Example #2
Item #   Weight wi   Benefit bi
  1          2            3
  2          4            5
  3          5            8
  4          3            4
  5          9           10

Max weight: W = 20

Solution S4: items 1, 2, 3, 4 (w1=2, w2=4, w3=5, w4=3; b1=3, b2=5, b3=8, b4=4)
For S4:
Total weight: 14
Maximum benefit: 20

Solution S5: items 1, 2, 3, 5 (w1=2, w2=4, w3=5, w5=9; b1=3, b2=5, b3=8, b5=10)
For S5:
Total weight: 20
Maximum benefit: 26

The solution for S4 is not part of the solution for S5!
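To make the two examples concrete, here is a minimal brute-force sketch in Python that enumerates every subset of items and reports the best one; the function name best_subset and the 0-based indexing are assumptions for illustration, not from the slides.

```python
from itertools import combinations

def best_subset(weights, benefits, capacity):
    """Brute-force 0-1 knapsack: examine every subset of the items."""
    n = len(weights)
    best = ((), 0, 0)                    # (items, total weight, total benefit)
    for r in range(n + 1):
        for subset in combinations(range(n), r):
            w = sum(weights[i] for i in subset)
            b = sum(benefits[i] for i in subset)
            if w <= capacity and b > best[2]:
                best = (tuple(i + 1 for i in subset), w, b)   # report 1-based items
    return best

weights  = [2, 4, 5, 3, 9]
benefits = [3, 5, 8, 4, 10]

print(best_subset(weights[:4], benefits[:4], 20))   # S4: ((1, 2, 3, 4), 14, 20)
print(best_subset(weights, benefits, 20))           # S5: ((1, 2, 3, 5), 20, 26)
```

The second call confirms that the optimal solution over the first four items (S4) is not a subset of the optimal solution over all five items (S5), which is exactly the point of Example #2.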
Analyze the Algorithm in Depth
Solution 1
Solution 2
Solution 3
Solution 4
Solution 1
■ Strategy 1: Take the objects in decreasing order of profit.
■ Solution vector:
Solution 2
■ Analytical result: Considering objects in decreasing order of profit will not yield the optimal solution.
■ Strategy 2: Take the objects with minimum weight first.
■ Solution vector: X = [0, 2/3, 1]; total weight = 20, total profit = 31
Solution 3
■ Analytical result: Considering objects in increasing order of weight will also yield a suboptimal solution.
■ Strategy 3: Take the objects with maximum profit per unit of capacity used.
■ Order the objects according to the ratio pi/wi, in non-increasing order.
■ Solution vector:
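A short Python sketch of Strategy 3. The function name and the trial instance are assumptions; the instance p = (25, 24, 15), w = (18, 15, 10), W = 20 is used only because it is consistent with the Strategy 2 result X = [0, 2/3, 1], weight 20, profit 31 quoted above.

```python
def fractional_knapsack(profits, weights, capacity):
    """Greedy fractional knapsack: take items in non-increasing p/w order."""
    n = len(profits)
    order = sorted(range(n), key=lambda i: profits[i] / weights[i], reverse=True)
    x = [0.0] * n                    # solution vector, each x[i] in [0, 1]
    remaining = capacity
    total_profit = 0.0
    for i in order:
        if remaining <= 0:
            break
        take = min(1.0, remaining / weights[i])   # fraction of item i that fits
        x[i] = take
        remaining -= take * weights[i]
        total_profit += take * profits[i]
    return x, total_profit

# Assumed instance: p=(25,24,15), w=(18,15,10), W=20 gives
# x = [0, 1, 0.5] with profit 31.5, beating Strategy 2's profit of 31.
print(fractional_knapsack([25, 24, 15], [18, 15, 10], 20))
```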
Job Sequencing with Deadlines
■ Example:
n = 4 jobs with profits (p1, p2, p3, p4) = (100, 10, 15, 27)
and deadlines (d1, d2, d3, d4) = (2, 1, 2, 1)

Feasible solution   Processing sequence   Value
1. (1, 2)           2, 1                  110
2. (1, 3)           1, 3 or 3, 1          115
3. (1, 4)           4, 1                  127
4. (2, 3)           2, 3                   25
5. (3, 4)           4, 3                   42
6. (1)              1                     100
7. (2)              2                      10
8. (3)              3                      15
9. (4)              4                      27
Job Sequencing with Deadlines Contd..
■ Greedy strategy: use total profit as the optimization function.
■ Applying it to the previous example:
■ Begin with J = ∅; consider jobs in non-increasing order of profit and add a job to J whenever J remains feasible.
■ The resulting solution J = {1, 4} with total profit 127 is optimal.
■ How do we determine the feasibility of J?
■ Trying out all permutations of J causes a computational explosion, since there are n! permutations.
■ It suffices to check a single permutation: the jobs of J in non-decreasing order of deadlines.
Job Sequencing with Deadlines Contd..
■ The greedy method described above always obtains an optimal solution to the job sequencing problem.
■ High-level description of the job sequencing algorithm:
■ Assume the jobs are ordered such that p[1] ≥ p[2] ≥ … ≥ p[n]
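A sketch of that high-level description in Python. The slot-filling detail (scanning backward from each job's deadline for a free unit-time slot) is one standard way to realize the feasibility test; the function name is an assumption.

```python
def job_sequencing(profits, deadlines):
    """Greedy job sequencing: consider jobs in non-increasing profit order,
    keeping a job only if it can still be scheduled by its deadline."""
    n = len(profits)
    order = sorted(range(n), key=lambda i: profits[i], reverse=True)   # p[1] >= p[2] >= ...
    slots = [None] * (max(deadlines) + 1)   # slots[t] = job run in unit-time slot t (1-based)
    total_profit = 0
    for i in order:
        # Try the latest free slot no later than this job's deadline.
        for t in range(deadlines[i], 0, -1):
            if slots[t] is None:
                slots[t] = i
                total_profit += profits[i]
                break
    schedule = [j + 1 for j in slots[1:] if j is not None]   # 1-based job numbers
    return schedule, total_profit

# The slide's example: profits (100, 10, 15, 27), deadlines (2, 1, 2, 1)
print(job_sequencing([100, 10, 15, 27], [2, 1, 2, 1]))   # schedule [4, 1], profit 127
```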
Spanning Trees
Minimum Spanning Trees
Prim’s Algorithm
Growing MST
■ Maintain a set A of edges that is always a subset of some minimum spanning tree
■ At each step, find an edge (u, v) such that A ∪ {(u, v)} is also a subset of a minimum spanning tree
■ Such an edge (u, v) is called a safe edge
■ Two popular algorithms are used to compute a minimum spanning tree:
• Prim's algorithm
• Kruskal's algorithm
Prim’s Algorithm
■ Start with an arbitrarily selected vertex
■ One way to select the starting point is to begin with the two vertices of the minimum-cost edge
■ The set A of edges selected so far always forms a tree
■ The next edge (u, v) to be included is a minimum-cost edge not in A such that A ∪ {(u, v)} is also a tree
■ With each vertex j not yet included in the tree, associate a value near[j]
■ near[j] is the vertex already in the tree for which cost(j, near[j]) is minimum over all choices of near[j]
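A Python sketch of that near[]-based formulation. The cost-matrix representation (with math.inf marking missing edges) and the function name are assumptions for illustration.

```python
import math

def prim_mst(cost, start=0):
    """Prim's algorithm with the near[] array: cost is an n x n symmetric
    matrix where cost[i][j] = math.inf means there is no edge (i, j)."""
    n = len(cost)
    in_tree = [False] * n
    in_tree[start] = True
    near = [start] * n                   # near[j]: tree vertex currently closest to j
    mst_edges, total = [], 0
    for _ in range(n - 1):
        # Pick the non-tree vertex j minimizing cost(j, near[j]).
        j = min((v for v in range(n) if not in_tree[v]),
                key=lambda v: cost[v][near[v]])
        if cost[j][near[j]] == math.inf:
            raise ValueError("graph is not connected")
        mst_edges.append((near[j], j))
        total += cost[j][near[j]]
        in_tree[j] = True
        # The newly added vertex may now be nearer to some remaining vertices.
        for v in range(n):
            if not in_tree[v] and cost[v][j] < cost[v][near[v]]:
                near[v] = j
    return mst_edges, total
```

This sketch updates near[] once per added vertex, giving an O(n²) running time on a cost matrix.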
Prim’s Algorithm Contd..
Example #1 for Prim’s Algorithm
Example #2 for Prim’s Algorithm
Kruskal’s Algorithm (MST)
Step 1: Sort all edges into non-decreasing order of cost.
Step 2: Add the next smallest-cost edge to the forest if it does not create a cycle.
Step 3: Stop once n−1 edges have been selected; otherwise, go to Step 2.
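A compact Python sketch of these three steps, using a union-find structure for the cycle test (the edge-list format and function name are assumptions). Run on the nine edges listed in Example #1 below (with vertices relabeled 0-6), it accepts the six unmarked edges for a total cost of 99.

```python
def kruskal_mst(n, edges):
    """Kruskal's algorithm: edges is a list of (cost, u, v) tuples over vertices 0..n-1."""
    parent = list(range(n))

    def find(x):                          # union-find root lookup with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst_edges, total = [], 0
    for cost, u, v in sorted(edges):      # Step 1: non-decreasing order of cost
        ru, rv = find(u), find(v)
        if ru != rv:                      # Step 2: accept only if no cycle is created
            parent[ru] = rv
            mst_edges.append((u, v, cost))
            total += cost
            if len(mst_edges) == n - 1:   # Step 3: stop after n-1 edges
                break
    return mst_edges, total
```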
Example #1 of Kruskal’s Algorithm

Edge      Cost
(1, 6)     10
(3, 4)     12
(2, 7)     14
(2, 3)     16
(4, 7)     18   X
(4, 5)     22
(5, 7)     24   X
(5, 6)     25
(1, 2)     28   X
(X = edge rejected because it would create a cycle)
Total cost: 99
Example #2 for Kruskal’s Algorithm
Time Complexity
Thanks for your attention