Dynamic Programming-Unit-4.ppt
Unit-4
Dynamic Programming
The General Method
• An algorithm design method that can be used when the solution to a problem can be viewed as the result of a sequence of decisions.
• Stepwise decisions can be made for problems like the knapsack problem, job sequencing with deadlines, and optimal merge patterns
• Such problems can be solved using the Greedy approach
The General Method
• For some problems it is not possible to make a sequence of stepwise decisions that yields an optimal decision sequence.
• A solution for such problems is to try all possible decision sequences:
• enumerate all decision sequences and then pick out the best.
• But the time and space requirements may be prohibitive.
The General Method
• Dynamic Programming drastically reduces
the amount of time and space required.
• It avoids enumerating decision sequences that cannot possibly be optimal
Difference between Greedy Method
and Dynamic Programming
• In Greedy Method, only one decision
sequence is generated.
• In Dynamic Programming, many decision
sequences can be generated
• But sequences containing suboptimal subsequences cannot be optimal,
• and so will not be generated
Multistage Graph
• A multistage graph G=(V, E) is a directed
graph, in which
• The vertices are partitioned into k ≥ 2 disjoint sets Vi, 1 ≤ i ≤ k
• Sets V1 and Vk are such that |V1| = |Vk| = 1
• Let s and t be the vertices in V1 and Vk
respectively.
• s is source and t is sink (destination)
Multistage Graph
• Let c(i, j) be the cost of edge (i, j)
• The cost of a path from s to t is the sum of the
costs of all the edges on the path
• The multistage graph problem is to find a
minimum-cost path from s to t
• Each set Vi defines a stage in the graph
• The constraint is that every path from s to t starts in stage 1, goes to stage 2, and so on,
• and terminates in stage k
The shortest path in multistage
graphs
• e.g. [Figure: the 5-stage example graph with 12 vertices, s = 1 and t = 12, used in the following slides]
Multistage Graph- Example
• Every s to t path is the result of a sequence of
k-2 decisions
• i th decision involves determining which vertex
in Vi+1 is to be on the path
• cost(i, j) = min {c(j, l) + cost(i+1, l)}, minimized over vertices l in Vi+1 with ⟨j, l⟩ ∈ E
• Cost(1,1)= min{9+cost(2,2), 7+cost(2,3),
3+cost(2,4), 2+cost(2,5)}
• Forward approach uses backward reasoning:
• calculate distances taking the target t as the reference
Multistage Graph- Example
• Cost(1,1)= min{9+cost(2,2), 7+cost(2,3),
3+cost(2,4), 2+cost(2,5)}
• Cost(4,9)=c(9,12)+cost(5,12)
=4+0 = 4
• Cost(4,10)=c(10,12)+cost(5,12)
=2+0 = 2
• Cost(4,11)=c(11,12)+ cost(5,12)
=5+0 =5
Multistage Graph- Example
• cost(3,6) = min{6+cost(4,9), 5+cost(4,10)}
=min(6+4, 5+2) = min(10, 7) = 7
• cost(3,7) = min{4+cost(4,9), 3+cost(4,10)}
=min(4+4, 3+2) = min(8, 5) = 5
• cost(3,8) = min{5+cost(4,10), 6+cost(4,11)}
=min(5+2, 6+5) = min(7, 11) = 7
Multistage Graph- Example
• cost(2,2)
=min{4+cost(3,6), 2+cost(3,7), 1+cost(3,8)}
= min(4+7, 2+5, 1+7) = 7
• cost(2,3)
=min{2+cost(3,6), 7+cost(3, 7)}
=min(2+7, 7+5)= 9
Multistage Graph- Example
• cost(2,4)
=min{11+cost(3,8)}
=min(11+7)=18
• cost(2,5)
=min{11+cost(3,7), 8+cost(3,8)}
=min(11+5, 8+7)=15
Multistage Graph- Example
• Cost(1,1)
= min{9+cost(2,2), 7+cost(2,3), 3+cost(2,4),
2+cost(2,5)}
=min(9+7, 7+9, 3+18, 2+15)
=16
• A minimum cost s to t path has a cost 16
• This path can be determined easily if we record the decision made at each stage (vertex)
Multistage Graph- Example
• Let d(i, j) be the value of l (a vertex in the next stage) that minimizes c(j, l) + cost(i+1, l)
Multistage Graph- Example
• Let the minimum-cost path be
• s,v2,v3,..vk-1,t
• v2 = d(1,1) = 2, v3 = d(2,2) = 7
• v4 = d(3,7) = 10
• So the path is 1, 2, 7, 10, 12
• Cost= 16
Multistage Graph- Example
• Let the minimum-cost path be
• s,v2,v3,..vk-1,t
• v2 = d(1,1) = 3, v3 = d(2,3) = 6
• v4 = d(3,6) = 10
• So the path is 1, 3, 6, 10, 12
• Cost= 16
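The forward computation above can be sketched in a few lines of code. The edge costs below are read off the slides' example graph (a minimal sketch, not the textbook's pseudocode); vertices are numbered by stage, so a single descending scan suffices:

```python
# Forward approach: cost(j) = min over successors l of c(j, l) + cost(l),
# computed from the sink backwards. Edge costs from the slides' 5-stage example.
c = {
    (1, 2): 9, (1, 3): 7, (1, 4): 3, (1, 5): 2,
    (2, 6): 4, (2, 7): 2, (2, 8): 1, (3, 6): 2, (3, 7): 7,
    (4, 8): 11, (5, 7): 11, (5, 8): 8,
    (6, 9): 6, (6, 10): 5, (7, 9): 4, (7, 10): 3, (8, 10): 5, (8, 11): 6,
    (9, 12): 4, (10, 12): 2, (11, 12): 5,
}
n, t = 12, 12
cost = {t: 0}        # cost[j] = length of a shortest j -> t path
d = {}               # d[j] = decision: next vertex on that path
for j in range(n - 1, 0, -1):        # vertices are numbered by stage
    succ = [(c[(j, l)] + cost[l], l) for (u, l) in c if u == j]
    cost[j], d[j] = min(succ)        # pick the cheapest successor
path = [1]
while path[-1] != t:
    path.append(d[path[-1]])
print(cost[1], path)   # 16 [1, 2, 7, 10, 12]
```

The recorded d values play exactly the role of d(i, j) on the slides: the path is recovered by following decisions from s.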
Multistage Graph- Backward Approach
• Backward approach uses forward reasoning:
• calculate distances taking the source s as the reference
• bcost(i, j) = min(bcost(i-1, l)+c(l, j))
• bcost(5,12)= min{bcost(4,9)+4, bcost(4,10)+2, bcost(4,11)+5}
Multistage Graph- Backward Approach
• bcost(i, j) = min(bcost(i-1, l)+c(l, j))
• bcost(2, 2)=9
• bcost(2, 3)=7
• bcost(2, 4)=3
• bcost(2, 5)=2
• bcost(3, 6)= min{bcost(2, 2)+c(2, 6), bcost(2, 3)+c(3, 6)}
=min{9+4, 7+2} = 9
Multistage Graph- Backward Approach
• bcost(3, 7)=11
• bcost(3, 8)=10
• bcost(4, 9)=15
• bcost(4, 10)=14
• bcost(4, 11)=16
• bcost(5, 12)=16
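The backward approach can be sketched the same way on the same graph (same assumed edge costs as before); here vertices are scanned in increasing order, accumulating distance from the source:

```python
# Backward approach: bcost(j) = min over predecessors l of bcost(l) + c(l, j).
# Edge costs again from the slides' 5-stage example.
c = {
    (1, 2): 9, (1, 3): 7, (1, 4): 3, (1, 5): 2,
    (2, 6): 4, (2, 7): 2, (2, 8): 1, (3, 6): 2, (3, 7): 7,
    (4, 8): 11, (5, 7): 11, (5, 8): 8,
    (6, 9): 6, (6, 10): 5, (7, 9): 4, (7, 10): 3, (8, 10): 5, (8, 11): 6,
    (9, 12): 4, (10, 12): 2, (11, 12): 5,
}
bcost = {1: 0}                       # distance from the source s = 1
for j in range(2, 13):               # forward reasoning: increasing vertex order
    bcost[j] = min(bcost[l] + c[(l, j)] for (l, v) in c if v == j)
print(bcost[12])   # 16
```

The intermediate values agree with the slides: bcost(3,6)=9, bcost(3,7)=11, bcost(3,8)=10, bcost(4,10)=14.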
Multistage Graph
• Example-
• Consider a resource allocation problem
• n units of resource are to be allocated to r
projects
• If j, 0≤ j ≤ n, units of the resource are allocated
to project i,
• Then the resulting net profit is N(i, j)
Resource allocation problem
• The problem is to allocate the resource to the
r projects
• in such a way as to maximize total net profit
• This problem can be formulated as an (r+1)-stage graph problem
• Stage i, 1≤ i ≤ r represents project i
• There are n+1 vertices V(i, j), 0≤ j ≤ n
associated with stage i, 2≤ i ≤ r
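The formulation above can also be written directly as a recurrence over projects: f(i, j) = best profit obtainable from projects i..r with j units still available. The profit table N below is a hypothetical instance for illustration only (the slides give no numbers):

```python
# Resource allocation as DP: f(i, j) = max over 0 <= x <= j of N(i, x) + f(i+1, j - x).
# N is a hypothetical profit table: N[i][j] = net profit of giving j units to project i.
N = [[0, 2, 5, 6],
     [0, 3, 4, 4],
     [0, 1, 3, 7]]
n = 3                    # units of resource
r = len(N)               # number of projects
f = [0] * (n + 1)        # base case: no projects left, no profit
for i in range(r - 1, -1, -1):
    # try every allocation x = 0..units for project i, keep the best
    f = [max(N[i][x] + f[units - x] for x in range(units + 1))
         for units in range(n + 1)]
print(f[n])              # 8
```

With this table the optimum allocates 2 units to project 1 and 1 unit to project 3 (profit 5 + 3 = 8), exactly a shortest/longest-path choice through the (r+1)-stage graph.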
Resource allocation problem
[Figure: the corresponding (r+1)-stage graph with vertices V(i, j)]
All-Pairs Shortest Paths (APSP)
• Let G=(V,E) be a directed graph with n
vertices.
• Let cost be an adjacency matrix for G such that cost(i, i) = 0, 1 ≤ i ≤ n
• cost(i, j) is the length (or cost) of edge ⟨i, j⟩ if ⟨i, j⟩ ∈ E(G),
and cost(i, j) = ∞ if i ≠ j and ⟨i, j⟩ ∉ E(G)
All-Pairs Shortest Paths (APSP)
• The all-pairs shortest path problem is to determine a matrix A
• such that A(i, j) is the length of the shortest path from i to j.
• A can be obtained by solving n single-source shortest path problems
– Complexity n · n², i.e. O(n³)
All-Pairs Shortest Paths (APSP)
Shortest path from i to j, i≠j
• Path may go through some intermediate vertex
• If k is the index of this intermediate vertex, then the sub-paths from i to k and from k to j must themselves be shortest paths
• If k is the intermediate vertex with highest index
then
– i to k path is the shortest i to k path going
through no vertex with index greater than k-1
– So is the k to j path
• We need to decide highest index intermediate
vertex k
All-Pairs Shortest Paths (APSP)
•Let Ak(i, j) be the length of a shortest path from i to j going through no vertex of index greater than k
•Then A0(i, j) = cost(i, j), and for k ≥ 1:
•Ak(i, j) = min{Ak-1(i, j), Ak-1(i, k) + Ak-1(k, j)}
All-Pairs Shortest Paths (APSP)
[Figure: the 4-vertex example digraph; its cost matrix A0 is given on the next slide]
All-Pairs Shortest Paths (APSP)
• A0
1 2 3 4
1 0 4 11 ∞
2 6 0 ∞ 2
3 3 ∞ 0 17
4 ∞ ∞ 4 0
All-Pairs Shortest Paths (APSP)
• A1
1 2 3 4
1 0 4 11 ∞
2 6 0 17 2
3 3 7 0 17
4 ∞ ∞ 4 0
All-Pairs Shortest Paths (APSP)
• A2
1 2 3 4
1 0 4 11 6
2 6 0 17 2
3 3 7 0 9
4 ∞ ∞ 4 0
All-Pairs Shortest Paths (APSP)
• A3
1 2 3 4
1 0 4 11 6
2 6 0 17 2
3 3 7 0 9
4 7 11 4 0
All-Pairs Shortest Paths (APSP)
• A4
1 2 3 4
1 0 4 10 6
2 6 0 6 2
3 3 7 0 9
4 7 11 4 0
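The recurrence above, applied in place for k = 1..n, is the Floyd–Warshall algorithm. A minimal sketch on the slides' 4-vertex matrix:

```python
INF = float('inf')
# Adjacency matrix A0 of the slides' 4-vertex example.
A = [[0,   4,   11,  INF],
     [6,   0,   INF, 2],
     [3,   INF, 0,   17],
     [INF, INF, 4,   0]]
n = len(A)
for k in range(n):                  # allow vertex k as an intermediate vertex
    for i in range(n):
        for j in range(n):
            A[i][j] = min(A[i][j], A[i][k] + A[k][j])
print(A)   # [[0, 4, 10, 6], [6, 0, 6, 2], [3, 7, 0, 9], [7, 11, 4, 0]]
```

After the pass for k the matrix equals Ak; the final result matches the A4 table above.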
Optimal Binary Search Tree
• Given a set of identifiers, different binary search trees can be formed.
• The average number of comparisons needed to find an identifier differs from tree to tree.
• In the simplest form we assume
– Probability of search of each element is equal
– No unsuccessful searches made
Optimal Binary Search Tree
[Figure: the five distinct binary search trees (A to E) for three identifiers]
Optimal Binary Search Tree
• In the general situation, different identifiers are searched with different probabilities,
• and unsuccessful searches can also be made
• Let set of identifiers be {a1,a2, ..,an} with
a1 <a2 < .. < an
• Let p(i) be the probability with which we
search ai.
Optimal Binary Search Tree
• Let q(i) be the probability with which we search
x such that ai < x < ai+1, , 0≤i≤n.
– Assume a0= -∞ and an+1 = ∞.
• Then Σ0≤i≤n q(i) is the probability of an unsuccessful search
• Clearly
• Σ1≤i≤n p(i) + Σ0≤i≤n q(i) = 1
Optimal Binary Search Tree
Optimal Binary Search Tree
•(a1, a2, a3) = (do, if, int)
• If all searches (successful and unsuccessful) are equally probable (1/7),
• the cost will be
•Cost(tree A)=1/7+2/7+3/7+1/7+2/7+3/7+3/7
•=15/7
Cost(B)=13/7
Cost(C)=15/7
Cost(D)=15/7
Cost(E)=15/7
▪Tree B is optimal
Optimal Binary Search Tree
•(a1, a2, a3) = (do, if, int)
•p(1)=0.5, p(2)=0.1, p(3)=0.05 and
• q(0)=0.15, q(1)=0.1, q(2)=0.05, q(3)=0.05
• the costs will be
•cost(A)=((0.5 X 3) + (0.1 X 2)+ (0.05 X 1)) +
((0.15 X 3)+(0.1 X 3)+ (0.05 X2)+(0.05 X1))
•= 1.5+0.2+0.05+0.45+0.3+0.1+0.05
•= 2.65
Optimal Binary Search Tree
•cost(B)=((0.5 X 2)+(0.1 X 1)+(0.05 X 2)) +
((0.15 X 2)+(0.1 X 2)+ (0.05 X2)+(0.05 X2))
=1.0+0.1+0.1+0.30+0.2+0.1+0.1
=1.9
Root of sub-tree is
assumed to be present at
level 1
Applying Dynamic Programming Approach
• The tree tij is optimal if its root ak, i < k ≤ j, is chosen so that c(i, k-1) + c(k, j) is minimum:
• c(i, j) = min {c(i, k-1) + c(k, j)} + w(i, j), i < k ≤ j
Applying Dynamic Programming Approach
• Hence for c(0, n) we obtain
• c(0, n) = min {c(0, k-1) + c(k, n)} + w(0, n), 0 < k ≤ n
Applying Dynamic Programming Approach
• This equation can be solved for c(0, n)
• By first computing all c(i, j), such that j-i =1
• c(i, i) = 0 and w(i, i) = q(i), 0 ≤ i ≤ n
• Next computing all c(i, j), such that j-i =2 and
then computing all c(i, j), such that j-i =3 so on
• During this computation we record the root
r(i, j) of each tree tij ,
• then the optimal binary search tree can be constructed from these r(i, j)
Applying Dynamic Programming Approach
• Example: n = 4, (a1, a2, a3, a4) = (do, if, int, while)
• p(1 : 4) = (3, 3, 1, 1) and q(0 : 4) = (2, 3, 1, 1, 1) (all values times 1/16)
• Initially w(i, i) = q(i), c(i, i) = 0 and r(i, i) = 0
• For j - i = 1:
w(0,1)=8, c(0,1)=8, r(0,1)=1
w(1,2)=7, c(1,2)=7, r(1,2)=2
w(2,3)=3, c(2,3)=3, r(2,3)=3
w(3,4)=3, c(3,4)=3, r(3,4)=4
Applying Dynamic Programming Approach
• w(i, j)= p(j)+q(j)+w(i, j-1)
• w(0, 2) =p(2)+q(2)+w(0,1)
=3+1+8
=12
c(0,2)=min{c(0,0)+c(1,2), c(0,1)+c(2,2)}+w(0,2)
=min{0+7, 8+0}+12
=7+12= 19
r(0, 2) = 1;
Applying Dynamic Programming Approach
• w(i, j)= p(j)+q(j)+w(i, j-1)
• w(1, 3) =p(3)+q(3)+w(1,2)
=1+1+7
=9
c(1,3)=min{c(1,1)+c(2,3), c(1,2)+c(3,3)}+w(1,3)
=min{0+3, 7+0}+9
=3+9= 12
r(1, 3) = 2;
Applying Dynamic Programming Approach
• w(i, j)= p(j)+q(j)+w(i, j-1)
• w(2, 4) =p(4)+q(4)+w(2,3)
=1+1+3
=5
c(2,4)=min{c(2,2)+c(3,4), c(2,3)+c(4,4)}+w(2,4)
=min{0+3, 3+0}+5
=3+5= 8
r(2, 4) = 3;
Applying Dynamic Programming Approach
w(0, 3)=p(3)+q(3)+w(0,2)
=1+1+12
=14
c(0,3)=min{c(0,0)+c(1,3), c(0,1)+c(2,3),
c(0,2)+c(3,3)} +w(0,3)
=min{0+12, 8+3, 19+0}+14
=11+14= 25
r(0, 3) = 2;
Applying Dynamic Programming Approach
w(1, 4)=p(4)+q(4)+w(1,3)
=1+1+9
=11
c(1,4)=min{c(1,1)+c(2,4), c(1,2)+c(3,4),
c(1,3)+c(4,4)} +w(1,4)
=min{0+8, 7+3, 12+0}+11
=8+11= 19
r(1, 4) = 2;
Applying Dynamic Programming Approach
w(0, 4)=p(4)+q(4)+w(0,3)
=1+1+14
=16
c(0,4)=min{c(0,0)+c(1,4), c(0,1)+c(2,4),
c(0,2)+c(3,4), c(0,3)+c(4,4)} +w(0,4)
=min{0+19, 8+8, 19+3, 25+0 }+16
=16+16= 32
r(0, 4) = 2;
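The whole computation above can be reproduced with a short sketch. The p and q values (3, 3, 1, 1) and (2, 3, 1, 1, 1), scaled by 16 to keep the arithmetic integral, are the instance the slides' numbers are consistent with:

```python
# Optimal BST tables: solve all (i, j) in order of increasing j - i.
p = [0, 3, 3, 1, 1]          # p[0] unused; probabilities scaled by 16
q = [2, 3, 1, 1, 1]
n = 4
w = [[0] * (n + 1) for _ in range(n + 1)]
c = [[0] * (n + 1) for _ in range(n + 1)]
r = [[0] * (n + 1) for _ in range(n + 1)]
for i in range(n + 1):
    w[i][i] = q[i]           # c[i][i] = 0 and r[i][i] = 0 already hold
for gap in range(1, n + 1):  # gap = j - i
    for i in range(n - gap + 1):
        j = i + gap
        w[i][j] = p[j] + q[j] + w[i][j - 1]
        # choose the root k, i < k <= j, minimising c(i, k-1) + c(k, j)
        best, kbest = min((c[i][k - 1] + c[k][j], k) for k in range(i + 1, j + 1))
        c[i][j] = best + w[i][j]
        r[i][j] = kbest
print(c[0][4], r[0][4])      # 32 2
```

The tables agree with the slides: c(0,2)=19, c(1,3)=12, c(2,4)=8, c(0,3)=25, c(1,4)=19, and the optimal tree has root a2 = if since r(0,4)=2.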
0 / 1 Knapsack
• If Si+1 contains two pairs (Pj, Wj) and (Pk, Wk)
with the values Pj ≤ Pk and Wj ≥ Wk
• Then the pair (Pj, Wj) can be discarded
• This is called the dominance rule (purging rule)
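One merge-and-purge step of the set method can be sketched as below. The three (profit, weight) pairs and the capacity 6 form a hypothetical instance for illustration; the slides give no concrete numbers for this section:

```python
def merge_purge(S, p, w, cap):
    """S^{i+1}: merge S^i with S^i shifted by (p, w), then purge dominated pairs."""
    merged = sorted(set(S) | {(pp + p, ww + w) for pp, ww in S if ww + w <= cap},
                    key=lambda t: (t[1], -t[0]))      # by weight, best profit first
    result = []
    for profit, weight in merged:
        # dominance rule: drop (Pj, Wj) when some kept pair has Pk >= Pj, Wk <= Wj
        if not result or profit > result[-1][0]:
            result.append((profit, weight))
    return result

S = [(0, 0)]                                          # S^0: empty knapsack
for p, w in [(1, 2), (2, 3), (5, 4)]:                 # hypothetical instance, capacity 6
    S = merge_purge(S, p, w, 6)
print(S)   # [(0, 0), (1, 2), (2, 3), (5, 4), (6, 6)]
```

The last pair gives the optimum: profit 6 at weight 6, obtained by taking items 1 and 3.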
Reliability Design
• Add 3 devices of type D2
• Reliability of 3 devices of type D2 connected in
parallel
• 1-(1-r)^m = 1-(1-0.8)^3 = 1 - 0.2^3 = 0.992
• s23 ={ (0.8928, 75) }
•s2 ={ (0.72, 45), (0.792, 75), (0.864, 60), (0.9504,
90), (0.8928, 75) }
•s2 ={(0.72, 45), (0.864, 60), (0.8928, 75) }
Reliability Design
• Add 1 device of type D3
• s31 ={(0.36, 65), (0.432, 80), (0.4464, 95)}
• Add 2 devices of type D3
• Reliability of 2 devices of type D3 connected in parallel
• 1-(1-r)^m = 1-(1-0.5)^2 = 1 - 0.5^2 = 0.75
• s32 ={(0.54, 85), (0.648, 100)}
Reliability Design
• Add 3 devices of type D3
• Reliability of 3 devices of type D3 connected in parallel
• 1-(1-r)^m = 1-(1-0.5)^3 = 1 - 0.5^3 = 0.875
• s33 ={ (0.63, 105) }
• s3 ={(0.36, 65), (0.432, 80), (0.54, 85), (0.648, 100), (0.63, 105)}
• After purging: s3 ={(0.36, 65), (0.432, 80), (0.54, 85), (0.648, 100)}
• Optimal - (0.648, 100)
• m1 = 1, m2 = 2, m3 = 2
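A sketch of the whole stage-by-stage computation. The device costs (30, 15, 20), reliabilities (0.9, 0.8, 0.5) and budget 105 are assumed from the numbers appearing on the slides:

```python
cost, rel, budget = [30, 15, 20], [0.9, 0.8, 0.5], 105   # assumed instance

def purge(S):
    """Keep only non-dominated (reliability, cost) pairs."""
    S = sorted(S, key=lambda t: (t[1], -t[0]))    # by cost, best reliability first
    out = []
    for f, x in S:
        if not out or f > out[-1][0]:             # dominance rule
            out.append((f, x))
    return out

S = [(1.0, 0)]
for i, (c, r) in enumerate(zip(cost, rel)):
    least_rest = sum(cost[i + 1:])                # money the later stages still need
    cand, m = [], 1
    while True:
        mult = 1 - (1 - r) ** m                   # reliability of m parallel devices
        nxt = [(f * mult, x + m * c) for f, x in S
               if x + m * c <= budget - least_rest]
        if not nxt:
            break
        cand += nxt
        m += 1
    S = purge(cand)
best = max(S)
print(best)   # ≈ (0.648, 100)
```

The intermediate sets reproduce the slides' s2 and s3 after purging, and the maximum gives the optimal design (0.648, 100), i.e. m = (1, 2, 2).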
Reliability Design- Example 2
• c1=40, c2=25, c3=35
• r1=0.75, r2=0.85, r3=0.6
• u1=3, u2=3, u3=2
• c=175
• So far we have seen how dynamic programming can be applied to subset-selection problems
Traveling Salesperson Problem
• Statement: Let G(V,E) be a directed graph with
edge cost cij.
• cij > 0 for all i, j, and cij = ∞ if ⟨i, j⟩ ∉ E.
• A tour of G is a directed simple cycle that includes every vertex in V.
• Cost of tour is sum of cost of edges on the
tour
Traveling Salesperson Problem
• The traveling salesperson problem is to find a tour of minimum cost.
Traveling Salesperson Problem
• Let g(i, S) be the length of a shortest path starting at vertex i, going through all vertices in S, and terminating at vertex 1
• g(i, S) = min {cij + g(j, S - {j})}, minimized over j in S
• g(i, Φ) = ci1, and the length of an optimal tour is g(1, V - {1})
Traveling Salesman Problem
• g(2,Φ)=C21=5
• g(3,Φ)=C31=6
• g(4,Φ)=C41=8
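The g recurrence can be evaluated over subsets of increasing size. The cost matrix below is assumed from the classic 4-city textbook example, which matches g(2, Φ) = 5, g(3, Φ) = 6, g(4, Φ) = 8 above (cities 1..4 become indices 0..3 here):

```python
from itertools import combinations

# Held–Karp evaluation of g(i, S) = min over j in S of c[i][j] + g(j, S - {j}).
c = [[0, 10, 15, 20],
     [5,  0,  9, 10],
     [6, 13,  0, 12],
     [8,  8,  9,  0]]
n = len(c)
g = {(i, frozenset()): c[i][0] for i in range(1, n)}   # g(i, Φ) = c(i, 1)
for size in range(1, n - 1):                           # subsets of V - {1}
    for S in combinations(range(1, n), size):
        S = frozenset(S)
        for i in range(1, n):
            if i in S:
                continue
            g[(i, S)] = min(c[i][j] + g[(j, S - {j})] for j in S)
full = frozenset(range(1, n))
tour = min(c[0][j] + g[(j, full - {j})] for j in full) # g(1, V - {1})
print(tour)   # 35
```

For this matrix the optimal tour is 1 → 2 → 4 → 3 → 1 with cost 10 + 10 + 9 + 6 = 35; the table g has O(n·2^n) entries, giving O(n²·2^n) time overall.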