
Dynamic Programming-Unit-4.ppt

The document discusses Dynamic Programming as an algorithm design method for solving problems that involve a sequence of decisions, such as the knapsack problem and job sequencing. It contrasts Dynamic Programming with the Greedy Method, highlighting how the former generates multiple decision sequences while avoiding suboptimal ones. Additionally, it covers the multistage graph problem, resource allocation, and the All-Pairs Shortest Paths problem, providing examples and methods for calculating costs and optimal paths.

Uploaded by

Aditya Hogade

Computer Algorithms

Unit-4

Dynamic Programming

The General Method
• An algorithm design method that can be used
when the solution to a problem can be viewed
as the result of a sequence of decisions.
• For some problems, stepwise decisions can be
made greedily, e.g. the knapsack problem, job
sequencing with deadlines, and optimal merge
patterns.
• These can be solved using the Greedy approach.
The General Method
• For some problems it is not possible to make
a sequence of stepwise decisions that yields an
optimal decision sequence.
• A solution to such problems is to try all
possible decision sequences:
• enumerate all decision sequences and then
pick out the best.
• But the time and space requirements may be
prohibitive.
The General Method
• Dynamic Programming drastically reduces
the amount of time and space required.
• It avoids the enumeration of decision
sequences that cannot possibly be optimal.
Difference between Greedy Method
and Dynamic Programming
• In the Greedy Method, only one decision
sequence is generated.
• In Dynamic Programming, many decision
sequences can be generated.
• But sequences containing suboptimal
subsequences cannot be optimal,
• and so will not be generated.
Multistage Graph
• A multistage graph G=(V, E) is a directed
graph in which
• the vertices are partitioned into k ≥ 2 disjoint
sets Vi, 1 ≤ i ≤ k.
• Sets V1 and Vk are such that │V1│ = │Vk│ = 1.
• Let s and t be the vertices in V1 and Vk
respectively.
• s is the source and t is the sink (destination).
Multistage Graph
• Let c(i, j) be the cost of edge (i, j).
• The cost of a path from s to t is the sum of the
costs of all the edges on the path.
• The multistage graph problem is to find a
minimum-cost path from s to t.
• Each set Vi defines a stage in the graph.
• The constraint is that every path from s to t
starts in stage 1, goes to stage 2, and so on,
• and terminates in stage k.
The shortest path in multistage
graphs
• e.g.

• The greedy method cannot be applied to this case:
it chooses (S, A, D, T): 1+4+18 = 23.
• The real shortest path is:
(S, C, F, T): 5+2+2 = 9.
Multistage Graph
• Dynamic Programming Approach
• Such problems can be solved using two
approaches:
• Forward approach (backward reasoning)
• Backward approach (forward reasoning)
Multistage Graph- Example

Multistage Graph- Example
• Every s to t path is the result of a sequence of
k−2 decisions.
• The i-th decision involves determining which
vertex in Vi+1 is to be on the path.
• cost(i, j) = min over l in Vi+1, (j, l) ∈ E, of
{c(j, l) + cost(i+1, l)}
• cost(1,1) = min{9+cost(2,2), 7+cost(2,3),
3+cost(2,4), 2+cost(2,5)}
• Forward approach – backward reasoning:
• calculate distances using the target as reference.
Multistage Graph- Example
• Cost(1,1)= min{9+cost(2,2), 7+cost(2,3),
3+cost(2,4), 2+cost(2,5)}
• Cost(4,9)=c(9,12)+cost(5,12)
=4+0 = 4
• Cost(4,10)=c(10,12)+cost(5,12)
=2+0 = 2
• Cost(4,11)=c(11,12)+ cost(5,12)
=5+0 =5
Multistage Graph- Example
• cost(3,6) = min{6+cost(4,9), 5+cost(4,10)}
=min(6+4, 5+2) = min(10, 7) = 7

• cost(3,7) = min{4+cost(4,9), 3+cost(4,10)}
=min(4+4, 3+2) = min(8, 5) = 5

• cost(3,8) = min{5+cost(4,10), 6+cost(4,11)}
=min(5+2, 6+5) = min(7, 11) = 7
Multistage Graph- Example
• cost(2,2)
=min{4+cost(3,6), 2+cost(3,7), 1+cost(3,8)}
= min(4+7, 2+5, 1+7) = 7

• cost(2,3)
=min{2+cost(3,6), 7+cost(3, 7)}
=min(2+7, 7+5)= 9

Multistage Graph- Example
• cost(2,4)
=min{11+cost(3,8)}
=min(11+7)=18

• cost(2,5)
=min{11+cost(3,7), 8+cost(3,8)}
=min(11+5, 8+7)=15
Multistage Graph- Example

Multistage Graph- Example
• Cost(1,1)
= min{9+cost(2,2), 7+cost(2,3), 3+cost(2,4),
2+cost(2,5)}
=min(9+7, 7+9, 3+18, 2+15)
=16
• A minimum cost s to t path has a cost 16
• This path can be determined easily if we
record the decision made at each state
(vertex).
Multistage Graph- Example
• Let d(i, j) be the value of l (a node in the next
level) that minimizes {c(j, l) + cost(i+1, l)}.

• d(3,6)=10  d(3,7)=10  d(3,8)=10
• d(2,2)=7  d(2,3)=6  d(2,4)=8
• d(2,5)=8  d(1,1)=2 or 3
Multistage Graph- Example
• Let the minimum-cost path be
• s, v2, v3, …, vk−1, t
• v2=d(1,1)=2  v3=d(2,2)=7
• v4=d(3,7)=10
• So the path is 1, 2, 7, 10, 12
• Cost = 16
Multistage Graph- Example
• Another minimum-cost path is
• s, v2, v3, …, vk−1, t with
• v2=d(1,1)=3  v3=d(2,3)=6
• v4=d(3,6)=10
• So the path is 1, 3, 6, 10, 12
• Cost = 16
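The forward-approach recurrence and the decision array d(i, j) above translate directly into code. A minimal Python sketch follows, with the edge costs taken from the worked 12-vertex example:

```python
def multistage_forward(n, succ):
    """Forward approach (backward reasoning): cost[j] = min over successors l
    of c(j, l) + cost[l], computed from the sink n back to the source 1."""
    INF = float("inf")
    cost = [INF] * (n + 1)
    d = [0] * (n + 1)          # d[j]: the successor chosen at vertex j
    cost[n] = 0                # the sink costs nothing
    for j in range(n - 1, 0, -1):
        for l, c in succ.get(j, {}).items():
            if c + cost[l] < cost[j]:
                cost[j], d[j] = c + cost[l], l
    path, v = [1], 1           # recover the path from the recorded decisions
    while v != n:
        v = d[v]
        path.append(v)
    return cost[1], path

# edge costs of the example graph: succ[j][l] = c(j, l)
succ = {1: {2: 9, 3: 7, 4: 3, 5: 2}, 2: {6: 4, 7: 2, 8: 1},
        3: {6: 2, 7: 7}, 4: {8: 11}, 5: {7: 11, 8: 8},
        6: {9: 6, 10: 5}, 7: {9: 4, 10: 3}, 8: {10: 5, 11: 6},
        9: {12: 4}, 10: {12: 2}, 11: {12: 5}}

print(multistage_forward(12, succ))  # (16, [1, 2, 7, 10, 12])
```

The backward approach is symmetric: compute bcost[j] from the source forwards and record predecessors instead of successors.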
Multistage Graph- Example

Multistage Graph- Backward Approach
• Backward approach – forward reasoning:
• calculate distances using the source as reference.
• bcost(i, j) = min over l in Vi−1, (l, j) ∈ E, of
{bcost(i−1, l) + c(l, j)}
• bcost(5,12) = min{bcost(4,9)+4, bcost(4,10)+2,
bcost(4,11)+5}
Multistage Graph- Example

Multistage Graph- Backward Approach
• bcost(i, j) = min(bcost(i-1, l)+c(l, j))
• bcost(2, 2)=9
• bcost(2, 3)=7
• bcost(2, 4)=3
• bcost(2, 5)=2
• bcost(3, 6) = min{bcost(2, 2)+c(2, 6),
bcost(2, 3)+c(3, 6)}
=min{9+4, 7+2} = 9
Multistage Graph- Backward Approach
• bcost(3, 7)=11
• bcost(3, 8)=10
• bcost(4, 9)=15
• bcost(4, 10)=14
• bcost(4, 11)=16
• bcost(5, 12)=16

Multistage Graph
• Example-
• Consider a resource allocation problem
• n units of resource are to be allocated to r
projects
• If j, 0≤ j ≤ n, units of the resource are allocated
to project i,
• Then the resulting net profit is N(i, j)

Resource allocation problem
• The problem is to allocate the resource to the
r projects
• in such a way as to maximize total net profit
• This problem can be formulated as an r+1
stage graph problem
• Stage i, 1≤ i ≤ r represents project i
• There are n+1 vertices V(i, j), 0≤ j ≤ n
associated with stage i, 2≤ i ≤ r
Resource allocation problem

All-Pairs Shortest Paths (APSP)
• Let G=(V,E) be a directed graph with n
vertices.
• Let cost be an adjacency matrix for G such that
cost(i, i) = 0, 1 ≤ i ≤ n.
• cost(i, j) is the length (or cost) of edge (i, j)
if (i, j) ∈ E(G),
• and cost(i, j) = ∞ if i ≠ j and (i, j) ∉ E(G).
All-Pairs Shortest Paths (APSP)
• The all-pairs shortest path problem is to
determine a matrix A
• such that A(i, j) is the length of a shortest path
from i to j.
• A can be obtained by solving n single-source
shortest path problems
– Complexity: n · O(n2), i.e. O(n3)
All-Pairs Shortest Paths (APSP)
Shortest path from i to j, i ≠ j:
• The path may go through some intermediate vertex.
• If k is the index of such an intermediate vertex, then
the sub-paths i to k and k to j must themselves be
shortest paths.
• If k is the intermediate vertex with the highest index,
then
– the i to k path is a shortest i to k path going
through no vertex with index greater than k−1,
– and so is the k to j path.
• We need to decide the highest-index intermediate
vertex k.
All-Pairs Shortest Paths (APSP)
• Let A^k(i, j) be the length of a shortest path from i to j
going through no vertex of index greater than k.

• A(i, j) = min{ min over 1 ≤ k ≤ n of
{A^(k−1)(i, k) + A^(k−1)(k, j)}, cost(i, j) }

• A shortest path from i to j may or may not go
through the highest-index vertex k, so
• A^k(i, j) = min{ A^(k−1)(i, j), A^(k−1)(i, k) + A^(k−1)(k, j) }, k ≥ 1,
• with A^0(i, j) = cost(i, j).
All-Pairs Shortest Paths (APSP)

(figure: a directed graph on vertices 1–4; its edge costs are
given by the matrix A0 below)
All-Pairs Shortest Paths (APSP)
• A0
1 2 3 4
1 0 4 11 ∞
2 6 0 ∞ 2
3 3 ∞ 0 17
4 ∞ ∞ 4 0

All-Pairs Shortest Paths (APSP)
• A1
1 2 3 4
1 0 4 11 ∞
2 6 0 17 2
3 3 7 0 17
4 ∞ ∞ 4 0

All-Pairs Shortest Paths (APSP)
• A2
1 2 3 4
1 0 4 11 6
2 6 0 17 2
3 3 7 0 9
4 ∞ ∞ 4 0

All-Pairs Shortest Paths (APSP)
• A3
1 2 3 4
1 0 4 11 6
2 6 0 17 2
3 3 7 0 9
4 7 11 4 0

All-Pairs Shortest Paths (APSP)
• A4
1 2 3 4
1 0 4 10 6
2 6 0 6 2
3 3 7 0 9
4 7 11 4 0

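The recurrence A^k(i, j) = min{A^(k−1)(i, j), A^(k−1)(i, k) + A^(k−1)(k, j)} is exactly the Floyd–Warshall algorithm. A minimal Python sketch, run on the matrix A0 above:

```python
from math import inf

def all_pairs_shortest_paths(cost):
    """Floyd-Warshall: after iteration k, A[i][j] holds the shortest
    i-to-j distance using intermediate vertices of index <= k only."""
    n = len(cost)
    A = [row[:] for row in cost]     # A^0 is the cost matrix itself
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if A[i][k] + A[k][j] < A[i][j]:
                    A[i][j] = A[i][k] + A[k][j]
    return A

A0 = [[0, 4, 11, inf],
      [6, 0, inf, 2],
      [3, inf, 0, 17],
      [inf, inf, 4, 0]]

print(all_pairs_shortest_paths(A0))  # matches A4 above
```

Because A^k(i, k) = A^(k−1)(i, k) and A^k(k, j) = A^(k−1)(k, j), a single matrix can be updated in place, giving O(n3) time and O(n2) space.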
Optimal Binary Search Tree
• Given set of identifiers, different binary trees
could be formed.
• Average number of comparisons needed for
finding the identifiers will be different for different
trees.
• In the simplest form we assume
– Probability of search of each element is equal
– No unsuccessful searches made
Optimal Binary Search Tree

Optimal Binary Search Tree
• In general situation different identifiers can be
searched with different probabilities
• and unsuccessful searches can also be made
• Let set of identifiers be {a1,a2, ..,an} with
a1 <a2 < .. < an
• Let p(i) be the probability with which we
search ai.

Optimal Binary Search Tree
• Let q(i) be the probability with which we search for
x such that ai < x < ai+1, 0 ≤ i ≤ n.
– Assume a0 = −∞ and an+1 = ∞.
• Then
• Σ(0≤i≤n) q(i) is the probability of an unsuccessful
search.
• Clearly
• Σ(1≤i≤n) p(i) + Σ(0≤i≤n) q(i) = 1
Optimal Binary Search Tree

Optimal Binary Search Tree
• (a1, a2, a3) = (do, if, int)
• If all searches (successful and unsuccessful) are
equally probable (1/7),
• the costs will be
• cost(tree A) = 1/7+2/7+3/7+1/7+2/7+3/7+3/7
= 15/7
• cost(B) = 13/7
• cost(C) = 15/7
• cost(D) = 15/7
• cost(E) = 15/7
• Tree B is optimal
Optimal Binary Search Tree
•(a1, a2, a3) = (do, if, int)
•p(1)=0.5, p(2)=0.1, p(3)=0.05 and
• q(0)=0.15, q(1)=0.1, q(2)=0.05, q(3)=0.05
• the costs will be
•cost(A)=((0.5 X 3) + (0.1 X 2)+ (0.05 X 1)) +
((0.15 X 3)+(0.1 X 3)+ (0.05 X2)+(0.05 X1))
•= 1.5+0.2+0.05+0.45+0.3+0.1+0.05
•= 2.65
Optimal Binary Search Tree
• cost(B)=((0.5 X 2)+(0.1 X 1)+(0.05 X 2)) +
((0.15 X 2)+(0.1 X 2)+(0.05 X 2)+(0.05 X 2))
=1.0+0.1+0.1+0.30+0.2+0.1+0.1
=1.9

• cost(C)=((0.5 X 1)+(0.1 X 2)+(0.05 X 3)) +
((0.15 X 1)+(0.1 X 2)+(0.05 X 3)+(0.05 X 3))
=0.5+0.2+0.15+0.15+0.2+0.15+0.15
=1.5
Optimal Binary Search Tree
• cost(D)=((0.5 X 2)+(0.1 X 3)+(0.05 X 1)) +
((0.15 X 2)+(0.1 X 3)+(0.05 X 3)+(0.05 X 1))
=1.0+0.3+0.05+0.30+0.3+0.15+0.05
=2.15

• cost(E)=((0.5 X 1)+(0.1 X 3)+(0.05 X 2)) +
((0.15 X 1)+(0.1 X 3)+(0.05 X 3)+(0.05 X 2))
=0.5+0.3+0.10+0.15+0.3+0.15+0.10
=1.6
Optimal Binary Search Tree
• p(1)=0.5, p(2)=0.1, p(3)=0.05 and
• q(0)=0.15, q(1)=0.1, q(2)=0.05, q(3)=0.05
• the costs will be
•cost(A)=2.65
•cost(B)=1.9
•cost(C)=1.5
•cost(D)=2.15
•cost(E)=1.6
• Tree C is optimal
Applying Dynamic Programming Approach
• Construct the tree as the result of a sequence of
decisions.
• Make a decision as to which of the ai should be
assigned to the root node of the tree.
• If we choose ak as the root, then
• internal nodes a1, a2, …, ak−1 and external nodes
E0, E1, …, Ek−1 will lie in the left sub-tree l of the root.
• The remaining nodes will lie in the right sub-tree r.
Applying Dynamic Programming Approach

Root of sub-tree is
assumed to be present at
level 1
Applying Dynamic Programming Approach

• The expected cost of the tree with root ak is
p(k) + cost(l) + cost(r) + w(0, k−1) + w(k, n).
• If the tree is optimal, then this value must be
minimum.
• Hence cost(l) and cost(r) must be minimum.
Applying Dynamic Programming Approach
• If we use c(i, j) to represent the cost of an
optimal binary search tree tij containing ai+1,
ai+2, …, aj and Ei, Ei+1, …, Ej,
• then for the tree to be optimal,
• cost(l) = c(0, k−1) and cost(r) = c(k, n), and
• k must be chosen such that
p(k) + c(0, k−1) + c(k, n) + w(0, k−1) + w(k, n)
is minimum.
Applying Dynamic Programming Approach
• Hence for c(0, n) we obtain
c(0, n) = min over 1 ≤ k ≤ n of {c(0, k−1) + c(k, n)} + w(0, n)
• We can generalize it to obtain, for any c(i, j),
c(i, j) = min over i < k ≤ j of {c(i, k−1) + c(k, j)} + w(i, j)
Applying Dynamic Programming Approach
• This equation can be solved for c(0, n)
• by first computing all c(i, j) such that j−i = 1,
• with c(i, i) = 0 and w(i, i) = q(i), 0 ≤ i ≤ n,
• next computing all c(i, j) such that j−i = 2,
then all c(i, j) such that j−i = 3, and so on.
• During this computation we record the root
r(i, j) of each tree tij;
• the optimal binary search tree can then be
constructed from these r(i, j).
Applying Dynamic Programming Approach
• w(i, j) = p(j)+q(j)+w(i, j−1)

• w(0, 2) = p(2)+q(2)+w(0,1)
=3+1+8
=12
c(0,2)=min{c(0,0)+c(1,2), c(0,1)+c(2,2)}+w(0,2)
=min{0+7, 8+0}+12
=7+12= 19
r(0, 2) = 1
Applying Dynamic Programming Approach
• w(i, j)= p(j)+q(j)+w(i, j-1)

• w(1, 3) =p(3)+q(3)+w(1,2)
=1+1+7
=9
c(1,3)=min{c(1,1)+c(2,3), c(1,2)+c(3,3)}+w(1,3)
=min{0+3, 7+0}+9
=3+9= 12
r(1, 3) = 2
Applying Dynamic Programming Approach
• w(i, j)= p(j)+q(j)+w(i, j-1)

• w(2, 4) =p(4)+q(4)+w(2,3)
=1+1+3
=5
c(2,4)=min{c(2,2)+c(3,4), c(2,3)+c(4,4)}+w(2,4)
=min{0+3, 3+0}+5
=3+5= 8
r(2, 4) = 3
Applying Dynamic Programming Approach
w(0, 3)=p(3)+q(3)+w(0,2)
=1+1+12
=14
c(0,3)=min{c(0,0)+c(1,3), c(0,1)+c(2,3),
c(0,2)+c(3,3)} +w(0,3)
=min{0+12, 8+3, 19+0}+14
=11+14= 25
r(0, 3) = 2;
Applying Dynamic Programming Approach
w(1, 4)=p(4)+q(4)+w(1,3)
=1+1+9
=11
c(1,4)=min{c(1,1)+c(2,4), c(1,2)+c(3,4),
c(1,3)+c(4,4)} +w(1,4)
=min{0+8, 7+3, 12+0}+11
=8+11= 19
r(1, 4) = 2;
Applying Dynamic Programming Approach
w(0, 4)=p(4)+q(4)+w(0,3)
=1+1+14
=16
c(0,4)=min{c(0,0)+c(1,4), c(0,1)+c(2,4),
c(0,2)+c(3,4), c(0,3)+c(4,4)} +w(0,4)
=min{0+19, 8+8, 19+3, 25+0}+16
=16+16= 32
r(0, 4) = 2
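The j−i = 1, 2, … sweep above can be sketched in Python. The frequencies p = (3, 3, 1, 1) and q = (2, 3, 1, 1, 1) are a reconstruction implied by the worked arithmetic (e.g. w(0, 2) = 3 + 1 + 8); they are not stated on the surviving slides.

```python
def obst(p, q):
    """Optimal BST tables: c(i, j) = min over i < k <= j of
    {c(i, k-1) + c(k, j)} + w(i, j), with w(i, j) = p(j) + q(j) + w(i, j-1).
    p is 1-indexed (p[0] is a dummy); q is 0-indexed."""
    n = len(p) - 1
    w = [[0] * (n + 1) for _ in range(n + 1)]
    c = [[0] * (n + 1) for _ in range(n + 1)]
    r = [[0] * (n + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        w[i][i] = q[i]                  # w(i, i) = q(i), c(i, i) = 0
    for m in range(1, n + 1):           # solve all (i, j) with j - i = m
        for i in range(n - m + 1):
            j = i + m
            w[i][j] = p[j] + q[j] + w[i][j - 1]
            best, root = min((c[i][k - 1] + c[k][j], k)
                             for k in range(i + 1, j + 1))
            c[i][j] = best + w[i][j]
            r[i][j] = root              # record the root of t_ij
    return c, r

c, r = obst([0, 3, 3, 1, 1], [2, 3, 1, 1, 1])
print(c[0][4], r[0][4])  # 32 2
```

The optimal tree itself is read off recursively: its root is r(0, n), the left sub-tree is built from r(0, r(0, n)−1), the right from r(r(0, n), n), and so on.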
0 / 1 Knapsack

0 / 1 Knapsack
• If S^(i+1) contains two pairs (Pj, Wj) and (Pk, Wk)
with values Pj ≤ Pk and Wj ≥ Wk,
• then the pair (Pj, Wj) can be discarded.
• This is called the dominance rule (purging rule).
0 / 1 Knapsack

Generate the sets S^i, 0 ≤ i ≤ 4, when
(w1, w2, w3, w4) = (10, 15, 6, 9),
(p1, p2, p3, p4) = (2, 5, 8, 1),
and m = 25.
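The S^i generation with the dominance rule can be sketched as follows; on the instance above it reports a maximum profit of 13 (items 2 and 3, weight 21 ≤ 25).

```python
def knapsack_sets(p, w, m):
    """0/1 knapsack by the set method: S^i holds the undominated
    (profit, weight) pairs over subsets of the first i items."""
    S = [(0, 0)]
    for pi, wi in zip(p, w):
        # S1_i: add item i to every pair that still fits in the knapsack
        S1 = [(pp + pi, ww + wi) for pp, ww in S if ww + wi <= m]
        # merge and purge: (Pj, Wj) is discarded when a kept pair (Pk, Wk)
        # has Pk >= Pj and Wk <= Wj (the dominance rule)
        merged = sorted(S + S1, key=lambda t: (t[1], -t[0]))
        S, best = [], -1
        for pp, ww in merged:
            if pp > best:           # keep only pairs that improve profit
                S.append((pp, ww))
                best = pp
    return max(pp for pp, ww in S)

print(knapsack_sets((2, 5, 8, 1), (10, 15, 6, 9), 25))  # 13
```

After purging, each S^i is strictly increasing in both profit and weight, which is what keeps the sets small.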
Reliability Design
• s0 = {(1, 0)} – the initial set, before adding any device
• Add 1 device of type D1:
• s1,1 = {(0.9, 30)}
• Add 2 devices of type D1:
• reliability of 2 devices of type D1 connected in
parallel:
• 1 − (1 − r)^m = 1 − (1 − 0.9)^2 = 1 − 0.1^2 = 0.99
• s1,2 = {(0.99, 60)}
• s1 = {(0.9, 30), (0.99, 60)}
Reliability Design
• Add 1 device of type D2:
• s2,1 = {(0.72, 45), (0.792, 75)}
• Add 2 devices of type D2:
• reliability of 2 devices of type D2 connected in
parallel:
• 1 − (1 − r)^m = 1 − (1 − 0.8)^2 = 1 − 0.2^2 = 0.96
• s2,2 = {(0.864, 60), (0.9504, 90)}
Reliability Design
• Add 3 devices of type D2:
• reliability of 3 devices of type D2 connected in
parallel:
• 1 − (1 − r)^m = 1 − (1 − 0.8)^3 = 1 − 0.2^3 = 0.992
• s2,3 = {(0.8928, 75)}
• s2 = {(0.72, 45), (0.792, 75), (0.864, 60),
(0.9504, 90), (0.8928, 75)}
• After purging: s2 = {(0.72, 45), (0.864, 60), (0.8928, 75)}
Reliability Design
• Add 1 device of type D3:
• s3,1 = {(0.36, 65), (0.432, 80), (0.4464, 95)}
• Add 2 devices of type D3:
• reliability of 2 devices of type D3 connected in
parallel:
• 1 − (1 − r)^m = 1 − (1 − 0.5)^2 = 1 − 0.5^2 = 0.75
• s3,2 = {(0.54, 85), (0.648, 100)}
Reliability Design
• Add 3 devices of type D3:
• reliability of 3 devices of type D3 connected in
parallel:
• 1 − (1 − r)^m = 1 − (1 − 0.5)^3 = 1 − 0.5^3 = 0.875
• s3,3 = {(0.63, 105)}
• s3 = {(0.36, 65), (0.432, 80), (0.54, 85),
(0.648, 100)}
• Optimal: (0.648, 100)
• Allocation: 1 copy of D1, 2 of D2, 2 of D3
Reliability Design- Example 2
• c1 = 40, c2 = 25, c3 = 35
• r1 = 0.75, r2 = 0.85, r3 = 0.6
• u1 = 3, u2 = 3, u3 = 2
• c = 175
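The set method above can be sketched as follows. The device data for the first worked example is a reconstruction implied by its numbers (r = (0.9, 0.8, 0.5), c = (30, 15, 20), budget 105, values the slides do not state directly); a copy count m is feasible only if enough budget remains to buy one copy of each later device.

```python
def reliability_design(r, c, budget):
    """S^i: undominated (reliability, cost) pairs after fixing the
    number of parallel copies of devices 1..i."""
    n = len(r)
    tail = [sum(c[i + 1:]) for i in range(n)]   # min cost of later stages
    S = [(1.0, 0)]
    for i in range(n):
        cand, m = [], 1
        while True:
            rel = 1 - (1 - r[i]) ** m           # m copies in parallel
            added = [(f * rel, cost + m * c[i]) for f, cost in S
                     if cost + m * c[i] + tail[i] <= budget]
            if not added:                       # no feasible pair left
                break
            cand += added
            m += 1
        # purge dominated pairs: keep strictly increasing reliability
        cand.sort(key=lambda t: (t[1], -t[0]))
        S, best = [], -1.0
        for f, cost in cand:
            if f > best:
                S.append((f, cost))
                best = f
    return max(S)

print(reliability_design((0.9, 0.8, 0.5), (30, 15, 20), 105))  # ≈ (0.648, 100)
```

The same function run on the Example 2 data ((0.75, 0.85, 0.6), (40, 25, 35), 175) produces the sets for that exercise.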
• Till now we have seen how the dynamic
programming approach applies to
subset-selection problems.

• Now we will see a permutation problem with
n! complexity.
Traveling Salesperson Problem
• Statement: Let G=(V, E) be a directed graph with
edge costs cij.
• cij > 0, and cij = ∞ if <i, j> ∉ E.
• A tour of G is a directed simple cycle that
includes every vertex in V.
• The cost of a tour is the sum of the costs of the
edges on the tour.
Traveling Salesperson Problem
• The traveling salesperson problem is to find a
tour of minimum cost.

Traveling Salesperson Problem: Applications
• A postal van picking up mail from postal
mail boxes.
Traveling Salesman Problem
• g(2, Φ) = C21 = 5
• g(3, Φ) = C31 = 6
• g(4, Φ) = C41 = 8

• g(2, {3}) = C23 + g(3, Φ) = 9+6 = 15
• g(2, {4}) = C24 + g(4, Φ) = 10+8 = 18
• g(3, {2}) = C32 + g(2, Φ) = 13+5 = 18
• g(3, {4}) = C34 + g(4, Φ) = 12+8 = 20
• g(4, {2}) = C42 + g(2, Φ) = 8+5 = 13
• g(4, {3}) = C43 + g(3, Φ) = 9+6 = 15
Traveling Salesman Problem
• g(2, {3, 4})
= min{C23 + g(3, {4}), C24 + g(4, {3})}
= min{9+20, 10+15} = min{29, 25} = 25

• g(3, {2, 4})
= min{C32 + g(2, {4}), C34 + g(4, {2})}
= min{13+18, 12+13} = min{31, 25} = 25

• g(4, {2, 3})
= min{C42 + g(2, {3}), C43 + g(3, {2})}
= min{8+15, 9+18} = min{23, 27} = 23
Traveling Salesman Problem
• g(1, {2, 3, 4})
= min{C12 + g(2, {3, 4}),
C13 + g(3, {2, 4}),
C14 + g(4, {2, 3})}
= min{10+25, 15+25, 20+23}
= min{35, 40, 43}
= 35
• v1 = 1, v2 = 2, v3 = 4, v4 = 3, v5 = 1
• The optimal tour is 1, 2, 4, 3, 1.
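The g(i, S) recursion above (the Held–Karp algorithm) can be sketched in Python; the matrix is the one implied by the worked example, with vertices renumbered 0–3.

```python
from itertools import combinations

def tsp(C):
    """Held-Karp: g(i, S) = min over j in S of {C[i][j] + g(j, S - {j})},
    with g(i, {}) = C[i][0]; the tour starts and ends at vertex 0."""
    n = len(C)
    g = {(i, frozenset()): C[i][0] for i in range(1, n)}
    J = {}                                  # J[i, S]: the j chosen at (i, S)
    for size in range(1, n - 1):
        for S in combinations(range(1, n), size):
            S = frozenset(S)
            for i in range(1, n):
                if i in S:
                    continue
                g[i, S], J[i, S] = min(
                    (C[i][j] + g[j, S - {j}], j) for j in S)
    full = frozenset(range(1, n))
    best, first = min((C[0][j] + g[j, full - {j}], j) for j in full)
    tour, S, v = [0, first], full - {first}, first
    while S:                                # follow the recorded decisions
        v = J[v, S]
        tour.append(v)
        S = S - {v}
    return best, tour + [0]

C = [[0, 10, 15, 20],
     [5, 0, 9, 10],
     [6, 13, 0, 12],
     [8, 8, 9, 0]]

print(tsp(C))  # (35, [0, 1, 3, 2, 0]), i.e. the tour 1, 2, 4, 3, 1
```

This runs in O(n^2 · 2^n) time, far below the n! of brute-force enumeration, though the 2^n table still limits it to small n.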
Flow Shop Scheduling

