ALGO Dynamic Programming Algorithm (Slide 1)

Dynamic programming (DP) is an optimization technique used for problems that satisfy the principle of optimality, where optimal decisions at each stage lead to an overall optimal solution. It contrasts with the greedy method, which may not yield the best solution in certain cases, particularly in multistage graphs. The advantages of DP include systematic problem-solving, avoiding exhaustive searches, and storing intermediate solutions for efficiency.


Dynamic Programming

Rifat Ahommed
Lecturer
CSE, SEU

Dynamic Programming (DP)
◼ Dynamic programming is typically applied to optimization problems.
◼ Problems that can be solved by dynamic programming satisfy the principle of optimality.
Principle of optimality
◼ Suppose that in solving a problem, we have to make a sequence of decisions D1, D2, …, Dn−1, Dn.
◼ If this sequence of decisions D1, D2, …, Dn−1, Dn is optimal, then the last k decisions, 1 ≤ k ≤ n, must be optimal under the condition caused by the first n−k decisions.
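◼ Example (shortest paths, as on the later slides): if an optimal path from S to T passes through a vertex B, then the B-to-T portion of that path must itself be an optimal B-to-T path; otherwise, splicing in a cheaper B-to-T path would shorten the whole path, contradicting its optimality.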
Dynamic method vs. Greedy method
◼ Comparison: In the greedy method, each decision is locally optimal.
◼ The hope is that these locally optimal decisions add up to a globally optimal solution; the following slides show this is not always the case.
The Greedy Method
◼ E.g. Find a shortest path from v0 to v3.
◼ The greedy method can solve this problem.
◼ The shortest path: 1 + 2 + 4 = 7.

The Greedy Method
◼ E.g. Find a shortest path from v0 to v3 in the multistage graph.
◼ Greedy method: v0 → v1,2 → v2,1 → v3 = 23
◼ Optimal: v0 → v1,1 → v2,2 → v3 = 7
◼ The greedy method does not work for this problem.
◼ This is because decisions at different stages influence one another.
Multistage graph
◼ A multistage graph G = (V, E) is a directed graph in which the vertices are partitioned into k ≥ 2 disjoint sets Vi, 1 ≤ i ≤ k.
◼ In addition, if <u, v> is an edge in E, then u ∈ Vi and v ∈ Vi+1 for some i, 1 ≤ i < k.
◼ The sets V1 and Vk are such that |V1| = |Vk| = 1.
◼ The multistage graph problem is to find a minimum-cost path from s in V1 to t in Vk.
◼ Each set Vi defines a stage in the graph.
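A minimal sketch of how such a graph could be represented in Python (the dictionary-of-dictionaries layout and the name graph are illustrative choices, not from the slides). The vertices and edge costs are those of the four-stage example used on the following slides:

# Stages: {S}, {A, B, C}, {D, E, F}, {T}; graph[u][v] is the cost of edge (u, v).
graph = {
    'S': {'A': 1, 'B': 2, 'C': 5},
    'A': {'D': 4, 'E': 11},
    'B': {'D': 9, 'E': 5, 'F': 16},
    'C': {'F': 2},
    'D': {'T': 18},
    'E': {'T': 13},
    'F': {'T': 2},
    'T': {},
}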
Greedy Method vs. Multistage graph
◼ E.g. [Figure: a four-stage graph with source S; second-stage vertices A, B, C; third-stage vertices D, E, F; and sink T. Edge costs: S→A=1, S→B=2, S→C=5, A→D=4, A→E=11, B→D=9, B→E=5, B→F=16, C→F=2, D→T=18, E→T=13, F→T=2.]
◼ The greedy method cannot be applied to this case: S → A → D → T, 1 + 4 + 18 = 23.
◼ The shortest path is: S → C → F → T, 5 + 2 + 2 = 9.
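A short sketch of the greedy choice described above, using the graph dictionary from the earlier sketch (the function name greedy_path is illustrative, not from the slides). At each step it follows the locally cheapest outgoing edge, which reproduces the suboptimal path S → A → D → T of cost 23:

def greedy_path(graph, source, target):
    """Repeatedly follow the locally cheapest outgoing edge until target is reached."""
    path, cost, v = [source], 0, source
    while v != target:
        nxt = min(graph[v], key=graph[v].get)  # locally optimal decision at v
        cost += graph[v][nxt]
        path.append(nxt)
        v = nxt
    return path, cost

print(greedy_path(graph, 'S', 'T'))  # (['S', 'A', 'D', 'T'], 23) -- not the optimum, which is 9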
Dynamic Programming
◼ Dynamic programming approach: [Figure: from S, the path to T goes through A, B, or C, with first-edge costs 1, 2, 5 and remaining distances d(A, T), d(B, T), d(C, T).]
◼ d(S, T) = min{1 + d(A, T), 2 + d(B, T), 5 + d(C, T)}
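A minimal sketch of this recurrence as memoized recursion (the function name min_cost_from and the memo table are my own naming, not from the slides). d(v, T) is computed as the cheapest first edge out of v plus the optimal remainder, and each intermediate result is stored so that every vertex is solved only once:

def min_cost_from(graph, v, target, memo=None):
    """d(v, target): cost of a cheapest path from v to target in a multistage graph."""
    if memo is None:
        memo = {target: 0}      # base case: d(target, target) = 0
    if v not in memo:
        # Principle of optimality: an optimal v-to-target path is some first
        # edge (v, w) followed by an optimal w-to-target path.
        memo[v] = min(cost + min_cost_from(graph, w, target, memo)
                      for w, cost in graph[v].items())
    return memo[v]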
Dynamic Programming
◼ [Figure: the subproblem at A: A reaches T either through D (edge cost 4, then d(D, T)) or through E (edge cost 11, then d(E, T)).]
◼ d(A, T) = min{4 + d(D, T), 11 + d(E, T)} = min{4 + 18, 11 + 13} = 22.
Dynamic Programming
◼ d(B, T) = min{9 + d(D, T), 5 + d(E, T), 16 + d(F, T)} = min{9 + 18, 5 + 13, 16 + 2} = 18.
◼ d(C, T) = min{2 + d(F, T)} = 2 + 2 = 4.
◼ d(S, T) = min{1 + d(A, T), 2 + d(B, T), 5 + d(C, T)} = min{1 + 22, 2 + 18, 5 + 4} = 9.
◼ [Figure: the subproblem at B: B reaches T through D (edge cost 9), E (edge cost 5), or F (edge cost 16), followed by d(D, T), d(E, T), d(F, T).]
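Running the min_cost_from sketch on the example graph dictionary reproduces these values:

memo = {'T': 0}
print(min_cost_from(graph, 'S', 'T', memo))   # 9
print(memo['A'], memo['B'], memo['C'])        # 22 18 4, matching d(A, T), d(B, T), d(C, T)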
The advantages of the dynamic programming approach
◼ To avoid exhaustively searching the entire solution space (candidate solutions that cannot be optimal are eliminated, saving computation).
◼ To solve the problem stage by stage, systematically.
◼ To store intermediate solutions in a table (array) so that they can be retrieved in later stages of the computation.
Comment
◼ If a problem can be described by a multistage graph, then it can be solved by dynamic programming.
THANK YOU

