Algorithm Class Lecture 3
Uploaded by minjoopjjm

CSE 347 Lecture 3

Dynamic Programming

1
We cannot find a good Greedy Choice…

What should we do?


Consider all choices!

2
Example: Edit Distance
Input: 2 sequences over some character set, e.g.
𝑆 = ABCADA
𝑇 = ABADC
Goal: Compute the minimum number of insertions or
deletions needed to convert 𝑆 into 𝑇.
We will call it 𝐸𝑑𝑖𝑡𝐷𝑖𝑠𝑡𝑎𝑛𝑐𝑒(𝑆[1 … 𝑛], 𝑇[1 … 𝑚]),
where 𝑛 and 𝑚 are the lengths of 𝑆 and 𝑇 respectively.

3
Idea: compute difference between the
sequences
𝑆 = ABCADA

𝑇 = ABADC
𝐸𝑑𝑖𝑡 𝐷𝑖𝑠𝑡𝑎𝑛𝑐𝑒(𝑆, 𝑇) =

4
Greedy Choice does not work!

5
Branching Algorithm

Example:
𝑆 = ABCADA

𝑇 = ABADC

6
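The branching idea can be sketched as a plain recursive function. This is a minimal sketch, not the course's official pseudocode; the function name is my own. At each step it considers the available choices: match equal leading characters (always safe for insertion/deletion-only distance), or branch on deleting from 𝑆 versus inserting the next character of 𝑇.

```python
def edit_distance(S, T):
    """Branching (brute-force) edit distance, insertions/deletions only."""
    if not S:
        return len(T)  # insert all remaining characters of T
    if not T:
        return len(S)  # delete all remaining characters of S
    if S[0] == T[0]:
        # Matching equal first characters costs nothing and is always safe.
        return edit_distance(S[1:], T[1:])
    # Otherwise, consider all choices and take the cheaper one.
    return 1 + min(
        edit_distance(S[1:], T),  # delete S[0]
        edit_distance(S, T[1:]),  # insert T[0] in front of S
    )
```

On the lecture's example, `edit_distance("ABCADA", "ABADC")` evaluates to 3. Without memoization the branching is exponential, which is exactly the complexity problem the later slides address.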
Example of subproblem organization
{A, B, C}, {D, B, C}

7
Correctness Proof Outline
Greedy Choice Property
Complete Choice Property: The optimal solution makes one
of the choices that we consider
Inductive Structure: Once you make any choice, you are left
with a smaller problem of the same type. Any first choice +
feasible solution to the subproblem = feasible solution to the
entire problem.
Optimal Substructure: If we optimally solve the subproblem
for a particular choice 𝒄, and combine it with 𝑐, resulting
solution is the optimal solution that makes choice 𝒄.

8
Correctness Proof
Claim: For any problem 𝑃, the branching algorithm finds the
optimal solution.
Proof: Induct on problem size
Base Case: |𝑆| = 0 or |𝑇| = 0, obvious
Inductive Case:

9
It is correct, but what about
complexity?
Let’s assume that 𝑛 = 𝑚
𝑇(𝑛, 𝑛) =

𝑇(𝑛, 𝑛) =

10
How could we reduce the complexity?
There are overlapping subproblems that we compute
more than once! Number of distinct subproblems is
polynomial, we can share the solutions that we have
already computed! What about this:

11
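The sharing idea on this slide can be sketched by memoizing the recursion on suffix indices (𝑖, 𝑗), so each of the O(𝑛𝑚) distinct subproblems is solved once. A minimal sketch with names of my own choosing:

```python
from functools import lru_cache

def edit_distance_memo(S, T):
    """Memoized edit distance: same recursion, but each distinct
    subproblem (i, j) -- suffixes S[i:] and T[j:] -- is solved once."""

    @lru_cache(maxsize=None)
    def ed(i, j):
        if i == len(S):
            return len(T) - j        # insert the rest of T
        if j == len(T):
            return len(S) - i        # delete the rest of S
        if S[i] == T[j]:
            return ed(i + 1, j + 1)  # characters match: no edit needed
        return 1 + min(ed(i + 1, j),   # delete S[i]
                       ed(i, j + 1))   # insert T[j]

    return ed(0, 0)
```

There are only (𝑛+1)·(𝑚+1) possible (𝑖, 𝑗) pairs, so the cache bounds the total work by O(𝑛𝑚).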
How could we reduce the complexity?

# subproblems:
Store it in a 2D table of size approximately 𝑛×𝑚, indexed by 𝑖 and 𝑗

What is the above pseudocode missing?

12
Dynamic Programming Table

13
Algorithm
Base Case(s):

Compute Rule:

Order of computation:

14
Corrected Algorithm
Base Case(s):

Compute Rule:

Order of computation:

15
How to compute the table?

A B C A D A
A
B
A
D
C

16
What is the new runtime?
𝑇 𝑛, 𝑚 =

17
Example 2: Weighted Interval
Scheduling (IS)
Input:
𝑃 = {𝑝₁, 𝑝₂, … , 𝑝ₙ}
𝑝ᵢ = (𝑠ᵢ, 𝑓ᵢ, 𝒘ᵢ)
𝑠ᵢ is the start time, 𝑓ᵢ is the finish time
𝑤ᵢ is the weight of job 𝑖
Goal: Pick a set of non-overlapping intervals Π such
that ∑_{𝑝ᵢ∈Π} 𝑤ᵢ (the sum of weights) is maximized

18
Idea of solving the problem
The intervals are, for now, ordered arbitrarily. Look at the last
interval 𝑝ₙ:

𝐼𝑆(𝑝₁, 𝑝₂, … , 𝑝ₙ)
= max { 𝐼𝑆(𝑝₁, 𝑝₂, … , 𝑝ₙ₋₁), if 𝑝ₙ is not picked
        𝑤ₙ + 𝐼𝑆(𝑃 − intervals overlapping with 𝑝ₙ), if you pick 𝑝ₙ }

19
Decision Visualization
𝐼𝑆(𝑝₁, 𝑝₂, … , 𝑝ₙ)
= max { 𝐼𝑆(𝑝₁, 𝑝₂, … , 𝑝ₙ₋₁), if 𝑝ₙ is not picked
        𝑤ₙ + 𝐼𝑆(𝑃 − intervals overlapping with 𝑝ₙ), if you pick 𝑝ₙ }

Πₙ = best solution for all intervals up to 𝑛.

𝑃̂ₙ = problem without 𝑝ₙ and all intervals that overlap with 𝑝ₙ.
𝐼𝑆(𝑃ₙ) = max{𝑤ₙ + 𝐼𝑆(𝑃̂ₙ), 𝐼𝑆(𝑃ₙ₋₁)}
Problem: How could we organize the subproblems 𝑃̂ₙ?
20
Again, we can order by finish time!
How does that help us work on the problem?
[Diagram: five intervals on a timeline, labeled 1–5 in order of finish time]

𝐼𝑆(𝑝₁, … , 𝑝₅) =

21
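With the intervals sorted by finish time, the recurrence becomes easy to organize: when we pick job 𝑘, the compatible subproblem is exactly the jobs finishing no later than 𝑠ₖ, which a binary search finds. A minimal sketch under that ordering (function and variable names are my own):

```python
import bisect

def weighted_interval_scheduling(jobs):
    """Weighted interval scheduling DP.

    jobs: list of (start, finish, weight) tuples. After sorting by finish
    time, IS[k] = max total weight achievable using only the first k jobs:
        IS[k] = max(IS[k-1],            # skip job k
                    w_k + IS[p(k)])     # pick job k
    where p(k) = number of jobs finishing no later than job k starts,
    found by binary search over the sorted finish times.
    """
    jobs = sorted(jobs, key=lambda job: job[1])
    finishes = [f for _, f, _ in jobs]
    n = len(jobs)
    IS = [0] * (n + 1)
    for k in range(1, n + 1):
        s, f, w = jobs[k - 1]
        p = bisect.bisect_right(finishes, s)  # jobs compatible with job k
        IS[k] = max(IS[k - 1], w + IS[p])
    return IS[n]
```

Sorting costs O(𝑛 log 𝑛) and each of the 𝑛 table entries takes O(log 𝑛) for the binary search, so the whole algorithm runs in O(𝑛 log 𝑛).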
Table Organization

𝐼𝑆[𝑛] depends on

Time Complexity:
22
Conclusions
• Dynamic Programming is a very powerful algorithmic design
technique.
• Idea: You cannot find a greedy choice, but even when you
consider all possible choices, the total number of distinct
subproblems is polynomial.
• Hallmark: Subproblems are used over and over again so the
order of computation is important.
• The important consideration is how to organize the solutions to
subproblems --- generally in some sort of table --- so you can
access them when you need them.

23
