Week9_GreedyAlgorithms

The document discusses greedy algorithms, focusing on their application to optimization problems such as activity selection, job scheduling, and interval partitioning. It explains the concept of making locally optimal choices in hopes of achieving a globally optimal solution, while also highlighting that greedy algorithms do not always guarantee optimal results. Additionally, it includes proofs of optimality for various greedy algorithm applications, emphasizing the importance of making choices that do not rule out optimal solutions.

BMI2224 Algorithms

Week 9: Greedy Algorithms

Asst. Prof. Dr. Arzum Karataş

2025-Spring
Recall: Last Lecture
• Dynamic programming is applied to optimization problems. Such problems can have many possible solutions; each solution has a value, and we wish to find a solution with the optimal (minimum or maximum) value.
• A dynamic-programming algorithm solves each subsubproblem just once and then saves its answer in a table, thereby avoiding the work of recomputing the answer every time it solves each subsubproblem.
Recall: Last Lecture(2)
Some problems solved with dynamic programming
• Rod Cutting
• Assembly Line Scheduling
• Longest Common Subsequence
• Matrix Chain Multiplication

Agenda
• Examples of greedy algorithms:
• Activity Selection
• Job Scheduling
• Interval Scheduling
• Interval Partitioning
• Huffman Coding (to be discussed later)

Greedy Algorithms
Similar to dynamic programming, but a simpler approach
◦ Also used for optimization problems

Idea: When we have a choice to make, make the one that looks best right now
◦ Make a locally optimal choice in the hope of getting a globally optimal solution

Greedy algorithms don’t always yield an optimal solution
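As a hypothetical illustration (not from the slides) of how locally optimal choices can fail globally, consider greedy coin change with the made-up denomination set {1, 3, 4}:

```python
# Hypothetical example: the greedy choice can fail.
# For denominations {1, 3, 4} and amount 6, always taking the largest
# coin that fits gives 4 + 1 + 1 (3 coins), but the optimum is 3 + 3 (2 coins).

def greedy_coin_count(coins, amount):
    """Repeatedly take the largest coin that still fits."""
    count = 0
    for c in sorted(coins, reverse=True):
        while amount >= c:
            amount -= c
            count += 1
    return count

print(greedy_coin_count([1, 3, 4], 6))  # 3, although 3 + 3 uses only 2 coins
```

For some problems, however (like the ones in this lecture), the greedy choice provably never rules out an optimal solution.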
Activity Selection Problem

Example: Activity Selection

[Figure: a timeline of overlapping activities — CS 161 Class, Math 51 Class, CS 161 Section, Sleep, Frisbee Practice, CS 161 Office Hours, Theory Seminar, Orchestra, Programming team meeting, CS 166 Class, Underwater basket weaving class, CS110 Class, CS161 study group, Swimming lessons, Theory Lunch, Combinatorics Seminar, Social activity.]

You can only do one activity at a time, and you want to maximize the number of activities that you do. What to choose?
Activity Selection Problem
• Input:
• Activities a1, a2, …, an
• Start times s1, s2, …, sn
• Finish times f1, f2, …, fn

• Output:
• How many activities can you do today?
Greedy Algorithm for Activity Selection

[Figure: activities a1–a7 on a timeline.]

• Pick the activity you can add that has the smallest finish time.
• Include it in your activity list.
• Repeat.
That seems like a reasonable thing to do…
• Running time:
• O(n) if the activities are already sorted by finish time.
• Otherwise O(n log n), since you have to sort them first.
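The greedy rule above can be sketched as a short Python function (a sketch; the function name and the sample activities are our own):

```python
# Sketch of the slides' greedy rule: sort activities by finish time, then
# repeatedly take the first activity that starts after the last selected
# one finishes. Sorting dominates the cost, giving O(n log n) overall.

def select_activities(activities):
    """activities: list of (start, finish) pairs.
    Returns a maximum-size subset of pairwise non-overlapping activities."""
    selected = []
    last_finish = float("-inf")
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:          # compatible with the selection so far
            selected.append((start, finish))
            last_finish = finish
    return selected

acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11)]
print(select_activities(acts))  # [(1, 4), (5, 7), (8, 11)]
```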
This is an example of a greedy algorithm

• At each step in the algorithm, make a choice.
• Hey, I can increase my activity set by one,
• And leave lots of room for future choices,
• Let’s do that and hope for the best!!!

• Hope that at the end of the day, this results in a globally optimal solution.
Recall: Sub-problem graph view
• Divide-and-conquer:

[Diagram: the big problem splits into two sub-problems, each of which splits into sub-sub-problems.]
Recall: Sub-problem graph view (2)
• Dynamic Programming:

[Diagram: the big problem depends on several sub-problems, which share overlapping sub-sub-problems.]
Recall: Sub-problem graph view (3)
• Greedy algorithms:

[Diagram: the big problem reduces to a single sub-problem, which reduces to a single sub-sub-problem.]
Recall: Sub-problem graph view (4)
• Greedy algorithms:

[Diagram: a single chain — big problem → sub-problem → sub-sub-problem.]

• Not only is there optimal sub-structure:
• optimal solutions to a problem are made up from optimal solutions of sub-problems
• but each problem depends on only one sub-problem.
What have we learned?
• In order to come up with a greedy algorithm, we:
• Made a series of choices
• Proved that our choices will never rule out an optimal solution.
• Concluded that our solution at the end is optimal.
Let’s see a few more examples
Interval Scheduling (2)

Interval Scheduling
• Proof of optimality by contradiction
➢ Suppose for contradiction that greedy is not optimal
➢ Say greedy selects jobs 𝑖1, 𝑖2, … , 𝑖𝑘 sorted by finish time
➢ Consider an optimal solution 𝑗1, 𝑗2, … , 𝑗𝑚 (also sorted by finish time) which matches greedy for as many indices as possible
o That is, we want 𝑗1 = 𝑖1, … , 𝑗𝑟 = 𝑖𝑟 for the greatest possible 𝑟
➢ Both 𝑖𝑟+1 and 𝑗𝑟+1 must be compatible with the previous selection (𝑖1 = 𝑗1, … , 𝑖𝑟 = 𝑗𝑟)

Interval Scheduling
• Proof of optimality by contradiction
➢ Consider a new solution 𝑖1, 𝑖2, … , 𝑖𝑟, 𝑖𝑟+1, 𝑗𝑟+2, … , 𝑗𝑚
o We have replaced 𝑗𝑟+1 by 𝑖𝑟+1 in our reference optimal solution
o This is still feasible because 𝑓𝑖𝑟+1 ≤ 𝑓𝑗𝑟+1 ≤ 𝑠𝑗𝑡 for all 𝑡 ≥ 𝑟 + 2 (greedy picks the compatible job with the earliest finish time, so 𝑖𝑟+1 finishes no later than 𝑗𝑟+1)
o This is still optimal because 𝑚 jobs are selected
o But it matches the greedy solution in 𝑟 + 1 indices
• This is the desired contradiction

Interval Scheduling
• Proof of optimality by induction
➢ Let 𝑆𝑗 be the subset of jobs picked by greedy after considering the first 𝑗 jobs in the increasing order of finish time
o Define 𝑆0 = ∅
➢ We call this partial solution promising if there is a way to extend it to an optimal solution by picking some subset of jobs 𝑗 + 1, … , 𝑛
o ∃𝑇 ⊆ {𝑗 + 1, … , 𝑛} such that 𝑂𝑗 = 𝑆𝑗 ∪ 𝑇 is optimal
➢ Inductive claim: For all 𝑡 ∈ {0, 1, … , 𝑛}, 𝑆𝑡 is promising
➢ If we prove this, then we are done!
o For 𝑡 = 𝑛, if 𝑆𝑛 is promising, then it must be optimal (Why?)
o We chose 𝑡 = 0 as our base case since it is “trivial”

Interval Scheduling
• Proof of optimality by induction
➢ 𝑆𝑗 is promising if ∃𝑇 ⊆ {𝑗 + 1, … , 𝑛} such that 𝑂𝑗 = 𝑆𝑗 ∪ 𝑇 is optimal
➢ Inductive claim: For all 𝑡 ∈ {0, 1, … , 𝑛}, 𝑆𝑡 is promising
➢ Base case: For 𝑡 = 0, 𝑆0 = ∅ is clearly promising
o Any optimal solution extends it
➢ Induction hypothesis: Suppose the claim holds for 𝑡 = 𝑗 − 1 and optimal solution 𝑂𝑗−1 extends 𝑆𝑗−1
➢ Induction step: At 𝑡 = 𝑗, we have two possibilities:
1) Greedy did not select job 𝑗, so 𝑆𝑗 = 𝑆𝑗−1
• Job 𝑗 must conflict with some job in 𝑆𝑗−1
• Since 𝑆𝑗−1 ⊆ 𝑂𝑗−1, 𝑂𝑗−1 also cannot include job 𝑗
• 𝑂𝑗 = 𝑂𝑗−1 also extends 𝑆𝑗 = 𝑆𝑗−1

Interval Scheduling
• Proof of optimality by induction
➢ Induction step: At 𝑡 = 𝑗, we have two possibilities:
2) Greedy selected job 𝑗, so 𝑆𝑗 = 𝑆𝑗−1 ∪ {𝑗}
• Consider the earliest job 𝑟 in 𝑂𝑗−1 ∖ 𝑆𝑗−1
• Consider 𝑂𝑗 obtained by replacing 𝑟 with 𝑗 in 𝑂𝑗−1
• Prove that 𝑂𝑗 is still feasible
• 𝑂𝑗 extends 𝑆𝑗, as desired!

[Figure: greedy selects job 𝑗 immediately after 𝑆𝑗−1; 𝑟 is the earliest job in 𝑂𝑗−1 ∖ 𝑆𝑗−1.]
Contradiction vs Induction
• Both methods make the same claim
➢ “The greedy solution after 𝑗 iterations can be extended to an optimal solution, ∀𝑗”

• They also use the same key argument
➢ “If the greedy solution after 𝑗 iterations can be extended to an optimal solution, then the greedy solution after 𝑗 + 1 iterations can be extended to an optimal solution as well”

➢ For proof by induction, this is the key induction step
➢ For proof by contradiction, we take the greatest 𝑗 for which the greedy solution can be extended to an optimal solution, and derive a contradiction by extending the greedy solution after 𝑗 + 1 iterations

Let’s look at Interval Partitioning

Interval Partitioning

• Problem
➢ Job 𝑗 starts at time 𝑠𝑗 and finishes at time 𝑓𝑗
➢ Two jobs are compatible if they don’t overlap
➢ Goal: group jobs into fewest partitions such that jobs in the same partition are compatible.

Interval Partitioning
• Think of scheduling lectures for various courses into as few classrooms as possible
• This schedule uses 4 classrooms for scheduling 10 lectures

[Figure: a 4-classroom schedule of the 10 lectures.]

Interval Partitioning
• Think of scheduling lectures for various courses into as few classrooms as possible
• This schedule uses 3 classrooms for scheduling 10 lectures

[Figure: a 3-classroom schedule of the same 10 lectures.]

Interval Partitioning
• Let’s go back to the greedy template!
➢ Go through lectures in some “natural” order
➢ Assign each lecture to an (arbitrary?) compatible classroom, and create a new classroom if the lecture conflicts with every existing classroom

• Order of lectures?
➢ Earliest start time: ascending order of 𝑠𝑗
➢ Earliest finish time: ascending order of 𝑓𝑗
➢ Shortest interval: ascending order of 𝑓𝑗 − 𝑠𝑗
➢ Fewest conflicts: ascending order of 𝑐𝑗, where 𝑐𝑗 is the number of remaining jobs that conflict with 𝑗

Interval Partitioning

• When each lecture is assigned to an arbitrary compatible classroom, three of these orderings do not work.

• The earliest-start-time ordering works! (The optimality proof later relies on lectures being sorted by start time.)
Interval Partitioning
• Running time
➢ Key step: check if the next lecture can be scheduled at some classroom
➢ Store classrooms in a priority queue
o key = latest finish time of any lecture in the classroom
➢ Is lecture 𝑗 compatible with some classroom?
o Same as “Is 𝑠𝑗 at least as large as the minimum key?”
o If yes: add lecture 𝑗 to classroom 𝑘 with minimum key, and increase its key to 𝑓𝑗
o Otherwise: create a new classroom, add lecture 𝑗, set key to 𝑓𝑗
➢ 𝑂(𝑛) priority queue operations, 𝑂(𝑛 log 𝑛) total time

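The priority-queue scheme can be sketched in Python with the standard-library heapq module (a sketch; the function name and the sample lecture times are our own, chosen so that 10 lectures fit into 3 classrooms as in the earlier example):

```python
import heapq

# Classrooms live in a min-heap keyed by the latest finish time of any
# lecture in that classroom. Lecture j is compatible with some classroom
# iff s_j >= the minimum key. Lectures are processed in start-time order.

def partition_intervals(lectures):
    """lectures: list of (start, finish) pairs.
    Returns the number of classrooms used by the greedy algorithm."""
    finish_heap = []                                 # one key per classroom
    for start, finish in sorted(lectures):           # ascending start time
        if finish_heap and finish_heap[0] <= start:
            heapq.heapreplace(finish_heap, finish)   # reuse the classroom with min key
        else:
            heapq.heappush(finish_heap, finish)      # open a new classroom
    return len(finish_heap)

# Hypothetical lectures (half-open intervals, times in hours).
lectures = [(9, 10.5), (9, 12.5), (9, 10.5), (11, 12.5), (11, 14),
            (13, 14.5), (13, 14.5), (14, 16.5), (15, 16.5), (15, 16.5)]
print(partition_intervals(lectures))  # 3
```

Each of the n lectures triggers one O(log n) heap operation, matching the O(n log n) bound above (after the O(n log n) sort).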
Interval Partitioning
• Proof of optimality (lower bound)
➢ # classrooms needed ≥ “depth”
o depth = maximum number of lectures running at any time
o Recall, as before, that job 𝑖 runs in [𝑠𝑖, 𝑓𝑖)
➢ Claim: our greedy algorithm uses only this many classrooms!

Interval Partitioning
• Proof of optimality (upper bound)
➢ Let 𝑑 = # classrooms used by greedy
➢ Classroom 𝑑 was opened because there was a lecture 𝑗 which was incompatible with some lecture already scheduled in each of the 𝑑 − 1 other classrooms
➢ All these 𝑑 lectures end after 𝑠𝑗
➢ Since we sorted by start time, they all start at/before 𝑠𝑗
➢ So, at time 𝑠𝑗, we have 𝑑 mutually overlapping lectures
➢ Hence, depth ≥ 𝑑 = # classrooms used by greedy ∎
➢ Note: before, we proved that # classrooms used by any algorithm (including greedy) ≥ depth, so greedy uses exactly as many classrooms as the depth.

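The depth used in the lower bound can be computed with a simple sweep over start/finish events (our own sketch; it assumes half-open intervals [s, f), so a lecture finishing at time t frees a room for one starting at t):

```python
# Sketch of the "depth" computation: sort start/finish events and sweep,
# tracking how many lectures run simultaneously.

def depth(intervals):
    """intervals: list of (start, finish) pairs, treated as [s, f).
    Returns the maximum number of intervals overlapping at any time."""
    events = []
    for s, f in intervals:
        events.append((s, 1))    # a lecture starts: +1 running
        events.append((f, -1))   # a lecture finishes: -1 running
    # At equal times, process finishes (-1) before starts (+1).
    events.sort(key=lambda e: (e[0], e[1]))
    running = best = 0
    for _, delta in events:
        running += delta
        best = max(best, running)
    return best

print(depth([(0, 3), (2, 5), (4, 7), (1, 6)]))  # 3
```

By the two bounds above, this value equals the number of classrooms the greedy algorithm opens.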