Algos HW4

The document discusses several greedy algorithms and their optimality. It analyzes greedy algorithms for interval scheduling, job assignment, coin change, and interval covering problems. It provides examples where greedy algorithms fail to find optimal solutions and discusses alternative dynamic programming approaches.

1) One approach to solving this efficiently is a greedy algorithm.

a) Create a list of interval endpoints, pairing each endpoint with a flag to indicate
whether it's a left endpoint or a right endpoint.
b) Sort these endpoints in ascending order.
c) Initialize an empty set `Y` to store the intervals that will constitute the smallest
cover.
d) Iterate through the sorted endpoints. For each endpoint:
i) If it is a left endpoint and the point is not yet covered by any interval in `Y`, add the
corresponding interval to `Y`.
ii) If it is a right endpoint and the corresponding interval covers an interval already in `Y`,
remove that covered interval from `Y`.
e) The set `Y` will contain the smallest cover for the given set of intervals.
f) The time complexity of this algorithm is O(n log n).
g) Proof of correctness:
i) Can be proved through contradiction (an exchange argument):
(1) Suppose some optimal cover differs from the greedy solution, and consider the
first step at which it does not choose the interval with the rightmost endpoint
among the valid choices. Replacing its choice at that step with the interval
reaching furthest to the right either maintains or decreases the size of the
cover, since the replacement covers every point covered by the replaced interval
and possibly more. Repeating this exchange shows that the greedy rule of
selecting the rightmost endpoint at each step constructs a smallest cover.
(A Python sketch of this greedy appears after this proof.)
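The problem statement itself is not included in this excerpt. Assuming the task is to select a
minimum-cardinality subset Y of the given intervals whose union covers the union of all of them, a
minimal Python sketch of the "rightmost endpoint" rule from the proof might look as follows; the
function name and the (left, right) tuple representation are illustrative choices:

def smallest_cover(intervals):
    # Greedily extend coverage with the interval that reaches furthest right.
    if not intervals:
        return []
    intervals = sorted(intervals)                # sort by left endpoint: O(n log n)
    target_end = max(r for _, r in intervals)    # right end of the union
    covered_upto = intervals[0][0]               # left end of the union
    Y = []
    i, n = 0, len(intervals)
    while covered_upto < target_end:
        # Among intervals starting at or before the current frontier,
        # pick the one whose right endpoint is furthest to the right.
        best = None
        while i < n and intervals[i][0] <= covered_upto:
            if best is None or intervals[i][1] > best[1]:
                best = intervals[i]
            i += 1
        if best is None or best[1] <= covered_upto:
            covered_upto = intervals[i][0]       # gap in the union: jump to the next interval
            continue
        Y.append(best)
        covered_upto = best[1]
    return Y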

2) One approach to solving this problem is through a greedy algorithm:


a) Sort both the job sizes (S) and machine cycles (P) arrays in ascending order.
b) Initialize the total mismatch to 0.
c) Iterate through the sorted job sizes and for each job, assign it to the machine with
the closest available cycles.
d) Code (a short usage example follows after this problem):

def minimize_mismatch(jobs, machines):
    # Sort both the job sizes and the machine cycles in ascending order.
    jobs.sort()
    machines.sort()

    total_mismatch = 0

    for i in range(len(jobs)):
        # Find the remaining machine whose cycle count is closest to this job.
        min_mismatch = float('inf')
        min_idx = -1

        for j in range(len(machines)):
            mismatch = abs(jobs[i] - machines[j])
            if mismatch < min_mismatch:
                min_mismatch = mismatch
                min_idx = j

        total_mismatch += min_mismatch
        # Each machine is used at most once, so remove the chosen machine.
        del machines[min_idx]

    # Report the average mismatch per job (len(jobs) is unchanged by the deletions above).
    return total_mismatch / len(jobs)


e) Proof of correctness:
i) Use proof by contradiction.
(1) Suppose there exists an optimal assignment different from the one produced by the
greedy algorithm, i.e., a strictly better solution. However, at each step the
greedy algorithm assigns the current job to the closest machine still available,
so it never passes over a better available option at any step, which contradicts
the assumption that a strictly better solution exists.
f) Time complexity:
i) O(n^2) time
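A quick check of the code in part d), using illustrative values that are not part of the original
assignment:

# Jobs and machines are assumed to have equal length.
print(minimize_mismatch([2, 5, 9], [3, 4, 10]))  # prints 1.0: an average mismatch of one cycle per job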

3a) This algorithm does not always provide an optimal solution.


a) Counterexample:
i) Consider the collection A = {0.1, 0.4, 0.6, 0.9}. If the interval [0.5, 1.5] is chosen, it
covers only the two points 0.6 and 0.9, and at least one further interval is needed for the
remaining points {0.1, 0.4}, so the algorithm uses two or more unit intervals. However, the
single unit interval [0.1, 1.1] covers all four points, since 0.9 - 0.1 = 0.8 ≤ 1. Therefore,
in this case, the algorithm does not produce a minimum-cardinality collection of unit
intervals.

3b) This algorithm does provide an optimal solution:


a) Proof of Correctness:
i) Choosing the Leftmost Point:
1) By choosing the smallest (leftmost) point a_j in A, we ensure that a_j is covered. Any unit
interval that covers a_j must have its right endpoint at most a_j + 1, so [a_j, a_j + 1]
extends as far to the right as any valid choice.
ii) Covering Points:
1) The interval [a_j, a_j + 1] covers the leftmost point and potentially other points to the
right of a_j within [a_j, a_j + 1]. By the observation above, it covers at least as many of
the remaining points as any other unit interval that covers a_j.
iii) Recursion:
1) After adding [a_j, a_j + 1] to S, we recursively continue with the points in A not covered
by that interval. Each step therefore focuses on the remaining uncovered points and covers
the next leftmost uncovered point with the interval that reaches furthest right, keeping
the total number of intervals as small as possible. (A Python sketch of this greedy appears
after the proof.)
b) Can be proved through induction on the number of points:
1) Base Case: If A contains only one point a_j, the algorithm selects the single interval
[a_j, a_j + 1], which covers A and is clearly optimal.
2) Inductive Step: Assume the algorithm produces a minimum-cardinality cover for any
collection of fewer than n points. For n points, the first interval [a_j, a_j + 1] (with
a_j the leftmost point) can be exchanged into any optimal cover without increasing its
size, because some interval in that cover must contain a_j and can be replaced by
[a_j, a_j + 1]. The points not covered by [a_j, a_j + 1] form a smaller collection, which
the algorithm covers optimally by the inductive hypothesis, so the algorithm uses the
minimum number of intervals for all of A.
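A minimal Python sketch of this greedy, assuming the points are given as a list of numbers; the
function name and the (left, right) tuple representation of intervals are illustrative choices,
not part of the original solution:

def unit_interval_cover(points):
    # Repeatedly take the leftmost uncovered point a_j and add the unit
    # interval [a_j, a_j + 1]; later points inside that interval are skipped.
    cover = []
    for a in sorted(points):
        if cover and a <= cover[-1][1]:
            continue              # already covered by the most recently added interval
        cover.append((a, a + 1))
    return cover

For the collection from part 3a, unit_interval_cover([0.1, 0.4, 0.6, 0.9]) returns the single
interval (0.1, 1.1), which matches the optimum discussed there.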

4a) Greedy Algorithm - Always subtract the largest coin:


a) This greedy algorithm does not always give the smallest number of coins.
i) Counterexample (see the sketch below):
1) Consider the coin denominations {1, 3, 4} and target amount 6. The greedy approach starts
with the largest coin (4), subtracts it once, and then uses two 1s, for a total of three
coins. The optimal solution for 6 is two 3s, which uses only two coins, fewer than the
greedy approach.
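A minimal sketch of the "always subtract the largest coin" rule, useful for checking such
examples; the function name is illustrative, and it assumes a 1-unit coin exists so every amount
can be paid exactly:

def greedy_coins(coins, T):
    # Greedily use as many of each denomination as possible, largest first.
    count = 0
    for c in sorted(coins, reverse=True):
        count += T // c   # number of coins of this denomination
        T %= c            # amount still left to pay
    return count

For example, greedy_coins([1, 3, 4], 6) returns 3 (one 4 and two 1s), while the optimum of two 3s
uses only 2 coins.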

4b) If c[i] = b^(i-1) for some integer b ≥ 2: The greedy algorithm of always subtracting the
largest coin does give the smallest number of coins.
a) Reasoning:
i) In any optimal solution, no denomination b^(i-1) below the largest can appear b or more
times, because b coins of value b^(i-1) could be replaced by a single coin of value b^i,
reducing the count. Hence the coins below the largest denomination contribute at most
b^(n-1) - 1 in total, which forces the optimal solution to use exactly as many coins of
the largest denomination as the greedy algorithm does; repeating the argument on the
remaining amount shows the greedy solution is optimal.
1) For example, with denominations {1, 2, 4, 8} and target amount 11, the greedy approach
picks 8, then 2, then 1, using three coins; no two coins from {1, 2, 4, 8} sum to 11, so
three coins is optimal.

4c) Efficient Algorithm using Dynamic Programming:


a) Initialize an array D of size T+1 to store the minimum number of coins needed for each
amount from 0 to T.
b) Set D[0] = 0 since the minimum number of coins needed to make 0 is 0.
c) Iterate i from 1 to T, computing the minimum number of coins required for each amount using
the recurrence D[i] = min( D[i - c[j]] + 1 ) over all j in [0, n) with c[j] ≤ i. The answer is
D[T]. (A Python sketch follows below.)
d) Time Complexity:
i) The time complexity for this dynamic programming approach is O(T * n)
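A minimal Python sketch of this dynamic program; the function name is illustrative, and amounts
that cannot be formed are reported as infinity:

def min_coins(coins, T):
    INF = float('inf')
    D = [0] + [INF] * T          # D[0] = 0; every other amount is unknown so far
    for i in range(1, T + 1):
        for c in coins:          # try ending with one coin of each denomination c <= i
            if c <= i and D[i - c] + 1 < D[i]:
                D[i] = D[i - c] + 1
    return D[T]

For instance, min_coins([1, 3, 4], 6) returns 2, matching the optimal answer from part 4a, and
the two nested loops give the O(T * n) running time stated above.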
