
Algorithm Design

Paradigm - 3

Dr. Sandeep Kumar Satapathy


School of Computer Sc. & Engg. (SCOPE)
VIT University, Chennai

Source: Introduction to Algorithms by CORMEN, LEISERSON, RIVEST and STEIN (3rd Edition)


Algorithm Design and Applications by GOODRICH and TAMASSIA (Wiley Publication)
Greedy Algorithms: Introduction
• An optimization problem is one in which you want to find, not just a
solution, but the best solution
• A “greedy algorithm” sometimes works well for optimization problems
• A greedy algorithm works in phases. At each phase:
• You take the best you can get right now, without regard for future
consequences
• You hope that by choosing a local optimum at each step, you will end
up at a global optimum
• Greedy algorithms do not always yield optimal solutions, but for many
problems they do.
Greedy Algorithms: Strategic Steps
1. Determine the optimal substructure of the problem.
2. Develop a recursive solution. (For the activity-selection problem, a
recurrence was formulated, but developing a recursive algorithm based on
that recurrence was bypassed.)
3. Show that if we make the greedy choice, then only one subproblem
remains.
4. Prove that it is always safe to make the greedy choice. (Steps 3 and 4
can occur in either order.)
5. Develop a recursive algorithm that implements the greedy strategy.
6. Convert the recursive algorithm to an iterative algorithm.
Greedy Algorithms: Introduction
• Problems to be solved using the greedy strategy
• Scheduling Algorithm
• Huffman Code
• Knapsack Problem
• Greedy Set Cover
Scheduling Algorithms: Greedy Approach
• Also called the Activity-Selection Problem: scheduling several competing
activities that require exclusive use of a common resource, with the goal of
selecting a maximum-size set of mutually compatible activities.
• Suppose we have a set 𝑆 = {𝑎1 , 𝑎2 , … , 𝑎𝑛 } of 𝑛 proposed activities that wish to
use a resource, such as a lecture hall, which can serve only one activity at a
time.
• Each activity 𝑎𝑖 has a start time 𝑠𝑖 and a finish time 𝑓𝑖 , where 0 ≤ 𝑠𝑖 < 𝑓𝑖 < ∞. If
selected, activity 𝑎𝑖 takes place during the half-open time interval [𝑠𝑖 , 𝑓𝑖 ).
• Activities 𝑎𝑖 and 𝑎𝑗 are compatible if the intervals [𝑠𝑖 , 𝑓𝑖 ) and [𝑠𝑗 , 𝑓𝑗 ) do not
overlap. That is, 𝑎𝑖 and 𝑎𝑗 are compatible if 𝑠𝑖 ≥ 𝑓𝑗 or 𝑠𝑗 ≥ 𝑓𝑖 .
• In the activity-selection problem, we wish to select a maximum-size subset of
mutually compatible activities.
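The compatibility condition above translates directly into code; a minimal Python sketch (the function name and tuple representation are illustrative, not from the slides):

```python
def compatible(a, b):
    """Activities are (start, finish) pairs over the half-open interval [s, f).
    Two activities are compatible when one finishes before the other starts."""
    (s_i, f_i), (s_j, f_j) = a, b
    return s_i >= f_j or s_j >= f_i

# [1, 4) and [5, 7) do not overlap; [1, 4) and [3, 5) do.
print(compatible((1, 4), (5, 7)))  # True
print(compatible((1, 4), (3, 5)))  # False
```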
Scheduling Algorithms: Greedy Approach
• Assume that the activities are sorted in monotonically increasing order of finish
time: 𝑓1 ≤ 𝑓2 ≤ 𝑓3 ≤ ⋯ ≤ 𝑓𝑛−1 ≤ 𝑓𝑛
• Consider the example from CLRS, with activities already sorted by finish time:

  𝑖  :  1   2   3   4   5   6   7   8   9   10  11
  𝑠𝑖 :  1   3   0   5   3   5   6   8   8   2   12
  𝑓𝑖 :  4   5   6   7   9   9   10  11  12  14  16

• For this example, the subset {𝑎3 , 𝑎9 , 𝑎11 } consists of mutually compatible
activities.
• It is not a maximum subset, however, since the subset {𝑎1 , 𝑎4 , 𝑎8 , 𝑎11 } is larger.
• In fact, {𝑎1 , 𝑎4 , 𝑎8 , 𝑎11 } is a largest subset of mutually compatible activities;
another largest subset is {𝑎2 , 𝑎4 , 𝑎9 , 𝑎11 }.
Scheduling Algorithms: Greedy Approach
• The greedy choice is to select the activity with the earliest finish time, 𝑎1 . If we
make this choice, we have only one remaining subproblem to solve: finding
activities that start after 𝑎1 finishes.
• Why don’t we have to consider activities that finish before 𝑎1 starts? We have
that 𝑠1 < 𝑓1 , and 𝑓1 is the earliest finish time of any activity, and therefore no
activity can have a finish time less than or equal to 𝑠1 .
• Thus, all activities that are compatible with activity 𝑎1 must start after 𝑎1
finishes.
Scheduling Algorithms: Greedy Approach
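The greedy strategy described above — always pick the compatible activity with the earliest finish time — can be sketched iteratively in Python. This is a sketch assuming activities are pre-sorted by finish time; the function name is illustrative:

```python
def greedy_activity_selector(s, f):
    """Assumes activities are sorted so that f[0] <= f[1] <= ... <= f[n-1].
    Greedily selects each activity that starts after the last selection ends."""
    selected = [0]           # the earliest-finishing activity is always safe
    last = 0                 # index of the most recently selected activity
    for m in range(1, len(s)):
        if s[m] >= f[last]:  # a_m is compatible with everything selected so far
            selected.append(m)
            last = m
    return selected

# The 11-activity example from CLRS (0-based indices):
s = [1, 3, 0, 5, 3, 5, 6, 8, 8, 2, 12]
f = [4, 5, 6, 7, 9, 9, 10, 11, 12, 14, 16]
print(greedy_activity_selector(s, f))  # [0, 3, 7, 10], i.e. {a1, a4, a8, a11}
```

The result matches the maximum-size subset {𝑎1 , 𝑎4 , 𝑎8 , 𝑎11 } identified earlier.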
Huffman Code
• Huffman codes compress data very effectively: savings of 20% to 90% are
typical, depending on the characteristics of the data being compressed.
• We consider the data to be a sequence of characters.
• Huffman’s greedy algorithm uses a table giving how often each character occurs
(i.e., its frequency) to build up an optimal way of representing each character as
a binary string.
Huffman Code
• Suppose we have a 100,000-character data file in which only the six characters
a–f appear, with frequencies (in thousands) a: 45, b: 13, c: 12, d: 16, e: 9, f: 5.
We have many options for how to represent such a file of information.
• Here, we consider the problem of designing a binary character code (or code for
short) in which each character is represented by a unique binary string, which
we call a codeword.
• If we use a fixed-length code, we need 3 bits to represent 6 characters:
a = 000, b = 001, . . . , f = 101. This method requires 300,000 bits to code the
entire file.

Can we do better?
Huffman Code
• A variable-length code can do considerably better than a fixed-length code, by
giving frequent characters short codewords and infrequent characters long
codewords.
• Figure 16.3 shows such a code; here the 1-bit string 0 represents a, and the 4-
bit string 1100 represents f. This code requires
(45 x 1 + 13 x 3 + 12 x 3 + 16 x 3 + 9 x 4 + 5 x 4) x 1,000 = 224,000 bits to
represent the file, a savings of approximately 25%.
• In fact, this is an optimal character code for this file
Huffman Code: Greedy Approach
• Prefix Codes
• means the codes (bit sequences) are assigned in such a way that the code
assigned to one character is not a prefix of the code assigned to any other
character.
• This is how Huffman Coding makes sure that there is no ambiguity when
decoding the generated bitstream in variable length coding.
• For example: Let there be four characters a, b, c and d, and their
corresponding variable length codes be 00, 01, 0 and 1.
• This coding leads to ambiguity because code assigned to c is the prefix of
codes assigned to a and b.
• If the compressed bit stream is 0001, the de-compressed output may be
“cccd” or “ccb” or “acd” or “ab”.
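The ambiguity can be demonstrated by brute-force enumeration of all possible decodings; a small Python sketch (the enumerator is illustrative, not part of Huffman's algorithm):

```python
def decodings(bits, code):
    """Enumerate every way to split `bits` into codewords of `code`."""
    if not bits:
        return ['']
    results = []
    for ch, word in code.items():
        if bits.startswith(word):
            results += [ch + rest for rest in decodings(bits[len(word):], code)]
    return results

ambiguous = {'a': '00', 'b': '01', 'c': '0', 'd': '1'}   # '0' prefixes '00', '01'
prefix_free = {'a': '0', 'b': '10', 'c': '110', 'd': '111'}

print(sorted(decodings('0001', ambiguous)))   # several parses, e.g. 'ab', 'cccd'
print(decodings('010110', prefix_free))       # ['abc'] — exactly one parse
```

With a prefix code, decoding is greedy and unambiguous: scan the bitstream and emit a character as soon as a complete codeword is matched.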
Huffman Code: Greedy Approach
• Huffman invented a greedy algorithm that constructs an optimal prefix code
called a Huffman code.
• Assume that C is a set of n characters and that each character
c ∈ C is an object with an attribute c.freq giving its frequency.
• The algorithm builds the tree T corresponding to the optimal
code in a bottom-up manner. It begins with a set of |C| leaves
and performs a sequence of |C|-1 “merging” operations to
create the final tree.
• The algorithm uses a min-priority queue Q, keyed on the freq
attribute, to identify the two least-frequent objects to merge
together.
• When we merge two objects, the result is a new object whose
frequency is the sum of the frequencies of the two objects
that were merged.
Huffman Code: Greedy Approach
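The merging procedure described above can be sketched in Python, with `heapq` standing in for the min-priority queue Q. This is a sketch, not the textbook's exact HUFFMAN pseudocode; the tree representation (leaves as characters, internal nodes as pairs) is an assumption for illustration:

```python
import heapq
from itertools import count

def huffman(freq):
    """Build a Huffman code from a {character: frequency} table by
    performing |C| - 1 merges of the two least-frequent subtrees."""
    tiebreak = count()  # unique counter so the heap never compares tree nodes
    heap = [(f, next(tiebreak), ch) for ch, f in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)   # two least-frequent objects
        f2, _, right = heapq.heappop(heap)
        # The merged object's frequency is the sum of its children's.
        heapq.heappush(heap, (f1 + f2, next(tiebreak), (left, right)))
    _, _, tree = heap[0]

    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):          # internal node: 0 left, 1 right
            walk(node[0], prefix + '0')
            walk(node[1], prefix + '1')
        else:                                # leaf: record the codeword
            codes[node] = prefix or '0'
    walk(tree, '')
    return codes

freq = {'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}
codes = huffman(freq)
# Codeword lengths match the optimal code from the example:
# 'a' gets 1 bit, the rarest characters 'e' and 'f' get 4 bits each,
# for a total of 224,000 bits over the 100,000-character file.
print({ch: len(codes[ch]) for ch in sorted(freq)})
```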
