Greedy Algorithm DAA

Greedy algorithms make locally optimal choices at each step to arrive at a global solution. They are easy to implement but do not always yield an optimal solution. The activity selection problem involves scheduling non-overlapping activities using a shared resource. The greedy algorithm for this problem sorts activities by finish time and sequentially selects the activity that does not conflict with previously chosen activities. A recursive formulation of this greedy algorithm selects the first compatible activity and recursively processes the remaining activities.



CHAPTER-13

Greedy Algorithms

13.1. GREEDY ALGORITHMS

Greedy algorithms are simple and straightforward. They are shortsighted in their approach in the sense that they take decisions on the basis of information at hand without worrying about the effect these decisions may have in the future. They are easy to invent, easy to implement and most of the time quite efficient.
Greedy algorithms do not always yield an optimal solution, but sometimes they do. We will give some examples of problems that can be solved by greedy algorithms. Even when greedy algorithms do not produce the optimal solution, they often provide fast heuristics (non-optimal solution strategies) and are often used in finding good approximations.
13.1.1 Characteristics and Features of Problems Solved by Greedy Algorithms
To construct the solution in an optimal way, the algorithm maintains two sets: one contains the chosen items and the other contains the rejected items.
The greedy algorithm consists of four (4) functions:
1. A function that checks whether the chosen set of items provides a solution.
2. A function that checks the feasibility of a set.
3. A selection function that tells which of the candidates is the most promising.
4. An objective function, which does not appear explicitly, that gives the value of a solution.
13.1.2. Greedy Algorithm Structure
• Initially the set of chosen items is empty, i.e., the solution set is empty.
• At each step:
- an item is added to the solution set by using the selection function;
- IF the set would no longer be feasible, reject the item under consideration (it is never considered again);
- ELSE (the set is still feasible) add the current item.

13.1.3. Definitions of Feasibility
A feasible set (of candidates) is promising if it can be extended to produce not merely a solution, but an optimal solution to the problem. In particular, the empty set is always promising. (Why? Because an optimal solution always exists.)
Unlike dynamic programming, which solves the subproblems bottom-up, a greedy strategy usually progresses in a top-down fashion, making one greedy choice after another, reducing each problem to a smaller one.
Greedy-Choice Property
The "greedy-choice property" and "optimal substructure" are the two ingredients in a problem that lend it to a greedy strategy.
Greedy-Choice Property
It says that a globally optimal solution can be arrived at by making a locally optimal choice.
13.2. GREEDY VS. DYNAMIC PROGRAMMING
For many optimization problems a solution using dynamic programming can be unnecessarily costly. Simpler is a greedy algorithm, which in each step chooses the locally best continuation. The drawback of the greedy method is that the computed global solution may not always be optimal.
• Both techniques are optimization techniques, and both build a solution from a collection of choices of individual elements.
• Dynamic programming is a powerful technique, but it often leads to algorithms with higher than desired running times.
• The greedy method typically leads to simpler and faster algorithms, but it is not as powerful or as widely applicable as dynamic programming.
• The greedy method computes its solution by making its choices in a serial forward fashion, never looking back or revising previous choices.
• Dynamic programming computes its solution bottom up by synthesizing it from smaller subsolutions, and by trying many possibilities and choices before it arrives at the optimal set of choices.
• There is no a priori litmus test by which one can tell whether the greedy method will lead to an optimal solution.
• By contrast, there is a litmus test for dynamic programming, called the Principle of Optimality.
13.3. AN ACTIVITY-SELECTION PROBLEM
We are given a set S = {1, 2, ..., n} of n activities that are to be scheduled to use some resource, where each activity i has its starting time s_i and finish time f_i with s_i < f_i. We say that two activities i and j are non-interfering if their start-finish intervals do not overlap; more formally, two activities i and j are compatible if their time periods are disjoint.
For example, consider lectures that are to be given in a lecture hall, where the lecture times have been set up in advance. There is only one resource, so some start and finish times may overlap, i.e., two lectures cannot be given in the same room at the same time.
The activity-selection problem is the problem of selecting the largest set of mutually compatible activities.
Two activities are compatible if their intervals don't overlap.
How do we schedule the largest number of activities on the resource?

EXAMPLE 13.1. Compatible activities.
Fig. 13.1. Activities shown as intervals on a time axis; non-overlapping intervals are compatible.
13.3.1 Greedy Activity Selection Algorithm


In this algorithm the activities are first sorted in increasing order of their finishing time. With the greedy strategy we first select the activity with the smallest finish time and schedule it; then we skip all activities that are not compatible with the current selection. The remaining activities are then greedily selected by going down the list, picking each activity that is compatible with the current selection, and the process is repeated until all the activities have been considered. To make the selection process faster, we assume that the activities have been sorted by their finish times, that is,
f_1 ≤ f_2 ≤ ... ≤ f_n
EXAMPLE 13.2. In the figure below, activity 1 is scheduled first. Then activities 2 and 3 are not compatible with activity 1, so we skip activities 2 and 3. Next activity 4 is scheduled. It interferes with activities 5 and 6, so we have to skip activities 5 and 6. Finally activity 7 is scheduled, and it interferes with the remaining activity 8, which is skipped. The final maximum-size compatible set is {1, 4, 7}.
Fig. 13.2. Greedy selection on eight activities: schedule 1, skip 2 and 3; schedule 4, skip 5 and 6; schedule 7, skip 8.

A recursive greedy algorithm
The procedure RECURSIVE-ACTIVITY-SELECTOR provides a straightforward, recursive solution. It assumes that the n input activities are ordered by monotonically increasing finish time.
RECURSIVE-ACTIVITY-SELECTOR (s, f, i, j)
1. m ← i + 1
2. while m < j and s_m < f_i    ▷ find the first compatible activity in S_ij
3.     do m ← m + 1
4. if m < j
5.     then return {a_m} ∪ RECURSIVE-ACTIVITY-SELECTOR (s, f, m, j)
6.     else return ∅
How it works:
• The initial call is RECURSIVE-ACTIVITY-SELECTOR (s, f, 0, n + 1).
• It takes the start and finish times of the activities, represented as arrays s and f, and the indices i and j of the subproblem S_ij.
• It returns a maximum-size set of mutually compatible activities in S_ij.
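A direct Python rendering of this pseudocode is sketched below (my own naming; it assumes 1-based start/finish arrays padded with a dummy activity a_0 whose finish time is 0, and uses the data of Example 13.3 further down as a check).

def recursive_activity_selector(s, f, i, j):
    # s[k], f[k]: start/finish time of activity k, sorted by finish time (1-based)
    m = i + 1
    while m < j and s[m] < f[i]:      # find the first activity compatible with a_i
        m += 1
    if m < j:
        return {m} | recursive_activity_selector(s, f, m, j)
    return set()

# data of Example 13.3; index 0 is a dummy activity with f[0] = 0
s = [0, 1, 2, 4, 1, 5, 8, 9, 11, 13]
f = [0, 3, 5, 7, 8, 9, 10, 11, 14, 16]
print(recursive_activity_selector(s, f, 0, len(s)))   # {1, 3, 6, 8}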
Analysis:
The sorting part takes O(n log n) time and the selection part O(n), so the total is O(n log n).
An iterative greedy algorithm
The RECURSIVE-ACTIVITY-SELECTOR procedure is almost "tail recursive". A tail-recursive procedure can be transformed into an iterative form in a straightforward manner. It also assumes that the input activities are ordered by monotonically increasing finish time. It collects the selected activities into a set A and returns this set when it is done.
GREEDY-ACTIVITY-SELECTOR (s, f)
1. n ← length[s]
2. A ← {a_1}
3. i ← 1
4. for m ← 2 to n
5.     do if s_m ≥ f_i
6.         then A ← A ∪ {a_m}
7.              i ← m
8. return A
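The iterative form translates just as directly; the sketch below (my own code, same assumed array layout as before) keeps the index i of the most recently selected activity and adds a_m whenever s_m ≥ f_i.

def greedy_activity_selector(s, f):
    n = len(s) - 1                    # index 0 is a dummy entry
    A = [1]                           # activity 1 has the earliest finish time
    i = 1
    for m in range(2, n + 1):
        if s[m] >= f[i]:              # a_m is compatible with the last selection
            A.append(m)
            i = m
    return A

s = [0, 1, 2, 4, 1, 5, 8, 9, 11, 13]  # Example 13.3 data, dummy index 0
f = [0, 3, 5, 7, 8, 9, 10, 11, 14, 16]
print(greedy_activity_selector(s, f))   # [1, 3, 6, 8]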

EXAMPLE 13.3. Following are 9 activities with their start and finish times:
i   = <1, 2, 3, 4, 5, 6, 7, 8, 9>
s_i = <1, 2, 4, 1, 5, 8, 9, 11, 13>
f_i = <3, 5, 7, 8, 9, 10, 11, 14, 16>
Find a schedule where the maximum number of activities takes place.
Solution: S sorted by finish time:
i    1  2  3  4  5   6   7   8   9
s_i  1  2  4  1  5   8   9  11  13
f_i  3  5  7  8  9  10  11  14  16
Fig. 13.3. The selected activities shown on a time line from 0 to 16.
Maximum-size mutually compatible set: {a_1, a_3, a_6, a_8}.
This is not the only optimal schedule; {a_2, a_5, a_7, a_9} is also optimal.
13.4. KNAPSACK PROBLEM
A thief robbing a store can carry a maximal weight of W in his knapsack. There are n items; the ith item weighs w_i and is worth v_i dollars. What items should the thief take?
There are two versions of the problem:
1. Fractional knapsack problem
The setup is the same, but the thief can take fractions of items, meaning that the items can be broken into smaller pieces so that the thief may decide to carry only a fraction x_i of item i, where 0 ≤ x_i ≤ 1.
• Exhibits the greedy-choice property:
  - a greedy algorithm exists.
• Exhibits the optimal-substructure property:
  - a dynamic programming algorithm also exists.
2. 0-1 knapsack problem
The setup is the same, but the items may not be broken into smaller pieces, so the thief may decide either to take an item or to leave it (binary choice), but may not take a fraction of an item.
• Does not exhibit the greedy-choice property:
  - no greedy algorithm gives an optimal solution.
• Exhibits the optimal-substructure property:
  - only a dynamic programming algorithm exists.
13.4.1 Fractional Knapsack Problem
In the fractional knapsack problem the thief is allowed to take any fraction of an item for a corresponding fraction of its weight and a corresponding fraction of its value.
To solve the fractional problem, we first compute the value per pound v_i/w_i for each item. According to the greedy strategy, the thief begins by taking as much as possible of the item with the greatest value per pound. If the supply of that item is exhausted and he can still carry more, he takes as much as possible of the item with the next greatest value per pound, and so forth until he can't carry any more.
The fractional knapsack problem is solvable by a greedy strategy, whereas the 0-1 problem is not.
EXAMPLE 13.4. There are 3 items as follows:
Item 1 weighs 10 pounds and is worth 60 dollars.
Item 2 weighs 20 pounds and is worth 100 dollars.
Item 3 weighs 30 pounds and is worth 120 dollars.
The knapsack can hold 50 pounds.
Fig. 13.4. The three items ($60, $100, $120) and a knapsack of capacity 50 pounds.
Solution: The values per pound of the items are as follows:
items     w_i    v_i     p_i = v_i/w_i
item 1    10     $60     6.0
item 2    20     $100    5.0
item 3    30     $120    4.0
For the fractional knapsack problem, taking the items in order of greatest value per pound yields an optimal solution.

Fig. 13.5. Greedy solution to the fractional problem: all of item 1 ($60), all of item 2 ($100) and 20 of the 30 pounds of item 3 ($80), for a total of $240.
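The greedy strategy of Section 13.4.1 can be sketched in Python as follows (my own code; items are (weight, value) pairs and the data is that of Example 13.4): items are taken in decreasing order of value per pound, with a fraction of the last item if it does not fit entirely.

def fractional_knapsack(items, capacity):
    # items: list of (weight, value) pairs
    total = 0.0
    remaining = capacity
    for w, v in sorted(items, key=lambda wv: wv[1] / wv[0], reverse=True):
        if remaining <= 0:
            break
        take = min(w, remaining)       # take the whole item, or only what fits
        total += v * (take / w)
        remaining -= take
    return total

print(fractional_knapsack([(10, 60), (20, 100), (30, 120)], 50))   # 240.0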
13.4.2 0-1 Knapsack Problem
The 0-1 knapsack problem is as follows. A thief robbing a store finds n items; the ith item is worth v_i dollars and weighs w_i pounds, where v_i and w_i are integers. He wants to take as valuable a load as possible, but he can carry at most W pounds in his knapsack (bag), for some integer W.
Which items should he take? Each item must either be taken or left entirely: the thief cannot take a fractional amount of an item and cannot take an item more than once.
This is called the 0-1 knapsack problem because each item must be left (0) or taken (1). The 0-1 knapsack problem is hard to solve, and in fact it is an NP-complete problem.
The greedy strategy does not work for the 0-1 knapsack problem.
EXAMPLE 13.5. There are 5 items as follows:
Item 1 weighs 5 pounds and is worth 30 dollars.
Item 2 weighs 10 pounds and is worth 20 dollars.
Item 3 weighs 20 pounds and is worth 100 dollars.
Item 4 weighs 30 pounds and is worth 90 dollars.
Item 5 weighs 40 pounds and is worth 160 dollars.
The knapsack can hold 60 pounds.
Fig. 13.6. The five items ($30, $20, $100, $90, $160 with p = 6.0, 2.0, 5.0, 3.0, 4.0) and a knapsack of capacity 60 pounds.
Solution: The values per pound of the items are as follows:
items     w_i    v_i     p_i = v_i/w_i
item 1     5     $30     6.0
item 2    10     $20     2.0
item 3    20     $100    5.0
item 4    30     $90     3.0
item 5    40     $160    4.0
Thus, the value per pound of item 1 is 6 dollars per pound, which is greater than the value per pound of item 2 (2 dollars per pound), item 3 (5 dollars per pound), item 4 (3 dollars per pound) and item 5 (4 dollars per pound). The greedy strategy, therefore, would take item 1 first, then item 3 and then item 4, which does not yield an optimal solution. However, the optimal solution takes items 3 and 5.
Fig. 13.7. Greedy solution to the 0-1 problem (items 1, 3 and 4, total $220) versus the optimal solution (items 3 and 5, total $260).
Hence we can say that the greedy strategy does not work for the 0-1 knapsack problem.
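To see the failure concretely, the sketch below (my own code, not from the text) runs the value-per-pound greedy rule against an exhaustive check of all subsets on the data of Example 13.5: the greedy rule yields $220 while the true optimum, items 3 and 5, yields $260.

from itertools import combinations

items = [(5, 30), (10, 20), (20, 100), (30, 90), (40, 160)]   # (weight, value)
W = 60

def greedy_01(items, capacity):
    total, remaining = 0, capacity
    for w, v in sorted(items, key=lambda wv: wv[1] / wv[0], reverse=True):
        if w <= remaining:            # whole items only: take it or leave it
            total += v
            remaining -= w
    return total

def optimal_01(items, capacity):
    best = 0
    for r in range(len(items) + 1):   # try every subset (fine for 5 items)
        for subset in combinations(items, r):
            if sum(w for w, _ in subset) <= capacity:
                best = max(best, sum(v for _, v in subset))
    return best

print(greedy_01(items, W), optimal_01(items, W))   # 220 260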
13.5. HUFFMAN CODES
Huffman codes provide a method of encoding data efficiently. For compressing data, Huffman codes are a widely used and very effective technique. Huffman's greedy algorithm looks at the frequency of occurrence of each character and encodes it as a binary string in an optimal way.
EXAMPLE 13.6. Suppose we have a file consisting of 100,000 characters that we want to compress. The characters in the data occur with the following frequencies:
            a       b       c       d       e      f
Frequency  45,000  13,000  12,000  16,000  9,000  5,000
Consider the problem of designing a binary character code in which each character is represented by a unique binary string.
Fixed-length code
A fixed-length code needs 3 bits to represent the six (6) characters.
                    a       b       c       d       e      f
Frequency          45,000  13,000  12,000  16,000  9,000  5,000
Fixed-length code  000     001     010     011     100    101
This method requires 300,000 bits to code the entire file:
• The total number of characters is 45,000 + 13,000 + 12,000 + 16,000 + 9,000 + 5,000 = 100,000.
• As each character is assigned a 3-bit codeword, 3 × 100,000 = 300,000 bits.
Variable-length code
A variable-length code can do better by giving frequent characters short codewords and infrequent characters long codewords.
                       a       b       c       d       e      f
Frequency             45,000  13,000  12,000  16,000  9,000  5,000
Variable-length code  0       101     100     111     1101   1100
• Character 'a' occurs 45,000 times and is assigned a 1-bit codeword:
  1 × 45,000 = 45,000 bits.
• Characters b, c, d occur 13,000 + 12,000 + 16,000 = 41,000 times and each is assigned a 3-bit codeword:
  3 × 41,000 = 123,000 bits.
• Characters e, f occur 9,000 + 5,000 = 14,000 times and each is assigned a 4-bit codeword:
  4 × 14,000 = 56,000 bits.
This implies that the total number of bits is 45,000 + 123,000 + 56,000 = 224,000 bits.
Conclusion
The fixed-length code requires 300,000 bits while the variable-length code requires 224,000 bits, i.e., a saving of approximately 25%.
Prefix codes
A code is called a prefix (free) code if no codeword is a prefix of another one. The reason prefix codes are desirable is that they simplify encoding (compression) and decoding.
EXAMPLE 13.7. (a = 0, b = 110, c = 10, d = 111) is a prefix code.
Solution: Any binary prefix code can be described by a binary tree in which the codewords are the leaves of the tree; a left branch means "0" and a right branch means "1". The length of a codeword is just its depth in the tree. The code given in the above example is a prefix code, and its corresponding binary tree is shown in the following figure.

Fig. 13.8. Binary tree for the prefix code a = 0, c = 10, b = 110, d = 111.
Expected encoding length
Given a tree T corresponding to a prefix code, it is simple to compute the number of bits required to encode a file. For each character c in the alphabet C, let f(c) denote the frequency of c in the file and let d_T(c) denote the depth of c's leaf in the tree. The expected number of bits needed to encode the file is given by the following formula:
B(T) = Σ_{c ∈ C} f(c) · d_T(c)
Constructing a Huffman Code
Huffman invented a greedy algorithm that constructs an optimal prefix code, called a Huffman code. We build the tree bottom up, starting from the leaves. Initially there are n trees in the forest, each tree being a single leaf. According to the greedy strategy we first find the two trees with minimum frequencies, then merge these two trees into a single tree whose frequency is the sum of the frequencies of the two merged trees. We repeat the whole process until there is only one tree in the forest.
At each step the contents of the queue are kept in increasing order by frequency, and the two trees with the lowest frequencies are merged. The procedure for constructing a Huffman code is given below:
HUFFMAN (C)
1. n ← |C|
2. Q ← C
3. for i ← 1 to n − 1
4.     do allocate a new node z
5.        left[z] ← x ← EXTRACT-MIN (Q)
6.        right[z] ← y ← EXTRACT-MIN (Q)
7.        f[z] ← f[x] + f[y]
8.        INSERT (Q, z)
9. return EXTRACT-MIN (Q)    ▷ Return the root of the tree.
The total running time of HUFFMAN on a set of n characters is O(n lg n).
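The HUFFMAN procedure can be sketched in Python using the standard heapq module as the min-priority queue Q (the tuple layout and function name below are my own choices, not from the text); each heap entry carries a frequency and a nested pair representing a subtree, and the two minimum-frequency trees are repeatedly merged.

import heapq
from itertools import count

def huffman(freqs):
    # freqs: dict mapping character -> frequency
    tie = count()                          # tie-breaker keeps heap entries comparable
    Q = [(f, next(tie), c) for c, f in freqs.items()]
    heapq.heapify(Q)
    for _ in range(len(freqs) - 1):
        fx, _, x = heapq.heappop(Q)        # EXTRACT-MIN
        fy, _, y = heapq.heappop(Q)        # EXTRACT-MIN
        heapq.heappush(Q, (fx + fy, next(tie), (x, y)))   # merged node z
    return Q[0][2]                         # root of the Huffman tree

# Example 13.6 frequencies (in thousands)
print(huffman({'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}))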
EXAMPLE 13.8. What is an optimal Huffman code for the following set of frequencies?
a: 05, b: 48, c: 07, d: 17, e: 10, f: 13.
Solution: Repeatedly merge the two smallest frequencies:
• merge a: 05 and c: 07 into a node of frequency 12;
• merge e: 10 and 12 into a node of frequency 22;
• merge f: 13 and d: 17 into a node of frequency 30;
• merge 22 and 30 into a node of frequency 52;
• finally merge b: 48 and 52 into the root of frequency 100.
Fig. 13.9. The resulting Huffman tree; one optimal code is b = 0, e = 100, a = 1010, c = 1011, f = 110, d = 111.
13.6 A TASK-SCHEDULING PROBLEM
The scheduling of jobs on a single processor with deadline constraints is named the task-scheduling problem. We schedule n jobs on a processor in a sequence to obtain the maximum profit subject to task deadlines. The scheduling problem is stated as follows:
• There is a set of n jobs; each job i has a deadline d_i ≥ 0 and a profit P_i ≥ 0.
• For the ith job, the profit P_i is earned if the job is completed by its deadline.
• To complete a job, it is processed on the processor for one unit of time.
• Only one processor is available for processing all jobs.
• Not all jobs have to be scheduled.
• A feasible solution for this problem is a subset of jobs such that each job in the subset can be completed by its deadline.
• The value of a feasible solution is the sum of the profits of the jobs in the subset.
• An optimal solution is a feasible solution with maximum profit.
• We adopt a greedy algorithm to determine how the next job is selected for an optimal solution.
• The algorithm is as follows (a Python sketch is given after the complexity note below):
Step 1: Sort the jobs by profit P_i into decreasing order, so that after sorting P_1 ≥ P_2 ≥ ... ≥ P_n.
Step 2: Add the next ith job to the solution set if the ith job can be completed by its deadline d_i. Assign the ith job to the kth vacant time slot in the solution array: if array[k] = 0 then the kth time slot is empty; if the kth time slot is not vacant, search the preceding slot (k − 1), and so on, while still meeting the deadline constraint.
Step 3: Stop if all jobs have been examined; otherwise go to Step 2.
The greedy method described above always gives an optimal solution to the job scheduling problem.
Time complexity: O(n log n) for the sort plus up to O(n) slot searches per job, i.e., O(n²) in the worst case.
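A hedged Python sketch of this greedy procedure (my own naming; it follows Steps 1-3 above, scanning slots from the deadline backwards) is shown below, run on the data of Example 13.9.

def job_sequencing(jobs):
    # jobs: list of (profit, deadline) pairs, job numbers are 1-based positions
    n = len(jobs)
    order = sorted(range(1, n + 1), key=lambda i: jobs[i - 1][0], reverse=True)
    slots = [0] * (n + 1)                  # slots[1..n], 0 means vacant
    profit = 0
    for i in order:
        p, d = jobs[i - 1]
        for k in range(min(d, n), 0, -1):  # latest vacant slot not after the deadline
            if slots[k] == 0:
                slots[k] = i
                profit += p
                break
    return [j for j in slots[1:] if j], profit

# Example 13.9: profits (100, 10, 15, 27), deadlines (2, 1, 2, 1)
print(job_sequencing([(100, 2), (10, 1), (15, 2), (27, 1)]))   # ([4, 1], 127)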
EXAMPLE 13.9. Let n = 4, (P1, P2, P3, P4) = (100, 10, 15, 27) and (d1, d2, d3, d4) = (2, 1, 2, 1), where P_i are the profits on the jobs and d_i are the deadlines of completion.
Find the optimal schedule.
Solution:
Job (i)   P_i   d_i   Operation                              Array [1-4]   Profit
-         -     -     Array [1-4] initialized to 0           0 0 0 0       0
1         100   2     Assign job 1 to slot 2                 0 1 0 0       100
4         27    1     Assign job 4 to slot 1                 4 1 0 0       127
3         15    2     Reject job 3 because it violates       4 1 0 0       127
                      its deadline
2         10    1     Reject job 2 because it violates       4 1 0 0       127
                      its deadline
Therefore the optimal job sequence is 4 → 1, with maximum profit 127.

SOLVED EXAMPLES

EXAMPLE 13.10. What is a greedy algorithm? Write its pseudocode. Apply the greedy algorithm to colouring the vertices of a graph. (UPTU, MCA 2003-04)

Fig. 13.10.
Solution: Refer to Sections 13.1 and 13.3. The given graph is:

Fig. 13.11.
We want to colour the nodes of graph G using the minimum number of colours, in such a way that no two adjacent nodes have the same colour.
Using a greedy algorithm, we can get good results by choosing one colour and colouring as many vertices as possible with that colour before going on to another colour. We proceed with the next colour in the same way, not going to a third colour until there are no more vertices that can be coloured with the second colour. This process is repeated until all the vertices of the graph have been coloured.
The vertices of the given graph can be coloured using two colours; therefore only two colours are required for a proper colouring of the vertices of the given graph (a small code sketch is given after Fig. 13.12).
Fig. 13.12. A proper colouring of the vertices using two colours, R and G.
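The exact graph is only given in the figure, so the sketch below is a generic greedy colouring routine (my own code; the adjacency list used in the demo call is an assumed example, not the graph of Fig. 13.11): vertices are visited in order and each receives the smallest colour not already used by a coloured neighbour.

def greedy_colouring(adj):
    # adj: dict mapping vertex -> list of adjacent vertices
    colour = {}
    for v in adj:
        used = {colour[u] for u in adj[v] if u in colour}
        c = 0
        while c in used:               # smallest colour not used by a neighbour
            c += 1
        colour[v] = c
    return colour

# assumed example graph (a 4-cycle); here the greedy order needs only 2 colours
adj = {1: [2, 3], 2: [1, 4], 3: [1, 4], 4: [2, 3]}
print(greedy_colouring(adj))           # {1: 0, 2: 1, 3: 1, 4: 0}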
EXAMPLE 13.11. Two sorted files containing n and m records respectively can be merged together to obtain one sorted file in time O(n + m). When more than two sorted files are to be merged together, the merge can be accomplished by repeatedly merging sorted files in pairs. Thus, if files x_1, x_2, x_3, x_4 are to be merged, we can follow, for example, either of the strategies below:
Fig. 13.13. Two possible pairwise merge patterns for four files.
Given n sorted files, there are many ways in which to pairwise merge them into a single sorted file. Different pairings require differing amounts of computing time. Design a greedy strategy algorithm for merging the files. Use your algorithm to obtain an optimal merge pattern for three files of length 30, 20, 10 records. (UPTU, MCA 2002-03)
Solution: The greedy strategy that can be used for merging the n sorted files to get one sorted file is:
1. Sort the files according to their size in non-decreasing order.
2. Select the 2 files with the minimum number of records, i.e., the files to be merged first.
3. Keep merging and build the tree.
For example: 3 files with record sizes of 50, 30, 10.
1. Sort: 10, 30, 50.
2. Merge, recording the total number of records merged at each step.
Fig. 13.14. Merge tree: the files of 10 and 30 records are merged first (40 record moves), then the result is merged with the file of 50 records (90 record moves).
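A small Python sketch of this greedy merge-pattern strategy (my own code, using heapq as the sorted pool of file sizes) repeatedly merges the two smallest files and accumulates the total record-move cost.

import heapq

def optimal_merge_cost(sizes):
    # sizes: list of file lengths (number of records)
    heap = list(sizes)
    heapq.heapify(heap)
    total = 0
    while len(heap) > 1:
        a = heapq.heappop(heap)        # two smallest files
        b = heapq.heappop(heap)
        total += a + b                 # cost of merging them
        heapq.heappush(heap, a + b)
    return total

print(optimal_merge_cost([50, 30, 10]))   # 40 + 90 = 130
print(optimal_merge_cost([30, 20, 10]))   # 30 + 60 = 90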
EXAMPLE 13.12. Suppose you are given a set S = {a_1, a_2, ..., a_n} of tasks, where each task takes one unit of time to process. You have one computer on which to run these tasks, and the computer can run only one task at a time. Each task a_i earns a profit p_i if it completes processing no later than its deadline d_i. Develop a greedy algorithm to schedule as many tasks as possible so as to earn the maximum profit. Run your algorithm for n = 5 and the following values. (UPTU, MCA 2006-06)
i            1   2   3   4   5
Profit p_i   50  20  15  30  45
Deadline d_i 2   1   2   6   3
Solution: Sorting the tasks in decreasing order of profit, we get
i            1   5   4   2   3
Profit p_i   50  45  30  20  15
Deadline d_i 2   3   6   1   2

Task (i)   p_i   d_i   Operation                        Array [1-5]   Profit
-          -     -     Array [1-5] initialized to 0     0 0 0 0 0     0
1          50    2     Assign task 1 to slot 2          0 1 0 0 0     50
5          45    3     Assign task 5 to slot 3          0 1 5 0 0     95
4          30    6     Assign task 4 to slot 5          0 1 5 0 4     125
2          20    1     Assign task 2 to slot 1          2 1 5 0 4     145
3          15    2     Reject task 3 because it         2 1 5 0 4     145
                       violates its deadline
Therefore the selected tasks are 1, 5, 4 and 2 (executed in slot order 2 → 1 → 5 → 4), with maximum profit 145.

EXAMPLE 13.13. Prove that matroids exhibit the greedy choice property.
(UPTU, B.Tech. 2003-04)

Solution:
Suppose that M = (S, I) is a weighted matroid with weight function w and that S is sorted into monotonically decreasing order by weight. Let x be the first element of S such that {x} is independent, if any such x exists. If x exists, then there exists an optimal subset A of S that contains x.
If no such x exists, then the only independent subset is the empty set and we are done. Otherwise, let B be any nonempty optimal subset. Assume that x ∉ B; otherwise we let A = B and we are done.
No element of B has weight greater than w(x). To see this, observe that y ∈ B implies that {y} is independent, since B ∈ I and I is hereditary. Our choice of x therefore ensures that w(x) ≥ w(y) for any y ∈ B.
Construct the set A as follows: begin with A = {x}. By the choice of x, A is independent. Using the exchange property, repeatedly add elements of B to A until |A| = |B|, while preserving the independence of A. Then A = (B − {y}) ∪ {x} for some y ∈ B, and so
w(A) = w(B) − w(y) + w(x) ≥ w(B).
Because B is optimal, A must also be optimal, and because x ∈ A, the greedy-choice property is proved.
EXAMPLE 13.14. What is a greedy algorithm? Write its pseudocode. Prove that the fractional knapsack problem has a greedy-choice property. (UPTU, B.Tech. 2004-05)
Solution: Refer to Sections 13.1 and 13.3.
The fractional knapsack problem has a greedy-choice property:
A thief robbing a store finds n items; the ith item is worth v_i dollars and weighs w_i pounds, where v_i and w_i are integers. He wants to take as valuable a load as possible, but he can carry at most W pounds in his knapsack or bag.
In the fractional knapsack problem, the thief can take fractions of items. To solve the problem, we first compute the value per pound v_i/w_i for each item. Following the greedy technique, the thief begins by taking as much as possible of the item with the greatest value per pound. If the supply of that item is exhausted and he can still carry more, he takes as much as possible of the item with the next greatest value per pound, and so on until he cannot carry any more.
For example, with the three items and the 50-pound knapsack of Example 13.4:
Fig. 13.15. The three items ($60, $100, $120) and a knapsack of capacity 50 pounds.
Then the greedy solution to the fractional problem is as follows:
Fig. 13.16. Take all of item 1 ($60), all of item 2 ($100) and 20 pounds of item 3 ($80), for a total of $240.
Hence, taking the items in order of greatest value per pound yields an optimal solution. Thus it has been shown that the fractional knapsack problem has a greedy-choice property.

EXAMPLE 13.15. What is an optimal Huffman code for the following set of frequencies, based on the first 8 Fibonacci numbers?
a: 1, b: 1, c: 2, d: 3, e: 5, f: 8, g: 13, h: 21
Can you generalize your answer to find the optimal code when the frequencies are the first n Fibonacci numbers?
Solution: Repeatedly merge the two smallest frequencies:
• a: 1 + b: 1 = 2;  2 + c: 2 = 4;  4 + d: 3 = 7;  7 + e: 5 = 12;  12 + f: 8 = 20;  20 + g: 13 = 33;  33 + h: 21 = 54 (the root).
Fig. 13.17. The resulting Huffman tree is a completely skewed chain; one optimal code is h = 0, g = 10, f = 110, e = 1110, d = 11110, c = 111110, b = 1111110, a = 1111111.
In general, when the frequencies are the first n Fibonacci numbers, the sum of the first k Fibonacci numbers is less than the (k + 2)th, so the greedy merge always combines the current subtree with the next Fibonacci number. The kth most frequent character therefore gets a codeword of length k for k ≤ n − 2, and the two least frequent characters both get codewords of length n − 1.
EXAMPLE 13.16. Give an efficient greedy algorithm that finds an optimal vertex cover for a tree in linear time.
Solution: To construct an optimal vertex cover for a tree (a small sketch in code is given below):
• Pick a node v that is adjacent to at least one leaf (i.e., the parent of a leaf).
• Select v and remove v together with all edges and vertices incident to v; repeat until no edges are left.
• Every edge incident to a leaf must be covered, and selecting the leaf's parent v covers at least as many edges as selecting the leaf itself, so selecting v at each step yields an optimal cover.
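A hedged Python sketch of this greedy procedure (my own code; it processes the tree bottom-up from the leaves, which is equivalent to repeatedly picking a leaf's parent) is given below.

def tree_vertex_cover(adj, root=0):
    # adj: dict mapping vertex -> list of neighbours in the tree
    cover = set()
    def dfs(v, parent):
        for u in adj[v]:
            if u != parent:
                dfs(u, v)
        # if the edge (v, parent) is not yet covered, greedily take the parent
        if parent is not None and v not in cover and parent not in cover:
            cover.add(parent)
    dfs(root, None)
    return cover

# assumed example tree: 0 is the root, 3 and 4 are leaves under 1, 5 is a leaf under 2
adj = {0: [1, 2], 1: [0, 3, 4], 2: [0, 5], 3: [1], 4: [1], 5: [2]}
print(tree_vertex_cover(adj))          # {1, 2}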
EXAMPLE 13.17. Solve the instance of the scheduling problem given below and find the optimal schedule for the tasks with the given weights (penalties) and deadlines.
i     1   2   3   4   5   6   7
d_i   4   2   4   3   1   4   6
w_i   70  60  50  40  30  20  10
Solution: Sort the tasks in decreasing order of penalty so that the minimum total penalty is charged.
i     1   2   3   4   5   6   7
d_i   4   2   4   3   1   4   6
w_i   70  60  50  40  30  20  10
Here the number of tasks is 7, so we create Array [1 ... 7].

Task (i)   w_i   d_i   Operation                              Array [1-7]       Penalty
-          -     -     Array [1-7] initialized to 0           0 0 0 0 0 0 0     0
1          70    4     Assign task 1 to slot 4                0 0 0 1 0 0 0     0
2          60    2     Assign task 2 to slot 2                0 2 0 1 0 0 0     0
3          50    4     Assign task 3 to slot 3                0 2 3 1 0 0 0     0
                       (because the 4th slot is not vacant)
4          40    3     Assign task 4 to slot 1                4 2 3 1 0 0 0     0
                       (because the 3rd and 2nd slots are
                       not vacant)
5          30    1     Reject task 5 because it violates      4 2 3 1 0 0 0     30
                       its deadline (slot 1 is already
                       filled)
6          20    4     Reject task 6 because it violates      4 2 3 1 0 0 0     30 + 20 = 50
                       its deadline (slots 4, 3, 2, 1 are
                       already filled)
7          10    6     Assign task 7 to slot 5                4 2 3 1 7 0 0     50
                       (the 5th slot is vacant)
Tasks 5 and 6 are rejected, so the total penalty is 50.

EXERCISES

1. Explain the greedy algorithm and write its complexity.
2. How does the greedy algorithm differ from dynamic programming?
3. Consider the following instance of the knapsack problem: n = 3, m = 20, (P1, P2, P3) = (25, 24, 14) and (w1, w2, w3) = (18, 15, 10).
4. Construct a Huffman tree corresponding to the following set of data.
                            a    b    c    d    e     f
Frequency (in thousands)    48   12   10   15   8     5
Fixed-length codeword       000  001  010  011  100   101
Variable-length codeword    0    101  100  111  1101  1100