The document discusses the Greedy Method in algorithm design, detailing its application in various problems such as the Knapsack Problem and Job Scheduling with Deadlines. It contrasts the Greedy Method with Dynamic Programming, highlighting their differences and approaches to problem-solving. Additionally, it covers Backtracking techniques and their applications, including the N Queens Problem and Graph Coloring.


DESIGN AND ANALYSIS OF ALGORITHM – GREEDY METHOD

Greedy Method-General Method:


 It is a straightforward design technique that can be applied to a wide variety of problems.
 Most of these problems have n inputs and require us to obtain a subset that satisfies some constraints.
 Any subset that satisfies those constraints is called a feasible solution.
 We need to find a feasible solution that either maximizes or minimizes a given objective function. A
feasible solution that does this is called an optimal solution.
 The greedy method suggests that one can devise an algorithm that works in stages, considering one input at
a time.
 A greedy technique in which the algorithm builds the solution as a subset of the inputs (and may, in
general, produce suboptimal solutions) is called the subset paradigm.

Greedy Algorithm:

 The function Select selects an input from a[ ] and removes it.


 The selected input's value is assigned to x.
 Feasible is a Boolean-valued function that determines whether x can be included into the solution
vector.
 The function Union combines x with the solution and updates the objective function.
 Once a particular problem is chosen and the functions Select, Feasible and Union are properly
implemented, the general greedy method yields a solution for that problem.

Knapsack Problem:

 The greedy method is applied to solve the knapsack problem.


 We are given n objects and a knapsack or bag.
 Object i has a weight wi and the knapsack has a capacity m.
 If a fraction xi, 0 ≤ xi ≤ 1, of object i is placed into the knapsack, then a profit of pi * xi is earned.
 The objective is to obtain a filling of the knapsack that maximizes the total profit earned.
 Since the knapsack capacity is m, we require the total weight of all chosen objects to be at most m.

Formally the problem can be stated as:
maximize Σ pi xi subject to Σ wi xi ≤ m and 0 ≤ xi ≤ 1, 1 ≤ i ≤ n.

Knapsack Problem-Example:

Types of Knapsack Problem:

Fractional Knapsack- Example Problem:

Item A B C D
Profit 280 100 120 120
Weight 40 10 20 24
Pi/ Wi 7 10 6 5
Arranging the above table in descending order of Pi/ Wi:

Item B A C D
Profit 100(P1) 280(P2) 120(P3) 120(P4)
Weight 10 (W1) 40(W2) 20(W3) 24(W4)
Pi/ Wi 10 7 6 5
Consider the knapsack capacity W=60
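A sketch of the greedy procedure applied to this instance (our own code, following the table above). Items are taken whole in descending profit/weight order until only a fraction of the next item fits:

```python
def fractional_knapsack(profits, weights, capacity):
    """Greedy fractional knapsack: fill in descending profit/weight order."""
    order = sorted(range(len(profits)),
                   key=lambda i: profits[i] / weights[i], reverse=True)
    total, remaining = 0.0, capacity
    fractions = [0.0] * len(profits)
    for i in order:
        if remaining == 0:
            break
        take = min(weights[i], remaining)   # take as much of item i as fits
        fractions[i] = take / weights[i]
        total += profits[i] * fractions[i]
        remaining -= take
    return total, fractions

# Items A, B, C, D from the table above, capacity W = 60:
# all of B (10) and A (40) fit, then half of C (10 of 20) fills the bag.
profit, x = fractional_knapsack([280, 100, 120, 120], [40, 10, 20, 24], 60)
# profit is 440.0, x is [1.0, 1.0, 0.5, 0.0]
```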

Job Scheduling With Deadlines:


 Greedy Method is applied for this problem.
 Initially we are given a set of n jobs.
 Associated with job i is an integer deadline di >=0 and a profit pi >0.
 For any job i the profit pi is earned iff the job is completed by its deadline.

Conditions:
 To complete a job, one has to process the job on a machine for one unit of time.
 Only one machine is available for processing jobs.
 A feasible solution for this problem is a subset J of jobs such that each job in this subset can be
completed by its deadline.
 An optimal solution is a feasible solution with maximum value.
Algorithm for Job Scheduling:

Algorithm for Job Scheduling with Deadlines:

Example1:
Let number of jobs n = 4
(p1, p2, p3, p4) = (100, 10, 15, 27)
(d1, d2, d3, d4) = (2, 1, 2, 1)
A job with deadline 1 must be done on the first day; a job with deadline 2 may be done on the first or
second day. Here the maximum deadline is 2, so at most two jobs can be scheduled (one per time slot),
and no parallel execution of jobs is allowed.
Feasible   Processing     Value   Explanation
Solution   Sequence
(1, 2)     2, 1           110     2's deadline < 1's deadline
(1, 3)     1, 3 or 3, 1   115     1's deadline = 3's deadline
(1, 4)     4, 1           127     4's deadline < 1's deadline (Maximum Profit)
(2, 3)     2, 3           25      2's deadline < 3's deadline
(2, 4)     impossible     -       both jobs have deadline = 1, and parallel execution is not allowed
(3, 4)     4, 3           42      4's deadline < 3's deadline
(1)        1              100
(2)        2              10
(3)        3              15
(4)        4              27
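The greedy algorithm behind this table (consider jobs in decreasing profit order, placing each in the latest free slot not after its deadline) can be sketched as:

```python
def job_sequencing(profits, deadlines):
    """Greedy job sequencing: try jobs in decreasing profit order and
    place each in the latest free time slot not after its deadline."""
    n = len(profits)
    max_d = max(deadlines)
    slot = [None] * (max_d + 1)          # slot[t] = job scheduled at time t
    order = sorted(range(n), key=lambda i: profits[i], reverse=True)
    total = 0
    for i in order:
        for t in range(min(deadlines[i], max_d), 0, -1):
            if slot[t] is None:          # latest free slot before deadline
                slot[t] = i + 1          # store the 1-based job number
                total += profits[i]
                break
    return total, [j for j in slot[1:] if j is not None]

# Example 1 above: p = (100, 10, 15, 27), d = (2, 1, 2, 1)
profit, schedule = job_sequencing([100, 10, 15, 27], [2, 1, 2, 1])
# profit is 127, schedule is [4, 1]: job 4 at time 1, job 1 at time 2
```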
Example2:

Multistage graphs:

 Let s be the source vertex and t be the sink (destination) vertex.
 Let c(i, j) be the cost of edge (i, j).
 The cost of a path from s to t is the sum of the costs of the edges on the path.
 The multistage graph problem is to find a minimum-cost path from s to t.
 Two approaches for the multistage graph problem:
 Two approaches in multi stage graph:
1. Forward approach.

2. Backward approach.

Multistage graphs- Forward Approach:

Multistage graphs- Forward Approach- Example:
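The figures for this example are not reproduced here, so the following is a small worked sketch on a hypothetical 4-stage graph (the vertices, edges, and costs are our own assumption, not the notes' figure). The array cost[i] holds the minimum cost of reaching t from vertex i, filled in from t backwards, which is exactly the forward-approach recurrence cost(i) = min over edges (i, j) of { c(i, j) + cost(j) }:

```python
import math

def multistage_forward(n, edges, s, t):
    """Forward approach: cost[i] = minimum cost of a path from i to t.
    Vertices are assumed numbered so that every edge (i, j) has i < j."""
    c = {(i, j): w for i, j, w in edges}
    cost = [math.inf] * (n + 1)
    nxt = [None] * (n + 1)               # nxt[i] = next vertex on the best path
    cost[t] = 0
    for i in range(t - 1, s - 1, -1):    # process vertices in reverse order
        for (u, v), w in c.items():
            if u == i and w + cost[v] < cost[i]:
                cost[i] = w + cost[v]
                nxt[i] = v
    path, v = [s], s                     # follow nxt to recover the path
    while v != t:
        v = nxt[v]
        path.append(v)
    return cost[s], path

# Hypothetical 4-stage graph: s = 1, stages {2, 3} and {4, 5}, t = 6
edges = [(1, 2, 4), (1, 3, 2), (2, 4, 2), (3, 4, 7),
         (2, 5, 1), (3, 5, 5), (4, 6, 3), (5, 6, 2)]
best, path = multistage_forward(6, edges, 1, 6)
# best is 7 along the path 1 -> 2 -> 5 -> 6
```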

Multistage graphs- Backward Approach:

Multistage graphs- Backward Approach-Example:


Dynamic Programming- General Method:


 Dynamic programming is an algorithm design method that can be used when the solution to a problem
can be viewed as the result of a sequence of decisions.
 Both Greedy Method & Dynamic Programming solve a problem by breaking it down into several sub-
problems that can be solved recursively.
 Dynamic programming is a bottom-up technique that usually begins by solving the smaller sub-problems,
saving these results, and then reusing them to solve larger sub-problems until the solution to the original
problem is obtained.
 In contrast, the divide-and-conquer approach solves problems in a top-down manner.

Example:

 In Dynamic Programming, an optimal sequence of decisions is obtained by making explicit appeal to The
Principle of Optimality.
 Principle of Optimality states that an optimal sequence of decisions has the property that whatever the
initial state and decisions are, the remaining decisions must constitute an optimal decision sequence with
regard to the state resulting from the first decision.
 Steps in Dynamic Programming:
1. Characterize the structure of an optimal solution.
2. Recursively define the value of an optimal solution.
3. Compute the value of an optimal solution, typically in a bottom-up fashion.
4. Construct an optimal solution from the computed information.
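As a tiny illustration of the bottom-up idea (our own example, not from the notes), computing Fibonacci numbers by solving and saving the smaller subproblems first:

```python
def fib(n):
    """Bottom-up dynamic programming: solve the smallest subproblems
    first, save the results, and reuse them for larger subproblems."""
    table = [0, 1]
    for i in range(2, n + 1):
        table.append(table[i - 1] + table[i - 2])  # reuse saved results
    return table[n]

# fib(10) is 55; each subproblem is computed exactly once
```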

Difference between Greedy Method & Dynamic Programming:

All-pairs shortest paths:

All-pairs shortest paths-Algorithm:
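The algorithm figure is not reproduced here. The standard dynamic-programming algorithm for all-pairs shortest paths is Floyd-Warshall; assuming that is the algorithm intended, a minimal sketch:

```python
import math

def floyd_warshall(n, edges):
    """All-pairs shortest paths. After iteration k, dist[i][j] is the
    shortest i-to-j path using intermediate vertices only from {0..k}."""
    dist = [[math.inf] * n for _ in range(n)]
    for i in range(n):
        dist[i][i] = 0
    for i, j, w in edges:
        dist[i][j] = w
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

# Hypothetical 3-vertex directed example (not the notes' figure)
d = floyd_warshall(3, [(0, 1, 4), (1, 2, 1), (0, 2, 9), (2, 0, 2)])
# d[0][2] is 5 (via vertex 1), d[1][0] is 3 (via vertex 2)
```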

Example:

Single-Source Shortest Paths:
 Graphs can be used to represent the highway structure of a state or country with vertices representing
cities and edges representing sections of highway.
 The edges can then be assigned weights which may be either the distance between the two cities
connected by the edge or the average time to drive along that section of highway.

Single-Source Shortest Paths- Algorithm:
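The algorithm figure is not reproduced here. Assuming the intended algorithm is Dijkstra's single-source shortest-path algorithm (the classic choice for nonnegative edge weights), a sketch on a hypothetical directed graph of our own:

```python
import heapq

def dijkstra(n, adj, s):
    """Dijkstra's algorithm (assumes nonnegative edge weights).
    adj[u] is a list of (v, w) pairs; vertices are numbered 1..n."""
    dist = {v: float('inf') for v in range(1, n + 1)}
    dist[s] = 0
    heap = [(0, s)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue                      # skip stale heap entries
        for v, w in adj.get(u, []):
            if d + w < dist[v]:           # relax edge (u, v)
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist

# Hypothetical directed graph (not the one in the notes' figure)
adj = {1: [(2, 7), (3, 9), (6, 14)], 2: [(3, 10), (4, 15)],
       3: [(4, 11), (6, 2)], 4: [(5, 6)], 6: [(5, 9)]}
d = dijkstra(6, adj, 1)
# d[4] is 20 (path 1-3-4) and d[5] is 20 (path 1-3-6-5)
```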

Example1:

In this example, the shortest path from the source (vertex 1) to the destination (vertex 7) has cost 42.

Example2:

*****************

The traveling sales person problem:
 Let G=(V,E) be a directed graph with edge costs Cij.
 The variable Cij is defined such that
 Cij >0 for all i & j.
 Cij = ∞ if (i, j) ∉ E.
 Let |V|= n and assume that n>1.
 A tour of G is a directed simple cycle that includes every vertex in V.
 The cost of a tour is the sum of the cost of the edges on the tour.
 The traveling sales person problem is to find a tour of minimum cost.
 Notations:
 g(i, S) = length of the shortest path starting at vertex i, going through all vertices in S, and
terminating at vertex 1.
 g(1, V-{1}) = length of an optimal sales person tour.
 Principle of Optimality gives the recurrence:
g(i, S) = min over j in S of { Cij + g(j, S – {j}) }, with g(i, ∅) = Ci1,
so the optimal tour length is g(1, V – {1}) = min over k of { C1k + g(k, V – {1, k}) }.
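A direct implementation of the g(i, S) recurrence (the Held-Karp dynamic program) can be sketched as follows; the 4-city cost matrix is an assumed example, not necessarily the one in the notes' figure:

```python
from itertools import combinations

def tsp_held_karp(cost):
    """Dynamic-programming TSP. The tour starts and ends at vertex 0.
    g[(i, S)] = length of the shortest path from i through every vertex
    in the frozenset S, terminating at vertex 0."""
    n = len(cost)
    g = {}
    for i in range(1, n):
        g[(i, frozenset())] = cost[i][0]          # base case: g(i, empty) = Ci0
    for size in range(1, n - 1):                  # fill g for growing sets S
        for S in combinations(range(1, n), size):
            S = frozenset(S)
            for i in range(1, n):
                if i in S:
                    continue
                g[(i, S)] = min(cost[i][j] + g[(j, S - {j})] for j in S)
    full = frozenset(range(1, n))
    return min(cost[0][j] + g[(j, full - {j})] for j in full)

# Assumed 4-city cost matrix
cost = [[0, 10, 15, 20],
        [5, 0, 9, 10],
        [6, 13, 0, 12],
        [8, 8, 9, 0]]
best = tsp_held_karp(cost)
# best is 35, achieved by the tour 0 -> 1 -> 3 -> 2 -> 0
```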
Example:

Backtracking- General Method:
 The name Backtrack was first coined by D. H. Lehmer in the 1950s.
 It is a method of determining the correct solution to a problem by systematically examining the available paths.
 If a particular path leads to an unsuccessful solution, the algorithm backs up to the previous decision point and
examines the next alternative in order to find the correct solution.
 In many applications of the backtrack method, the desired solution is expressible as an n-tuple (x1, x2,.. xn) where
xi is chosen from some finite set Si. Often the problem to be solved calls for finding one vector that maximizes or
minimizes or satisfies a criterion function P(x1,x2… xn)
 In a brute-force algorithm, we consider all feasible solutions when searching for an optimal solution.
 A backtracking algorithm has the ability to yield the same answer with far fewer than m trials.
 Many of the problems we solve using backtracking require that all solutions satisfy a complex set of
constraints.
 Two types of Constraints
1. Explicit Constraints are rules that restrict each xi to take on values only from a given set.
Eg: xi >= 0, i.e. Si = {all non-negative real numbers}; or
xi = 0 or 1, i.e. Si = {0, 1}
2. Implicit Constraints are rules that determine which of the tuples in the solution space of a given
instance satisfy the criterion function. Thus implicit constraints describe the way in which the xi must
relate to each other.

Some Important Definitions:

Recursive Backtracking Algorithm:
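Since the algorithm figure is not reproduced here, the recursive scheme can be sketched as follows; the callback names (candidates, bound, report) are our own, not the notes':

```python
def backtrack(k, x, n, candidates, bound, report):
    """Recursive backtracking scheme: x[0:k] is the partial solution.
    candidates(k, x) yields legal values for x[k]; bound(k, x) prunes
    unpromising nodes; report(x) is called for each complete solution."""
    if k == n:
        report(list(x))
        return
    for value in candidates(k, x):
        x.append(value)
        if bound(k + 1, x):          # only descend into promising nodes
            backtrack(k + 1, x, n, candidates, bound, report)
        x.pop()                      # undo the choice: backtrack

# Toy instantiation (our own example): subsets of {1, 2, 3} summing to 3,
# encoded as 0/1 inclusion vectors over the weights w.
w, target = [1, 2, 3], 3
solutions = []
backtrack(0, [], 3,
          candidates=lambda k, x: (0, 1),
          bound=lambda k, x: sum(w[i] for i in range(k) if x[i]) <= target,
          report=lambda x: (solutions.append(x)
                            if sum(w[i] for i in range(3) if x[i]) == target
                            else None))
# solutions is [[0, 0, 1], [1, 1, 0]]: the subsets {3} and {1, 2}
```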

Iterative Backtracking Algorithm:

Applications of Backtracking:
 Backtracking method is applied to solve various problems like:
1. N Queens Problem
2. Sum of Subsets Problem
3. Graph Coloring
4. Hamiltonian Cycles
5. Knapsack Problem

N Queens Problem (8 Queens Problem)
 The N Queens Problem requires that:
1. N queens be placed on an N X N chess board.
2. No two queens lie in the same row, the same column, or on the same diagonal.
3. Equivalently, no two queens attack each other.

4-Queens Problem solution:

4- Queens Problem –state space tree:

N-Queens Problem- algorithm1: Placing a new queen in kth row & ith column.

N-Queens Problem- algorithm2: All solutions for N Queens Problem.
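Since the algorithm figures are not reproduced here, both pieces can be sketched together in Python: place corresponds to algorithm1 (testing row k, column i against the queens already placed) and n_queens to algorithm2 (generating all solutions). Columns are 0-based in this sketch:

```python
def place(k, i, x):
    """True if a queen can go in row k, column i, given queens x[0:k]."""
    for j in range(k):
        if x[j] == i or abs(x[j] - i) == abs(j - k):  # same column / diagonal
            return False
    return True

def n_queens(n):
    """Backtracking: x[k] = column of the queen in row k; all solutions."""
    solutions, x = [], [0] * n
    def solve(k):
        if k == n:
            solutions.append(list(x))
            return
        for i in range(n):
            if place(k, i, x):
                x[k] = i
                solve(k + 1)
    solve(0)
    return solutions

four = n_queens(4)
# the 4-queens problem has exactly 2 solutions: [1, 3, 0, 2] and [2, 0, 3, 1]
```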

8-Queens Problem solution:

3.2.2 Sum of subset problem

Sum of Subsets Problem-Algorithm

Sum of Subsets Problem-Example
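Since the algorithm and example figures are not reproduced here, a backtracking sketch follows; the instance w = (11, 13, 24, 7), m = 31 is an assumed example, not necessarily the notes' figure:

```python
def sum_of_subsets(w, m):
    """Backtracking for sum of subsets. Weights are sorted ascending;
    a node is pruned when the partial sum plus the next weight exceeds m,
    or the partial sum plus all remaining weights cannot reach m."""
    w = sorted(w)
    total, solutions = sum(w), []

    def solve(k, s, rest, chosen):
        if s == m:
            solutions.append(list(chosen))
            return
        if k == len(w) or s + w[k] > m or s + rest < m:
            return                       # dead end: prune this subtree
        chosen.append(w[k])              # include w[k]
        solve(k + 1, s + w[k], rest - w[k], chosen)
        chosen.pop()                     # exclude w[k]
        solve(k + 1, s, rest - w[k], chosen)

    solve(0, 0, total, [])
    return solutions

# Assumed instance: which subsets of (11, 13, 24, 7) sum to 31?
subsets = sum_of_subsets([11, 13, 24, 7], 31)
# subsets is [[7, 11, 13], [7, 24]]
```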

3.2.3 Graph coloring:
 Let G be a graph and m be a positive integer.
 The problem is to determine whether the nodes of G can be colored in such a way that no two adjacent nodes
have the same color while using at most m colors; the smallest such m is called the chromatic number of G.
 If d is the maximum degree of the vertices in G, then G can be colored with at most d + 1 colors.
 The degree of a node is the number of edges connected to that node.

Graph coloring- m coloring algorithm
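The figure for the m-coloring algorithm is not reproduced here; a backtracking sketch follows (the adjacency matrices below are assumed examples, not the notes' figures):

```python
def m_coloring(adj, m):
    """Backtracking m-coloring. adj is an adjacency matrix; colors are
    numbered 1..m; returns every valid assignment of colors to vertices."""
    n = len(adj)
    x, solutions = [0] * n, []

    def solve(k):
        if k == n:
            solutions.append(list(x))
            return
        for c in range(1, m + 1):
            # c is legal if no already-colored neighbour of k uses it
            if all(not (adj[k][j] and x[j] == c) for j in range(k)):
                x[k] = c
                solve(k + 1)
                x[k] = 0                 # undo the choice: backtrack

    solve(0)
    return solutions

# Assumed example: a 4-cycle 0-1-2-3-0 is 2-colorable in exactly two ways
cycle4 = [[0, 1, 0, 1],
          [1, 0, 1, 0],
          [0, 1, 0, 1],
          [1, 0, 1, 0]]
colorings = m_coloring(cycle4, 2)
# colorings is [[1, 2, 1, 2], [2, 1, 2, 1]]
```

A triangle, by contrast, has no 2-coloring: every pair of its vertices is adjacent, so three colors are required.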

Graph coloring- state space tree

Graph coloring- generating color algorithm

Graph coloring- another example

