DAA Assignment (Module 4)

Module - 4

Combinatorial Optimization Problem


A combinatorial optimization problem has the following property:
It has a cost function which is to be minimized,
OR
it has a benefit function which is to be maximized.
Example 1: 8-queens problem
Place 8 queens on an 8x8 chessboard such that no two queens are in the same row, the same column,
or the same diagonal. This problem contains only constraints; it has no objective function.
Example 2: Knapsack problem
Here the benefit function is to be maximized.
Example 3: Travelling salesperson problem
(i) Input: Weighted Graph
(ii) Output: Tour that visits every vertex exactly once
(iii) Objective: Minimize the total distance (or weight) of the tour
(iv) Applications: Robotics, computational geometry

Strategies for solving combinatorial optimization problems


1. Backtrack search: This is a general strategy which can be applied to any problem.
2. Branch and Bound: This is a more restricted strategy, but it is more efficient than backtracking.
3. Dynamic programming: This is more efficient than the Branch and Bound strategy,
   but here we have to do more analytical work.
4. Greedy strategy

Backtrack search (Backtracking)


• Backtracking is a very general method.
• It generates all possible solutions. The set of all of them is called the search space (or solution space).
• It can take a long time.
• The search tree is explored level by level, down to the leaf nodes.
Steps:
1. Check if conditions (or constraints) are satisfied.
2. If satisfied then evaluate the cost.
3. Update the best cost found so far when the current cost is better.

How to program:
1. Representation of the objects.
2. Procedure to fill in the values of the objects in all possible ways.
3. Recurse (or repeat).
4. Procedure to check the solution at each leaf node.
5. Procedure to evaluate the cost.
6. Procedure to update the previously stored best value (backtrack).
Example 1: N-Queens problem. Consider N = 4, i.e. the 4-queens problem.
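A minimal backtracking sketch for this 4-queens instance, written in Python; the function names and the row-by-row column-list representation are illustrative choices, not part of the original notes:

def solve_n_queens(n=4):
    """Backtracking search: place one queen per row, trying every column."""
    solutions = []
    cols = []  # cols[r] = column of the queen placed in row r (the partial solution)

    def safe(row, col):
        # Constraint check: no already-placed queen attacks (row, col).
        for r, c in enumerate(cols):
            if c == col or abs(c - col) == abs(r - row):
                return False
        return True

    def place(row):
        if row == n:                      # leaf node: all constraints satisfied
            solutions.append(list(cols))
            return
        for col in range(n):              # try every value for the current object
            if safe(row, col):
                cols.append(col)          # extend the partial solution
                place(row + 1)            # recurse
                cols.pop()                # backtrack

    place(0)
    return solutions

print(solve_n_queens(4))   # [[1, 3, 0, 2], [2, 0, 3, 1]]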

Example 2: TSP (Travelling Salesperson Problem)

Example 3: Knapsack problem (n = 4)


Branch and Bound
Question: Do we need to search all possible solution?
Answer: No
Pruning: It is a technique that reduces the size of the search tree.
It removes branches that are not necessary (branches that cannot lead to a better solution).

Example: TSP (Travelling Salesman Problem)
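A small branch-and-bound sketch for TSP in Python. The cost-matrix format, the function name and the simple bound used here (prune a branch as soon as its partial cost already reaches the best complete tour found so far) are my own illustrative choices:

import math

def tsp_branch_and_bound(dist):
    """dist[i][j] = distance between cities i and j; returns (best_cost, best_tour)."""
    n = len(dist)
    best_cost = math.inf
    best_tour = None

    def extend(tour, cost):
        nonlocal best_cost, best_tour
        # Bound: the cost so far can only grow, so prune this branch
        # once it already meets or exceeds the best complete tour.
        if cost >= best_cost:
            return
        if len(tour) == n:                       # complete tour: close the cycle
            total = cost + dist[tour[-1]][tour[0]]
            if total < best_cost:
                best_cost, best_tour = total, tour + [tour[0]]
            return
        for city in range(n):
            if city not in tour:
                extend(tour + [city], cost + dist[tour[-1]][city])

    extend([0], 0)                               # fix city 0 as the starting city
    return best_cost, best_tour

dist = [[0, 10, 15, 20],
        [10, 0, 35, 25],
        [15, 35, 0, 30],
        [20, 25, 30, 0]]
print(tsp_branch_and_bound(dist))   # (80, [0, 1, 3, 2, 0])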

Difference between Backtracking and Branch & Bound

Backtracking:
1. Finds all possible solutions.
2. Cost is calculated at the leaf nodes.
3. We may do more work because more nodes of the tree are visited.

Branch & Bound:
1. Does not generate all possible solutions.
2. Cost is calculated at every node.
3. Pruning may reduce the number of nodes to be visited.

Issues in Backtracking and Branch & Bound


1. How to organise the search space or solution space (which object to choose first)
2. At a given node, which edge to explore (or visit) first.
STRING MATCHING

• It is also called pattern matching.


• Two strings are given: a Text and a Pattern.
  The task is to find the location of the Pattern in the Text.
• Example:
Text : computerscienceandengineering
Pattern: science
Applications
• Searching for keywords in a file
• Search engines (like Google and Yahoo)
• Database searching

String Matching Algorithms:
1. Naive string matching algorithm
2. Rabin-Karp algorithm
3. Knuth-Morris-Pratt algorithm

1. Naive string matching algorithm


• It is also called the Brute Force algorithm.
• It compares the pattern with the text, one character at a time, from left to right.
• When the first mismatch occurs, the pattern is shifted to the next character of the text, and
  comparison starts again from the first character of the pattern.

Position:  1 2 3 4 5 6 7 8 9
Text:      c a c b c a c a b

Pattern:         b c a c      (shift s = 3)

Here the pattern is found with shift s = 3, i.e. after shifting the pattern 3 positions we find the match.

Algorithm NAÏVE (Text, Pattern)

    n = length[Text]
    m = length[Pattern]
    for s = 0 to n – m
        if Pattern[1 . . . m] == Text[s + 1 . . . s + m]    // true only if all m characters match
            print "location of Pattern is found with shift s"

Time Complexity = O(n m)
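The same idea as a runnable Python sketch (the function name and the use of 0-based shifts are assumptions of this sketch, not from the notes):

def naive_match(text, pattern):
    """Return every shift s at which pattern occurs in text (0-based)."""
    n, m = len(text), len(pattern)
    shifts = []
    for s in range(n - m + 1):            # try every possible shift
        if text[s:s + m] == pattern:      # compare m characters, left to right
            shifts.append(s)
    return shifts

print(naive_match("cacbcacab", "bcac"))   # [3], matching the example above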


2. Rabin-Karp Algorithm
• Consider the decimal number system. Each character is a digit {0, 1, 2, . . . , 9} in radix d = 10.
  The character string 31415 corresponds to the decimal number 31,415.
• The pattern is viewed as a sliding window. This window slides along the text to find a window with the
  same remainder modulo q.
• If the remainder is the same but the pattern does not actually match, it is called a spurious hit.
Example: Working modulo q = 13, how many spurious hits does the Rabin-Karp matcher encounter in the
Text = 2359023141526739921 for the Pattern = 31415?

Solution: p = P mod q = 31415 mod 13 = 7. Whenever a window's remainder is 7, we have a hit, which must then be verified character by character.

Step    Next term (t_s+1)    Remainder (term mod 13)    Status    Remark
1 23590 8 X
2 35902 9 X
3 59023 3 X
4 90231 11 X
5 02314 0 X
6 23141 1 X
7 31415 7 Match Valid Match
8 14152 8 X
9 41526 4 X
10 15267 5 X
11 52673 10 X
12 26739 11 X
13 67399 7 Match Invalid Match or Spurious Hit
14 73992 9 X
15 39921 11 X

So the Rabin-Karp matcher encounters 1 spurious hit (at step 13); the only valid match is at step 7.
Algorithm Rabin-Karp (T, P, d, q)        // T = Text , P = Pattern , d = radix , q = working modulo
    n = length[T]
    m = length[P]
    h = d^(m-1) mod q
    p = 0
    t_0 = 0
    for i = 1 to m                       // Preprocessing
        p   = (d · p   + P[i]) mod q
        t_0 = (d · t_0 + T[i]) mod q
    for s = 0 to n – m                   // Matching
        if t_s == p
            if P[1 . . . m] == T[s + 1 . . . s + m]
                print "location of pattern is found with shift s"
        if s < n – m
            t_(s+1) = (d · (t_s – T[s + 1] · h) + T[s + m + 1]) mod q

Analysis of Rabin-Karp Algorithm:

Time to compute the hash values of all (n – m + 1) windows of T[1 . . . n] = O(n).
Time to verify a hit against the pattern = O(m).
So, the total (expected) time complexity = O(m + n).
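A runnable Python sketch of the same rolling-hash idea, assuming the text and pattern consist of decimal digits as in the example above (the function name and return value are illustrative):

def rabin_karp(text, pattern, d=10, q=13):
    """Print every shift where pattern occurs; return the number of spurious hits."""
    n, m = len(text), len(pattern)
    h = pow(d, m - 1, q)          # d^(m-1) mod q, weight of the leading digit
    p = t = 0
    for i in range(m):            # preprocessing: hash of pattern and of the first window
        p = (d * p + int(pattern[i])) % q
        t = (d * t + int(text[i])) % q
    spurious = 0
    for s in range(n - m + 1):
        if t == p:                                # hash hit
            if text[s:s + m] == pattern:
                print("pattern found with shift", s)
            else:
                spurious += 1                     # same remainder, no real match
        if s < n - m:                             # roll the window one digit to the right
            t = (d * (t - int(text[s]) * h) + int(text[s + m])) % q
    return spurious

print(rabin_karp("2359023141526739921", "31415"))
# prints "pattern found with shift 6" (0-based) and returns 1 spurious hit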
3. Knuth-Morris-Pratt (KMP) Algorithm
• Discovered by Donald Knuth, Vaughan Pratt and James H. Morris.
• This algorithm was developed by analysing the Naive algorithm.
• It reuses information that is thrown away by the Naive algorithm.
• Efficient because it minimizes the total number of comparisons.
• Compares the pattern with the text from left to right, but shifts the pattern intelligently (sometimes by
  more than one position).
• By calculating the prefix function of the pattern, it avoids repeated comparisons.
• Time complexity = O(m + n).

Prefix function
The prefix function is computed from the Pattern.
For each position j of the pattern, we compute a value Prefix[j].
This value indicates how much of the last comparison can be reused.
It decides the next shift during comparison.
Example: P = a b a    (length of the Pattern = 3)

    Prefixes of P: a, ab, aba
    Suffixes of P: a, ba, aba

Prefix[j] = (length of the longest proper prefix of P[1..j] that is also a suffix of P[1..j]) + 1,
i.e. the position in the pattern from which comparison resumes after P[1..j] has matched and a mismatch
occurs at position j + 1.

Algorithm Prefix (P)

    Prefix[1] = 1
    i = 2 , j = 1
    while ( i <= m )                  // m = no. of characters in pattern P
    {
        if P[ j ] == P[ i ]
        {
            Prefix[ i ] = j + 1
            i = i + 1
            j = j + 1
        }
        else if ( j > 1 )
        {
            j = Prefix[ j - 1 ]
        }
        else
        {
            Prefix[ i ] = 1
            i = i + 1
        }
    }

Example: P = a b a c a b

    j          1   2   3   4   5   6
    P[j]       a   b   a   c   a   b
    Prefix[j]  1   1   2   1   2   3

For j = 1, 2, 3, 4, 5, 6 in the above algorithm, we find Prefix[ j ].
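A direct Python transcription of the Prefix algorithm above, keeping the notes' 1-based convention in which Prefix[j] equals the length of the longest border of P[1..j] plus one (the padding trick and the function name are mine):

def prefix_function(P):
    """Notes' convention: 1-based pattern, Prefix[j] = longest border length + 1."""
    m = len(P)
    P = " " + P                     # pad so P[1..m] matches the pseudocode indices
    prefix = [0] * (m + 1)          # prefix[1..m] will be filled in
    prefix[1] = 1
    i, j = 2, 1
    while i <= m:
        if P[j] == P[i]:            # the border grows by one character
            prefix[i] = j + 1
            i += 1
            j += 1
        elif j > 1:                 # fall back to the next shorter border
            j = prefix[j - 1]
        else:                       # no border at all
            prefix[i] = 1
            i += 1
    return prefix[1:]

print(prefix_function("abacab"))    # [1, 1, 2, 1, 2, 3], as in the table above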
Example:
• Here 5 characters of the pattern have matched successfully for shift S.
• The 6th character of the pattern fails to match the corresponding text character.
• From the prefix function, we find that shift S+1 cannot produce a match, so shift S+1 is skipped.
  Hence the next shift tried is S+2.
• For shift S+2, the first 3 characters need not be matched again.

KMP Algorithm ( T, P )                   // T = Text, P = Pattern

    Prefix (P)                           // Calling the Prefix function
    i = 1
    j = 1
    while ( i <= n )                     // n = length of text T
    {
        if P[ j ] == T[ i ]              // characters match
        {
            i = i + 1
            j = j + 1
            if ( j == m + 1 ) then       // all m characters matched, at shift i – m – 1
            {
                print "PATTERN IS FOUND"
                j = Prefix[ m ]          // continue searching for further occurrences
            }
        }
        else if ( j > 1 )                // mismatch after a partial match
            j = Prefix[ j - 1 ]
        else                             // mismatch at the first character of the pattern
            i = i + 1
    }
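A Python sketch of the KMP matcher that reuses prefix_function from the sketch above; the reset j = pref[m] after a full match (so that further occurrences are also found) and the 0-based return values are additions of this sketch:

def kmp_search(T, P):
    """KMP matcher in the notes' 1-based convention; returns the 0-based shifts."""
    n, m = len(T), len(P)
    pref = [0] + prefix_function(P)   # pad so that pref[j] is the notes' Prefix[j]
    T, P = " " + T, " " + P           # pad so T[1..n], P[1..m] match the pseudocode
    i = j = 1
    shifts = []
    while i <= n:
        if P[j] == T[i]:              # characters match: advance both pointers
            i += 1
            j += 1
            if j == m + 1:            # all m pattern characters have matched
                shifts.append(i - m - 1)   # 0-based shift of this occurrence
                j = pref[m]                # reuse the longest border, keep scanning
        elif j > 1:                   # mismatch after a partial match
            j = pref[j - 1]
        else:                         # mismatch at the first pattern character
            i += 1
    return shifts

print(kmp_search("computerscienceandengineering", "science"))   # [8]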
Polynomial (P): These problems are solvable in polynomial time.
Non-deterministic Polynomial (NP): These problems are not known to be solvable in polynomial time,
but they are verifiable in polynomial time (that means we can verify whether a given solution is
correct or incorrect in polynomial time).
Polynomial time means the time complexity is a polynomial function, i.e. O(n^k) for some constant k. Examples: O(n^2), O(n^3), etc.
Non-polynomial here means the time complexity is an exponential function. Examples: O(2^n), O(3^n), etc.
NP Complete (NPC): A problem is in class NPC if
(i) it is in NP, and
(ii) it is as hard as any problem in NP (i.e. it is NP-hard).
Some NP-complete problems are listed below:
• TSP (Travelling Salesman Problem): A salesman wants to visit some cities and come back to the starting
  city by travelling the minimum distance.
• Hamiltonian cycle: a cycle in a graph which contains each vertex exactly once.
• 3-CNF Satisfiability: (x1 ∨ ¬x2 ∨ x3) ∧ (x1 ∨ x2 ∨ ¬x3) is a boolean formula in 3-CNF (each clause is an OR of 3 literals).
  The above formula is satisfiable for x1 = 1, x2 = 0, x3 = 1.
NOTE:
1. No polynomial time algorithm has yet been discovered for any NPC problem, but no one has proved that
no polynomial time algorithm can exist for any one of them.
2. If any NPC problem can be solved in polynomial time, then every NPC problem can be solved in
polynomial time.
3. We know that problems in class P are solvable in polynomial time, so obviously they can also be verified
in polynomial time. In that sense, we say P ⊆ NP.

Polynomial time verification: We will now discuss verification algorithms, which verify membership in languages.
Example: Hamiltonian cycle
Let n = number of vertices in the graph.
A brute-force solution algorithm has to check on the order of n! candidate cycles,
so that approach does not run in polynomial time.
But a verification algorithm can verify a given solution as follows.
1. Verify that every vertex of the graph appears exactly once in the given solution. [ This takes O(n) time ]
2. Verify that each edge of the given solution belongs to the graph. [ This takes O(n^2) time ]
Total time of the verification algorithm = O(n) + O(n^2) = O(n^2), which is polynomial time.
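A hedged Python sketch of such a verification algorithm; the edge-set representation and the names are illustrative, not from the notes:

def verify_hamiltonian_cycle(vertices, edges, cycle):
    """Check in polynomial time that `cycle` is a Hamiltonian cycle of the graph."""
    # The proposed cycle must list every vertex exactly once.
    if sorted(cycle) != sorted(vertices):
        return False
    # Every consecutive pair (including the wrap-around) must be an edge of the graph.
    edge_set = {frozenset(e) for e in edges}
    return all(frozenset((cycle[i], cycle[(i + 1) % len(cycle)])) in edge_set
               for i in range(len(cycle)))

V = [1, 2, 3, 4]
E = [(1, 2), (2, 3), (3, 4), (4, 1), (1, 3)]
print(verify_hamiltonian_cycle(V, E, [1, 2, 3, 4]))   # True
print(verify_hamiltonian_cycle(V, E, [1, 3, 2, 4]))   # False: (2, 4) is not an edge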

Verification algorithm: a two-argument algorithm A that verifies a proposed solution, with A(x, y) = 1
when the certificate y confirms the input x, where x = input (problem instance) and y = certificate (proposed solution).
The language of a verification algorithm A is given below:
L = { x ∈ {0, 1}* : there exists y ∈ {0, 1}* such that A(x, y) = 1 }

A language L belongs to P if there exists a polynomial time algorithm to decide L.
A language L belongs to NP if there exists a polynomial time algorithm to verify L.
co-NP is the set of languages L such that the complement of L is in NP.
So, if L ∈ co-NP then (complement of L) ∈ NP.
Prove that P ⊆ co-NP:
L ∈ P  ⇒  (complement of L) ∈ P, since P is closed under complement  ⇒  (complement of L) ∈ NP  ⇒  L ∈ co-NP.
Reducibility:
Reducibility means conversion or transformation. Reducibility is used when dealing with NPC problems. Suppose
we do not know how to handle an NPC problem directly. In that case, we refer to a previously studied NPC
problem and reduce that known problem to our problem. In general, a problem Q can be reduced to another problem Q'.
Example: Suppose, we don’t know how to solve linear equation. But, we know the solution of quadratic
equation. We can reduce a linear equation ax + b = 0 to a quadratic equation 0•x2 + ax + b = 0
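A toy Python illustration of this reduction idea, in the spirit of the example above: a linear-equation solver that simply hands its instance to an existing quadratic solver. All names are illustrative, and the quadratic solver must tolerate the degenerate case a = 0:

import math

def solve_quadratic(a, b, c):
    """Known solver for a*x^2 + b*x + c = 0 (real roots only)."""
    if a == 0:                      # degenerate case: the equation is actually linear
        return [-c / b]
    disc = b * b - 4 * a * c
    if disc < 0:
        return []
    return [(-b + math.sqrt(disc)) / (2 * a), (-b - math.sqrt(disc)) / (2 * a)]

def solve_linear(a, b):
    """Reduction: map the instance a*x + b = 0 to the instance 0*x^2 + a*x + b = 0."""
    return solve_quadratic(0, a, b)

print(solve_linear(2, -6))          # [3.0]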
Definition: A language L1 is polynomial time reducible to a language L2 (written as L1 ≤p L2) if there exists a
polynomial time computable function f such that x ∈ L1 if and only if f(x) ∈ L2. f is called the reduction function.
The algorithm that computes f is called the reduction algorithm.

[Figure: the reduction function f maps instances of L1 to instances of L2]

Now, we can define NP completeness in terms of reducibility:


A language L1 is NP-complete if
1. L1 ∈ NP, and
2. L2 ≤p L1 (where L2 is another NPC problem whose status is already known, i.e. a known NPC problem reduces to L1).
If a language L1 satisfies property 2 only, then we say that L1 is NP-hard.

Reduction of NPC problems


CIRCUIT-SAT
     |
    SAT
     |
 3CNF-SAT
   /      \
CLIQUE   SUBSET-SUM
   |
VERTEX COVER
   |
HAM-CYCLE
   |
  TSP

The above figure shows that HAM-CYCLE can be reduced to TSP, 3CNF-SAT can be reduced to SUBSET-
SUM etc.

SUBSET-SUM problem: Given a set S and a target number t, we have to find a subset of S whose elements sum to
exactly t (the decision version asks whether such a subset exists).
Example: S = { 6, 3, 2, 7, 4, 5, 1 }, t = 8. The solutions are { 6, 2 }, { 3, 5 }, { 2, 5, 1 }, { 3, 4, 1 }, etc.
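A brute-force Python sketch that enumerates the subsets of S summing to t; it runs in exponential time, which illustrates why this naive approach is impractical for large sets (the function name is mine):

from itertools import combinations

def subset_sum(S, t):
    """Return all subsets of S whose elements sum to t (brute force, O(2^n) subsets)."""
    hits = []
    for r in range(1, len(S) + 1):           # subset sizes 1 .. n
        for combo in combinations(S, r):
            if sum(combo) == t:
                hits.append(combo)
    return hits

print(subset_sum([6, 3, 2, 7, 4, 5, 1], 8))
# [(6, 2), (3, 5), (7, 1), (3, 4, 1), (2, 5, 1)]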
Approximation algorithm:
Sometimes a problem is NP-complete but we still have to solve it somehow. There are 3 ways of handling an
NPC problem.
1. If the input is very small, then an algorithm with exponential running time may be satisfactory.
2. Solve those special cases of the problem which can be handled in polynomial time.
3. Find a near-optimal solution in polynomial time. Such algorithms are called approximation algorithms.
Let n = input size, C = cost of solution given by approximation algorithm, C* = cost of optimal solution
Approximation Ratio = max ( C / C* , C* / C )
NOTE:
• The smaller the approximation ratio, the better the solution.
• An approximation algorithm with ratio 1 produces an optimal solution.

Q: Write an approximation algorithm for TSP


Algorithm TSP (G, C)
// G = Graph , C = Cost matrix
1. Select a root vertex r of graph G
2. Find the minimum spanning tree T of G starting from r
3. Find the pre-order traversal L of T
4. Return the Hamiltonian cycle H that visits the vertices in the order L
(When the cost function C satisfies the triangle inequality, the returned tour costs at most twice the optimum.)
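A compact Python sketch of this approximation algorithm, using Prim's algorithm for the minimum spanning tree and an explicit preorder walk; it assumes a complete graph given as a symmetric cost matrix satisfying the triangle inequality, and the function names are my own:

import math

def approx_tsp(C, r=0):
    """2-approximation for metric TSP: build an MST rooted at r, then walk it in preorder."""
    n = len(C)

    # Step 2: minimum spanning tree T rooted at r (Prim's algorithm).
    in_tree = [False] * n
    best = [math.inf] * n          # cheapest known edge connecting each vertex to the tree
    parent = [None] * n
    best[r] = 0
    children = {v: [] for v in range(n)}
    for _ in range(n):
        u = min((v for v in range(n) if not in_tree[v]), key=lambda v: best[v])
        in_tree[u] = True
        if parent[u] is not None:
            children[parent[u]].append(u)
        for v in range(n):
            if not in_tree[v] and C[u][v] < best[v]:
                best[v], parent[v] = C[u][v], u

    # Step 3: preorder traversal L of T.
    L, stack = [], [r]
    while stack:
        u = stack.pop()
        L.append(u)
        stack.extend(reversed(children[u]))

    # Step 4: the Hamiltonian cycle H visits the vertices in the order given by L.
    H = L + [r]
    cost = sum(C[H[i]][H[i + 1]] for i in range(n))
    return H, cost

# Four cities forming a square of side 2 whose "diagonal" costs 3 (a metric instance).
C = [[0, 2, 3, 2],
     [2, 0, 2, 3],
     [3, 2, 0, 2],
     [2, 3, 2, 0]]
print(approx_tsp(C))   # ([0, 1, 2, 3, 0], 8) -- here the walk happens to return an optimal tour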

Abstract Problem:
An abstract problem Q is a binary relation on a set I of problem instances and a set S of problem solutions.
Example: Shortest path problem
Instance: <G, u, v>, where G = graph, u = source, v = destination
Solution: a path <u, v1, v2, . . . , v>

Abstract decision Problem:


An abstract decision problem is a function which maps the instance set I to the solution set {0, 1}.
Example: Does there exist a path from u to v having at most k edges?
Instance: i = <G, u, v, k>
PATH(i) = 1 , if such a path exists
        = 0 , otherwise

Optimization problem:
An abstract problem is an optimization problem if some value is to be minimized or maximized.
