
DESIGN AND ANALYSIS OF ALGORITHM

Unit 1.1 Topics: Algorithm- Algorithm specification- Performance analysis

Algorithm:
➢ An algorithm is a finite set of instructions that accomplishes a particular task.
➢ In addition, all algorithms must satisfy the following criteria:

➢ An algorithm is composed of a finite set of steps, each of which may require one or more operations.
➢ Algorithms produce one or more outputs and have zero or more inputs that are externally supplied.
➢ Each operation must be definite, meaning that it must be perfectly clear what should be done.
➢ They terminate after a finite number of operations.
➢ Algorithms that are definite and effective are also called computational procedures.
➢ A program is the expression of an algorithm in a programming language.

What is an Algorithm?

1. How to devise algorithms- Creating an algorithm is an art which may never be fully automated.
2. How to validate algorithms- Once an algorithm is devised, it is necessary to show that it computes the
correct answer for all possible legal inputs. We refer to this process as algorithm validation.
3. How to analyze algorithm- Analysis of algorithms or performance analysis refers to the task of determining
how much computing time and storage an algorithm requires.
4. How to test a program – Testing a program consists of two phases: debugging and profiling.
➢ Debugging is the process of executing programs on sample data sets to determine whether faulty results
occur and, if so, to correct them.
➢ Profiling or performance measurement is the process of executing a correct program on data sets and
measuring the time and space it takes to compute the results.

Algorithm specification:

➢ We can describe an algorithm in many ways.


➢ We can use a natural language like English, although if we select this option we must make sure that the
resulting instructions are definite.
➢ Graphic representations called flowcharts are another possibility, but they work well only if the algorithm
is small and simple.

Recursive Algorithm:
➢ A recursive function is a function that is defined in terms of itself.
➢ Similarly, an algorithm is said to be recursive if the same algorithm is invoked in the body.
➢ An algorithm that calls itself is direct recursive.
➢ Algorithm A is said to be indirect recursive if it calls another algorithm which in turn calls A.
➢ Example: Towers of Hanoi.

Towers of Hanoi:
➢ The disks were of decreasing size and were stacked on the tower in decreasing order of size bottom to
top.
➢ Objective: Move the disks from tower X to tower Y using tower Z for intermediate storage.
➢ Rules:
1. As the disks are very heavy, they can be moved only one at a time.
2. No disk is on top of a smaller disk.

Towers of Hanoi- Recursive Algorithm
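The recursive procedure described above can be sketched in Python; the function and parameter names (`hanoi`, `x`, `y`, `z`) are illustrative, not taken from the original algorithm figure. For n = 3 it produces exactly the seven moves listed in the worked example.

```python
def hanoi(n, x, y, z, moves):
    """Move n disks from tower x to tower y, using tower z for intermediate storage."""
    if n >= 1:
        hanoi(n - 1, x, z, y, moves)   # move the top n-1 disks out of the way
        moves.append((x, y))           # move the largest remaining disk directly
        hanoi(n - 1, z, y, x, moves)   # stack the n-1 disks back on top of it

moves = []
hanoi(3, 'X', 'Y', 'Z', moves)
print(len(moves))  # 2^3 - 1 = 7 moves
print(moves)
```

Each call spawns two recursive calls on n-1 disks plus one move, giving the recurrence T(n) = 2T(n-1) + 1 = 2^n - 1 moves.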

Towers of Hanoi- Algorithm Analysis & Recursive Calls

Towers of Hanoi- Solution For 3 Disks:


➢ For n=3 disks we have totally seven moves:
1. X-> Y
2. X-> Z
3. Y-> Z
4. X-> Y
5. Z-> X
6. Z->Y
7. X->Y
➢ For n disks, Total Number of Moves = 2^n - 1
➢ For 3 disks, Total Number of Moves = 2^3 - 1 = 7 moves
➢ For 4 disks, Total Number of Moves = 2^4 - 1 = 15 moves

Figure: Pole1 is tower X Pole2 is tower Y & Pole3 is tower Z

Performance Analysis:
➢ The space complexity of an algorithm is the amount of memory it needs to run to completion.
➢ The time complexity of an algorithm is the amount of computer time it needs to run to completion.

Space Complexity:

Example1: Finding result of given expression with fixed values a, b, c

S_P = 0 since there are no instance characteristics, so S(P) = c, a fixed constant.

Time Complexity:

➢ The time T(P) taken by a program P is the sum of the compile time and the run (or execution) time.
➢ The compile time does not depend on the instance characteristics.
➢ Time Complexity can be calculated with two methods:
1. Recurrence Relations.
2. Step count Method
➢ We may assume that a compiled program will be run several times without recompilation.

Example- Recurrence Relations

Example- Step Count Method:

Asymptotic Notations:
➢ Asymptotic analysis of algorithms is used to compare relative performance.
➢ Time complexity of an algorithm concerns determining an expression of the number of primitive
operations needed as a function of the problem size.
➢ Asymptotic analysis makes use of the following:
1. Big Oh Notation.
2. Big Omega Notation.
3. Big Theta Notation.
4. Little Oh Notation.
5. Little Omega Notation.
Big-Oh Notation:
➢ The big-Oh notation gives an upper bound on the growth rate of a function.
➢ Definition:

Example:
Let us consider f(n) = 3n+2, g(n) = n, c = 4; the big-Oh relation is f(n) <= c*g(n)
3n+2 <= 4n for n >= 2 → it satisfies the relation
n = 3 → 3·3+2 <= 4·3 → 11 <= 12 → True
n = 4 → 3·4+2 <= 4·4 → 14 <= 16 → True

Big-Omega Notation:
➢ The big-Omega notation gives a lower bound on the growth rate of a function.
➢ Definition:

Example:
Let us consider f(n) = 3n+2, g(n) = n, c = 3; the big-Omega relation is f(n) >= c*g(n)
3n+2 >= 3n for n >= 1 → it satisfies the relation
n = 2 → 3·2+2 >= 3·2 → 8 >= 6 → True
n = 3 → 3·3+2 >= 3·3 → 11 >= 9 → True

Big-Theta Notation:
➢ Definition:

*************************

DESIGN AND ANALYSIS OF ALGORITHM

Unit 1.2 Topics: General method- Binary Search- Finding the maximum and minimum- Merge sort- Quick Sort-
Selection- Strassen's matrix multiplication

Divide-And-Conquer- General Method:


 The Divide-and-Conquer strategy breaks (divides) the given problem into sub-problems, solves each
sub-problem independently, and finally combines (conquers) all the sub-problem solutions into the solution of the whole problem.
 The Divide-and-Conquer strategy suggests splitting the n inputs into k distinct subsets, 1 < k <= n, yielding k sub-
problems.
 If the sub-problems are still relatively large, then the divide-and-conquer strategy can possibly be
reapplied.
 Often the sub-problems resulting from a divide-and conquer design are of the same type as the original
problem.

Computing Time:

Binary Search:
 A binary search algorithm is a technique for finding a particular value in a sorted list.
 Divide-and-conquer can be used to solve this problem.
 Any given problem P gets reduced to one new sub-problem of roughly half the size. This division takes only O(1) time.
 After a comparison, the remaining instance is solved by applying this divide-and-conquer scheme again.
 If the element is found in the list  successful search
 If the element is not found in the list  unsuccessful search
 The Time Complexity of Binary Search is O(log n).

Recursive Binary Search- Algorithm

Iterative Binary Search- Algorithm
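The iterative version described above can be sketched in Python (the original algorithm figures are not reproduced here; names are illustrative):

```python
def binary_search(a, x):
    """Iterative binary search on a sorted list a; returns the index of x or -1."""
    low, high = 0, len(a) - 1
    while low <= high:
        mid = (low + high) // 2
        if a[mid] == x:
            return mid            # successful search
        elif a[mid] < x:
            low = mid + 1         # discard the left half
        else:
            high = mid - 1        # discard the right half
    return -1                     # unsuccessful search

print(binary_search([3, 8, 12, 17, 25, 31], 17))  # 3
```

Each comparison halves the remaining instance, which is where the O(log n) bound comes from.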

Binary Search Time Complexity

Finding the Maximum And Minimum:
 The divide-and-conquer technique can be used to find the maximum and minimum items in a set of n elements in
the given list.

Finding the Maximum And Minimum- Divide & Conquer:
 If the list contains only one element, then
Maximum=Minimum= a[1] element only.
 If the list contains two elements, then compare these two elements & find maximum and minimum.
 If the list contains more than two elements, then divide the given list into sub lists based on middle
value.
 Recursively perform this process until minimum & maximum value is found in the given list.
 Time Complexity = O(n)

Algorithm:
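The three cases above (one element, two elements, recursive split) can be sketched in Python; indices follow 0-based Python convention rather than the a[1:n] notation of the notes:

```python
def max_min(a, i, j):
    """Return (minimum, maximum) of a[i..j] by divide and conquer."""
    if i == j:                              # one element
        return a[i], a[i]
    if j == i + 1:                          # two elements: a single comparison
        return (a[i], a[j]) if a[i] < a[j] else (a[j], a[i])
    mid = (i + j) // 2                      # split on the middle value
    lmin, lmax = max_min(a, i, mid)
    rmin, rmax = max_min(a, mid + 1, j)
    return min(lmin, rmin), max(lmax, rmax)

print(max_min([22, 13, -5, -8, 15, 60, 17, 31, 47], 0, 8))  # (-8, 60)
```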

Merge sort:
 In Merge Sort, the elements are sorted into non-decreasing order.
 Given a sequence of n elements (also called keys) a[1], ..., a[n], the general idea is to imagine them split into
two sets a[1], ..., a[n/2] and a[n/2 + 1], ..., a[n].
 Each set is individually sorted, and the resulting sorted sequences are merged to produce a single sorted
sequence of n elements.

Merge sort Algorithm:
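The split-sort-merge idea can be sketched in Python (a functional version for clarity; the original figure uses an in-place array formulation):

```python
def merge_sort(a):
    """Sort a into non-decreasing order by splitting, sorting halves, and merging."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # merge the two sorted halves into a single sorted sequence
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([310, 285, 179, 652, 351, 423, 861, 254, 450, 520]))
```

The recurrence T(n) = 2T(n/2) + O(n) gives the O(n log n) time complexity derived later in this section.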

Merge- Algorithm:

Merge sort- Example

Merge sort- Time Complexity

Quick Sort:
 In quick sort, the division into two sub arrays is made so that the sorted sub arrays do not need to be
merged later.
 Three steps are involved:
1. Partitioning the given array into two sub-arrays around a pivot element.
2. Interchanging out-of-place elements during partitioning.
3. Recursively sorting each sub-array.
Average-case Time Complexity = O(n log n); worst case O(n^2).

Quick Sort- Partitioning & Interchanging Algorithm:

Quick Sort Algorithm:
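The partition-and-recurse scheme can be sketched in Python using a Hoare-style partition with the first element as pivot (a common textbook choice; the exact scheme in the original figure may differ):

```python
def partition(a, low, high):
    """Partition a[low..high] around pivot a[low]; return the pivot's final index."""
    pivot = a[low]
    i, j = low + 1, high
    while True:
        while i <= high and a[i] <= pivot:
            i += 1                     # scan right for an element > pivot
        while a[j] > pivot:
            j -= 1                     # scan left for an element <= pivot
        if i >= j:
            break
        a[i], a[j] = a[j], a[i]        # interchange the out-of-place pair
    a[low], a[j] = a[j], a[low]        # place the pivot in its final position
    return j

def quick_sort(a, low, high):
    """In-place quick sort of a[low..high]; no merge step is needed."""
    if low < high:
        p = partition(a, low, high)
        quick_sort(a, low, p - 1)
        quick_sort(a, p + 1, high)

nums = [65, 70, 75, 80, 85, 60, 55, 50, 45]
quick_sort(nums, 0, len(nums) - 1)
print(nums)
```

Because the pivot lands in its final sorted position, the two sub-arrays never need to be merged, which is the point made above.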

Quick sort- Example:

Quick sort- Time Complexity:

Selection:
 Selection is used to find the kth-smallest element and place it in the kth position of the given array of n
elements, say a[1:n].
 j is the position in which the partitioning element lands.
 Here we partition the given array into three parts (where k denotes the rank sought):
1. A sub-array containing elements 1 to j-1.
2. If k = j, the element is found in the jth position.
3. A sub-array containing the remaining (n-j) elements.
 This partitioning process continues until we find the kth-smallest element.
 Time Complexity = O(n^2) in the worst case (O(n) on average).

Selection- Partitioning & Interchanging Algorithm:

Selection- Algorithm
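The selection procedure reuses the quick-sort partition but recurses into only one side; a Python sketch (iterative, with the same Hoare-style partition assumed above):

```python
def select(a, k):
    """Return the kth-smallest element of a (k = 1..n) using partitioning."""
    low, high = 0, len(a) - 1
    while True:
        pivot = a[low]
        i, j = low + 1, high
        while True:                       # Hoare-style partition of a[low..high]
            while i <= high and a[i] <= pivot:
                i += 1
            while a[j] > pivot:
                j -= 1
            if i >= j:
                break
            a[i], a[j] = a[j], a[i]
        a[low], a[j] = a[j], a[low]       # pivot now sits at index j
        if j == k - 1:
            return a[j]                   # kth-smallest found at position k
        elif j > k - 1:
            high = j - 1                  # answer lies in the left sub-array
        else:
            low = j + 1                   # answer lies in the right sub-array

print(select([65, 70, 75, 80, 85, 60, 55, 50, 45], 5))  # 65
```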

Selection- Example

Strassen's matrix multiplication:

Strassen's matrix multiplication- Formula:
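Strassen's seven products and the four result quadrants can be written out directly; the following Python sketch works for 2×2 matrices of numbers (with matrix blocks in place of numbers, the same formulas apply recursively):

```python
def strassen_2x2(A, B):
    """Multiply 2x2 matrices with Strassen's seven multiplications."""
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B
    p = (a11 + a22) * (b11 + b22)
    q = (a21 + a22) * b11
    r = a11 * (b12 - b22)
    s = a22 * (b21 - b11)
    t = (a11 + a12) * b22
    u = (a21 - a11) * (b11 + b12)
    v = (a12 - a22) * (b21 + b22)
    return [[p + s - t + v, r + t],       # C11, C12
            [q + s, p + r - q + u]]       # C21, C22

print(strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```

Seven multiplications instead of eight per level gives the recurrence T(n) = 7T(n/2) + O(n^2), i.e. O(n^log2 7) ≈ O(n^2.81).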

Strassen's matrix multiplication- Example:

Strassen's matrix multiplication- Time Complexity:

*****************

DESIGN AND ANALYSIS OF ALGORITHM – GREEDY METHOD

Unit 2.1 Topics: General method- Knapsack Problem- Job scheduling with deadlines- Minimum cost spanning
trees- Optimal storage on tapes- Single source shortest path

Greedy Method-General Method:


 It is a straightforward design technique that can be applied to a wide variety of problems.
 Most of these problems have n inputs and require us to obtain a subset that satisfies some constraints.
 Any subset that satisfies those constraints is called a feasible solution.
 We need to find a feasible solution that either maximizes or minimizes a given objective function. A Feasible
solution that does this is called an optimal solution.
 The greedy method suggests that one can devise an algorithm that works in stages, considering one input at a time.
 A greedy technique in which the solution is built by selecting a subset of the inputs one at a time is called the subset paradigm.

Greedy Algorithm:

 The function Select selects an input from a[ ] and removes it.


 The selected input's value is assigned to x.
 Feasible is a Boolean-valued function that determines whether x can be included into the solution vector.
 The function Union combines x with the solution and updates the objective function.
 Once a particular problem is chosen and the functions Select, Feasible and Union are properly implemented, the greedy algorithm above can be used to solve it.

Knapsack Problem:

 The greedy method is applied to solve the knapsack problem.


 We are given n objects and a knapsack or bag.
 Object i has a weight wi and the knapsack has a capacity m.
 If a fraction xi, 0 <= xi <= 1, of object i is placed into the knapsack, then a profit of pi * xi is earned.
 The objective is to obtain a filling of the knapsack that maximizes the total profit earned.
 Since the knapsack capacity is m, we require the total weight of all chosen objects to be at most m.

Formally the problem can be stated as:

Knapsack Problem-Example:

Types of Knapsack Problem:

Fractional Knapsack- Example Problem:

Item A B C D
Profit 280 100 120 120
Weight 40 10 20 24
Pi/ Wi 7 10 6 5
Arranging the above table in descending order of Pi/Wi:

Item B A C D
Profit 100(P1) 280(P2) 120(P3) 120(P4)
Weight 10 (W1) 40(W2) 20(W3) 24(W4)
Pi/ Wi 10 7 6 5
Consider the knapsack capacity W=60
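The greedy solution to this instance can be checked with a short Python sketch: take whole items in descending Pi/Wi order, then a fraction of the first item that no longer fits:

```python
# Fractional knapsack on the example instance: (name, profit, weight)
items = [('A', 280, 40), ('B', 100, 10), ('C', 120, 20), ('D', 120, 24)]
capacity = 60

# sort by descending profit/weight ratio: B (10), A (7), C (6), D (5)
items.sort(key=lambda t: t[1] / t[2], reverse=True)

profit, remaining = 0.0, capacity
for name, p, w in items:
    if w <= remaining:                # the whole item fits
        profit += p
        remaining -= w
    else:                             # take only the fraction that fits
        profit += p * remaining / w
        remaining = 0
        break

print(profit)  # 100 + 280 + (10/20)*120 = 440.0
```

B and A are taken whole (weight 50), then half of C fills the remaining capacity of 10, for a total profit of 440.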

Job Scheduling With Deadlines:


 Greedy Method is applied for this problem.
 Initially we are given a set of n jobs.
 Associated with job i is an integer deadline di >=0 and a profit pi >0.
 For any job i the profit pi is earned iff the job is completed by its deadline.

Conditions:
 To complete a job, one has to process the job on a machine for one unit of time.
 Only one machine is available for processing jobs.
 A feasible solution for this problem is a subset J of jobs such that each job in this subset can be completed by its
deadline.
 An optimal solution is a feasible solution with maximum value.

Algorithm for Job Scheduling:

Algorithm for Job Scheduling with Deadlines:

Example1:
Let number of jobs n = 4
(p1, p2, p3, p4) = (100, 10, 15, 27)    deadline = 1 → the job must be done on day one
(d1, d2, d3, d4) = (2, 1, 2, 1)         deadline = 2 → the job may be done on day one or day two
Here the maximum deadline = 2, so at most two jobs can be completed (one per unit of time); no parallel execution of jobs is possible.
Feasible Solution | Processing Sequence | Value | Explanation
(1, 2)            | 2, 1                | 110   | 2's deadline < 1's deadline
(1, 3)            | 1, 3 or 3, 1        | 115   | 1's deadline = 3's deadline
(1, 4)            | 4, 1                | 127   | 4's deadline < 1's deadline (Maximum Profit)
(2, 3)            | 2, 3                | 25    | 2's deadline < 3's deadline
(2, 4)            | impossible          |       | both have deadline = 1 (would require parallel execution)
(3, 4)            | 4, 3                | 42    | 4's deadline < 3's deadline
(1)               | 1                   | 100   |
(2)               | 2                   | 10    |
(3)               | 3                   | 15    |
(4)               | 4                   | 27    |
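The greedy strategy behind the table (consider jobs in decreasing profit order, schedule each in the latest free slot on or before its deadline) can be sketched in Python; names are illustrative:

```python
def job_scheduling(profits, deadlines):
    """Greedy job sequencing with deadlines; returns the maximum total profit."""
    jobs = sorted(zip(profits, deadlines), reverse=True)  # decreasing profit
    max_d = max(deadlines)
    slot = [None] * (max_d + 1)        # slot[t] holds the job done at time t
    total = 0
    for p, d in jobs:
        for t in range(d, 0, -1):      # latest free slot on or before deadline d
            if slot[t] is None:
                slot[t] = p
                total += p
                break                  # job scheduled; otherwise it is dropped
    return total

print(job_scheduling([100, 10, 15, 27], [2, 1, 2, 1]))  # 127
```

On the example instance this picks jobs 1 and 4 (processing sequence 4, 1) for the maximum profit 127, matching the table.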
Example2:

Minimum Cost Spanning Trees:


 A Spanning Tree of a graph is a tree that contains all the vertices of the graph, connected by some of its edges.
 A graph can have one or more spanning trees.
 If a graph contains n vertices, then each spanning tree contains (n-1) edges.
 A Minimum Cost Spanning Tree (MST) is a spanning tree whose total weight is minimum among all spanning trees of
the graph.
 A Spanning Tree does not contain cycles.
 Two Algorithms are used to find Minimum Cost Spanning Tree:
1. Prim’s Algorithm.
2. Kruskal Algorithm.

1. Prim’s Algorithm:
In this algorithm we choose a neighboring or adjacent vertex for finding minimum cost spanning tree.
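A Python sketch of Prim's idea (grow the tree one cheapest adjacent vertex at a time); the sample graph is a hypothetical 4-vertex instance, not the figure from the notes:

```python
import heapq

def prim(n, adj, start=0):
    """Prim's MST on vertices 0..n-1; adj maps u -> [(weight, v), ...]."""
    in_tree = [False] * n
    cost, added = 0, 0
    pq = [(0, start)]                     # (edge weight into the tree, vertex)
    while pq and added < n:
        w, u = heapq.heappop(pq)
        if in_tree[u]:
            continue                      # already connected by a cheaper edge
        in_tree[u] = True
        cost += w
        added += 1
        for wv, v in adj.get(u, []):
            if not in_tree[v]:
                heapq.heappush(pq, (wv, v))
    return cost

# hypothetical graph: edges 0-1:1, 0-2:4, 1-2:3, 1-3:2, 2-3:5
adj = {0: [(1, 1), (4, 2)], 1: [(1, 0), (3, 2), (2, 3)],
       2: [(4, 0), (3, 1), (5, 3)], 3: [(2, 1), (5, 2)]}
print(prim(4, adj))  # 6
```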

Example:

2. Kruskal Algorithm:
In this algorithm we list the edges in ascending order of cost and add them one by one to the spanning tree,
skipping any edge that would form a cycle.

Example:

Optimal storage on tapes:
 There are n programs to be stored on a tape; the goal is to find the ordering of the programs that minimizes the Mean Retrieval Time (MRT).

Example:
Let number of inputs=3
(l1, l2, l3)= (5, 10, 3)
For n programs we have n! possible orderings.
For 3 programs we have 3! = 6 possible orderings.
Ordering | d(I)                   | MRT
1, 2, 3  | 5 + (5+10) + (5+10+3)  | 38
1, 3, 2  | 5 + (5+3) + (5+3+10)   | 31
2, 1, 3  | 10 + (10+5) + (10+5+3) | 43
2, 3, 1  | 10 + (10+3) + (10+3+5) | 41
3, 1, 2  | 3 + (3+5) + (3+5+10)   | 29
3, 2, 1  | 3 + (3+10) + (3+10+5)  | 34
The optimal ordering is 3, 1, 2, which has the minimum MRT value of 29.
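The greedy rule (store programs in non-decreasing order of length) can be checked against the example in a few lines of Python:

```python
# Optimal storage on tapes: greedy by non-decreasing program length.
lengths = [5, 10, 3]            # (l1, l2, l3) from the example

order = sorted(range(len(lengths)), key=lambda i: lengths[i])
total, prefix = 0, 0
for i in order:
    prefix += lengths[i]        # retrieval time of program i (sum of lengths so far)
    total += prefix             # accumulate d(I), the sum of retrieval times

print([i + 1 for i in order], total)  # [3, 1, 2] 29
```

This reproduces the optimal ordering 3, 1, 2 with MRT numerator 29 from the table.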
Single-Source Shortest Paths:
 Graphs can be used to represent the highway structure of a state or country with vertices representing cities and
edges representing sections of highway.
 The edges can then be assigned weights which may be either the distance between the two cities connected by the
edge or the average time to drive along that section of highway.

Single-Source Shortest Paths- Algorithm:
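The standard greedy algorithm for this problem is Dijkstra's; a Python sketch with a priority queue (the graph below is a hypothetical instance, not the figure from the notes):

```python
import heapq

def dijkstra(graph, src):
    """Single-source shortest paths on a weighted digraph.
    graph maps u -> [(v, w), ...]; returns shortest distances from src."""
    dist = {src: 0}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float('inf')):
            continue                      # stale queue entry, skip it
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd              # found a shorter path to v
                heapq.heappush(pq, (nd, v))
    return dist

g = {1: [(2, 4), (3, 1)], 3: [(2, 2), (4, 7)], 2: [(4, 3)]}
print(dijkstra(g, 1))  # {1: 0, 2: 3, 3: 1, 4: 6}
```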

Example1:

For the example, the shortest path from the source (vertex 1) to the destination (vertex 7) has value 42.

Example2:

*****************

DESIGN AND ANALYSIS OF ALGORITHM – DYNAMIC PROGRAMMING

Unit 2.2 Topics: General Method, Multistage graphs, All-pairs shortest paths, Optimal binary search trees, 0/1
knapsack, The traveling sales person problem.

Dynamic Programming- General Method:


 Dynamic programming is an algorithm design method that can be used when the solution to a problem
can be viewed as the result of a sequence of decisions.
 Both Greedy Method & Dynamic Programming solve a problem by breaking it down into several sub-
problems that can be solved recursively.
 Dynamic programming is a bottom-up technique that usually begins by solving the smaller sub-problems,
saving these results, and then reusing them to solve larger sub-problems until the solution to the original
problem is obtained.
 The divide-and-conquer approach, in contrast, solves problems in a top-down manner.

Example:

 In Dynamic Programming, an optimal sequence of decisions is obtained by making explicit appeal to The
Principle of Optimality.
 Principle of Optimality states that an optimal sequence of decisions has the property that whatever the
initial state and decisions are, the remaining decisions must constitute an optimal decision sequence with
regard to the state resulting from the first decision.
 Steps in Dynamic Programming:
1. Characterize the structure of an optimal solution.
2. Recursively define the value of an optimal solution.
3. Compute the value of an optimal solution, typically bottom-up.
4. Construct an optimal solution from the computed information.

Difference between Greedy Method & Dynamic Programming:

Multistage graphs:

 A multistage graph is a directed graph in which the vertices are partitioned into k >= 2 disjoint stages, with all edges going from one stage to the next.
 Let s be the source vertex and t be the sink (destination) vertex.
 Let c(i, j) be the cost of edge (i, j).
 The cost of a path from s to t is the sum of the costs of the edges on the path.
 The multistage graph problem is to find a minimum-cost path from s to t.
 Two approaches in multi stage graph:
1. Forward approach.
2. Backward approach.

Multistage graphs- Forward Approach:

Multistage graphs- Forward Approach- Example:

Multistage graphs- Backward Approach:

Multistage graphs- Backward Approach-Example:

All-pairs shortest paths:

All-pairs shortest paths-Algorithm:
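The usual dynamic-programming algorithm here is Floyd–Warshall, built on the recurrence A^k[i][j] = min(A^(k-1)[i][j], A^(k-1)[i][k] + A^(k-1)[k][j]). A Python sketch, run on a small sample cost matrix (assumed for illustration):

```python
INF = float('inf')

def all_pairs_shortest(cost):
    """Floyd-Warshall all-pairs shortest paths from a cost matrix."""
    n = len(cost)
    A = [row[:] for row in cost]           # A^0 is the edge-cost matrix
    for k in range(n):                     # allow vertex k as an intermediate
        for i in range(n):
            for j in range(n):
                if A[i][k] + A[k][j] < A[i][j]:
                    A[i][j] = A[i][k] + A[k][j]
    return A

cost = [[0, 4, 11],
        [6, 0, 2],
        [3, INF, 0]]
print(all_pairs_shortest(cost))  # [[0, 4, 6], [5, 0, 2], [3, 7, 0]]
```

The three nested loops give the O(n^3) running time.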

Example:

Optimal binary search trees:
A binary search tree is a tree in which the key values are ordered so that all the keys in the left sub-tree are less
than the key in the node, and all the keys in the right sub-tree are greater. Clearly
LST <= ROOT <= RST
LST = Left Sub-Tree & RST = Right Sub-Tree

Optimal binary search trees- Algorithm:

0/1 knapsack:
 Knapsack is a bag which has a capacity M.
 In fractional knapsack, we place the items one by one by considering Pi/ Wi.
 In 0/1 Knapsack, we cannot consider any fractions.
 In Dynamic Programming, we consider knapsack problem for placing maximum elements with
maximum profit & weight that does not exceed knapsack capacity.
 0 means we cannot consider that item.
 1 means we consider that item to be placed in the knapsack.
 The solution for Dynamic Programming is:

0/1 knapsack- Algorithm:
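The bottom-up table-filling idea can be sketched in Python; the sample instance is assumed for illustration and may differ from the one worked in the notes:

```python
def knapsack_01(profits, weights, m):
    """Bottom-up DP: f[i][cap] = best profit using the first i items, capacity cap."""
    n = len(profits)
    f = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        p, w = profits[i - 1], weights[i - 1]
        for cap in range(m + 1):
            f[i][cap] = f[i - 1][cap]                    # xi = 0: skip item i
            if w <= cap:
                f[i][cap] = max(f[i][cap],
                                f[i - 1][cap - w] + p)   # xi = 1: take item i
    return f[n][m]

print(knapsack_01([1, 2, 5, 6], [2, 3, 4, 5], 8))  # 8
```

With profits (1, 2, 5, 6), weights (2, 3, 4, 5) and m = 8, the best choice is items 2 and 4 (weight 8, profit 8); no fraction of an item is ever taken.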

0/1 knapsack- Example:

The traveling sales person problem:
 Let G=(V,E) be a directed graph with edge costs Cij.
 The variable Cij is defined such that
 Cij > 0 for all i and j.
 Cij = ∞ if (i, j) ∉ E.
 Let |V|= n and assume that n>1.
 A tour of G is a directed simple cycle that includes every vertex in V.
 The cost of a tour is the sum of the cost of the edges on the tour.
 The traveling sales person problem is to find a tour of minimum cost.
 Notations:
 g(i, s) = length of the shortest path starting at vertex i, going through all vertices in s and terminating at
vertex 1.
 g(1, V-{1}) = length of an optimal sales person tour.
 The Principle of Optimality gives the recurrence g(i, S) = min over j in S of { c(i, j) + g(j, S - {j}) }, with g(i, Ø) = c(i, 1).

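The recurrence g(i, S) = min over j in S of { c(i, j) + g(j, S − {j}) } can be evaluated bottom-up over subsets (the Held–Karp scheme). A Python sketch; the 4-vertex cost matrix is an assumed instance (0-indexed, tour starting and ending at vertex 0):

```python
from itertools import combinations

def tsp(c):
    """Dynamic-programming TSP: minimum tour cost starting and ending at vertex 0."""
    n = len(c)
    # base case: g(i, empty set) = cost of going straight back to vertex 0
    g = {(i, frozenset()): c[i][0] for i in range(1, n)}
    for size in range(1, n - 1):
        for S in combinations(range(1, n), size):
            fs = frozenset(S)
            for i in range(1, n):
                if i in fs:
                    continue
                # g(i, S) = min over j in S of c[i][j] + g(j, S - {j})
                g[(i, fs)] = min(c[i][j] + g[(j, fs - {j})] for j in S)
    rest = frozenset(range(1, n))
    return min(c[0][j] + g[(j, rest - {j})] for j in range(1, n))

c = [[0, 10, 15, 20],
     [5, 0, 9, 10],
     [6, 13, 0, 12],
     [8, 8, 9, 0]]
print(tsp(c))  # 35
```

This evaluates each g(i, S) once, giving O(n^2 * 2^n) time, far better than checking all (n-1)! tours.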
Example:

*******************

DESIGN AND ANALYSIS OF ALGORITHM

Unit 3.1 Topics: Basic Traversal And Search Techniques- Techniques for Binary Trees- Techniques for Graphs-
Connected Components and Spanning Trees- Bi-connected components and DFS

3.1.1 Techniques for Binary Trees:


 In Normal Tree, any node can have any number of children.
 Binary Tree is a special tree in which every node can have a maximum of two children.
 In Binary Tree, each node can have 0 or 1 or 2 children but not more than 2 children.
 Displaying or Visiting order of nodes in binary trees is called Binary Tree Traversal.
 3 Types of Binary Tree Traversals:
1. In-order (LVR) Traversal- order: left child, root node, right child.
2. Pre-order (VLR) Traversal -order: root node, left child, right child.
3. Post-order (LRV) Traversal-order: left child, right child, root node.

Algorithm for In-order traversal

Algorithm for Pre-order traversal

Algorithm for Post-order traversal
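The three traversal orders (LVR, VLR, LRV) can be sketched together in Python; the `Node` class and the tiny three-node tree are illustrative:

```python
class Node:
    def __init__(self, data, left=None, right=None):
        self.data, self.left, self.right = data, left, right

def inorder(t, out):                 # LVR: left child, root, right child
    if t:
        inorder(t.left, out); out.append(t.data); inorder(t.right, out)

def preorder(t, out):                # VLR: root, left child, right child
    if t:
        out.append(t.data); preorder(t.left, out); preorder(t.right, out)

def postorder(t, out):               # LRV: left child, right child, root
    if t:
        postorder(t.left, out); postorder(t.right, out); out.append(t.data)

#        A
#       / \
#      B   C
root = Node('A', Node('B'), Node('C'))
for traverse in (inorder, preorder, postorder):
    out = []
    traverse(root, out)
    print(out)
# ['B', 'A', 'C'], ['A', 'B', 'C'], ['B', 'C', 'A']
```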

Example for Binary Tree Traversal

3.1.2 Techniques for Graphs


 Given a graph G = (V, E) and a starting vertex v, a graph traversal visits the vertices of G that are reachable from v.
 We describe two search methods for this:
1. Breadth First Search and Traversal.
2. Depth First Search and Traversal.

Breadth First Search and Traversal (level by level traversing)
 In breadth first search we start at a vertex v and mark it as having been reached (visited).
 The vertex v is at this time said to be unexplored.
 A vertex is said to have been explored by an algorithm when the algorithm has visited all vertices adjacent from it.
 The newly visited vertices haven't been explored and are put onto the end of a list of unexplored vertices.
 The first vertex on this list is the next to be explored.
 Exploration continues until no unexplored vertex is left. The list of unexplored vertices operates as a queue and can
be represented using any of the standard queue representation.

Algorithm for BFS

Algorithm for Breadth first graph traversal
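The queue-driven scheme above can be sketched in Python; the adjacency lists are a hypothetical 8-vertex graph chosen so the visiting order matches the example below:

```python
from collections import deque

def bfs(graph, v):
    """Breadth-first traversal from v; unexplored vertices wait in a queue."""
    visited = [v]                     # v is reached (visited) first
    q = deque([v])
    while q:                          # explore vertices in the order reached
        u = q.popleft()
        for w in graph.get(u, []):
            if w not in visited:      # newly reached vertex: queue it
                visited.append(w)
                q.append(w)
    return visited

g = {1: [2, 3], 2: [4, 5], 3: [6, 7], 4: [8], 5: [8], 6: [8], 7: [8]}
print(bfs(g, 1))  # [1, 2, 3, 4, 5, 6, 7, 8]
```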

Example for BFS

BFS Order: 1, 2, 3, 4, 5, 6, 7, 8

Applying BFS

Depth First Search and Traversal


 A depth first search of a graph differs from a breadth first search in that the exploration of a vertex v is suspended
as soon as a new vertex is reached.
 At this time the exploration of the new vertex u begins.
 When this new vertex has been explored, the exploration of v continues. The search terminates when all reached
vertices have been fully explored.

Algorithm for DFS
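Depth-first search can be sketched recursively in Python; the adjacency lists are a hypothetical 8-vertex graph chosen so the visiting order matches the example below:

```python
def dfs(graph, v, visited=None):
    """Depth-first traversal: suspend exploring v as soon as a new vertex is
    reached, and resume only after that vertex is fully explored."""
    if visited is None:
        visited = []
    visited.append(v)
    for w in graph.get(v, []):
        if w not in visited:
            dfs(graph, w, visited)     # recurse before trying v's next neighbor
    return visited

g = {1: [2, 3], 2: [4, 5], 3: [6, 7], 4: [8], 5: [8], 6: [8], 7: [8],
     8: [5, 6]}
print(dfs(g, 1))  # [1, 2, 4, 8, 5, 6, 3, 7]
```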

Example for DFS

DFS Order: 1, 2, 4, 8, 5, 6, 3, 7

Applying DFS

3.1.3 Connected Components and Spanning Trees

 A graph is said to be connected if at least one path exists between every pair of vertices in the graph.
 Two vertices belong to the same connected component if there exists a path between them.
 A directed graph is said to be strongly connected if every two nodes in the graph are reachable from each other.

Spanning Trees
 A spanning tree of an undirected graph of n nodes is a set of n-1 edges that connects all nodes.
 A graph may have many spanning trees.
 For finding minimum cost spanning trees we have two algorithms
1. Prim’s Algorithm
2. Kruskal Algorithm

3.1.4 Bi-connected components and DFS


 A vertex v in a connected graph G is an articulation point if and only if the deletion of vertex v together with all
edges incident to v disconnects the graph into two or more non empty components.
 A graph G is bi-connected if and only if it contains no articulation points.

Node 2 is articulation point of Graph G

Bi Connected Component

Example:

DFS
Depth First Spanning Trees are used to identify articulation points and bi connected components.

**************

DESIGN AND ANALYSIS OF ALGORITHM

Unit 3.2 Topics: Backtracking- General Method- 8 Queens Problem- Sum of subset problem- Graph coloring-
Hamiltonian Cycles- Knapsack Problem

3.2.1 Backtracking- General Method:


 The name Backtrack was first coined by D. H. Lehmer in the 1950s.
 It is a method of determining the correct solution to a problem by examining all the available paths.
 If a particular path leads to an unsuccessful solution, then its previous partial solution is re-examined in order
to find the correct solution.
 In many applications of the backtrack method, the desired solution is expressible as an n-tuple (x1, x2, ..., xn) where
xi is chosen from some finite set Si. Often the problem to be solved calls for finding one vector that maximizes or
minimizes or satisfies a criterion function P(x1, x2, ..., xn).
 In Brute force algorithm, we consider all feasible solutions for finding optimal solution.
 The backtracking algorithm has the ability to yield the same answer with far fewer than m trials.
 Many of the problems we solve using backtracking require that all solutions satisfy a complex set of
constraints.
 Two types of Constraints
1. Explicit Constraints are the rules that restrict each xi to take on values only from a given set.
E.g.: xi >= 0, i.e. Si = {all non-negative real numbers};
xi = 0 or 1, i.e. Si = {0, 1}
2. Implicit Constraints are the rules that determine which of the tuples in the solution space of I satisfy the
criterion function. Thus implicit constraints describe the way in which the xi must relate to each other.

Some Important Definitions:

Recursive Backtracking Algorithm:

Iterative Backtracking Algorithm:

Applications of Backtracking:
 Backtracking method is applied to solve various problems like:
1. N Queens Problem
2. Sum of Subsets Problem
3. Graph Coloring
4. Hamiltonian Cycles
5. Knapsack Problem

3.2.2 N Queens Problem (8 Queens Problem)


 N Queens Problems means:
1. Place N Queens placed on N X N chess board.
2. No Two Queens are placed in same row or same column or diagonal.
3. No Two Queens attack to each other.

4-Queens Problem solution:

4-Queens Problem –state space tree:

N-Queens Problem- algorithm1: Placing a new queen in kth row & ith column.

N-Queens Problem- algorithm2: All solutions for N Queens Problem.
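The two algorithms described above (a `place` bounding test plus a recursive placement loop) can be sketched together in Python; x[k] records the column of the queen in row k:

```python
def n_queens(n):
    """Backtracking N-Queens: return all solutions as column vectors x."""
    solutions, x = [], [0] * n

    def place(k, i):
        # queen (k, i) clashes with an earlier queen if it shares a column
        # (x[j] == i) or a diagonal (|x[j] - i| == k - j)
        return all(x[j] != i and abs(x[j] - i) != k - j for j in range(k))

    def solve(k):
        for i in range(n):             # try every column in row k
            if place(k, i):
                x[k] = i
                if k == n - 1:
                    solutions.append(x[:])   # all n queens placed
                else:
                    solve(k + 1)       # backtrack happens on loop exit

    solve(0)
    return solutions

print(len(n_queens(4)), len(n_queens(8)))  # 2 92
```

The 4-queens instance has exactly the 2 solutions shown in the state-space tree above, and the 8-queens instance has 92.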

8-Queens Problem solution:

3.2.3 Sum of subset problem

Sum of Subsets Problem-Algorithm
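A Python sketch of the backtracking scheme with the usual bounding tests (prune when the running sum already overshoots m, or when even taking every remaining weight cannot reach m); the instance below is a classic sample and may differ from the notes' worked example:

```python
def sum_of_subsets(w, m):
    """Backtracking: find all subsets of w (sorted ascending) summing to m."""
    w = sorted(w)
    solutions = []

    def solve(k, s, rem, chosen):
        if s == m:
            solutions.append(chosen[:])            # answer node found
            return
        if k == len(w) or s + w[k] > m or s + rem < m:
            return                                 # bounding: prune this branch
        chosen.append(w[k])
        solve(k + 1, s + w[k], rem - w[k], chosen)  # include w[k]
        chosen.pop()
        solve(k + 1, s, rem - w[k], chosen)         # exclude w[k]

    solve(0, 0, sum(w), [])
    return solutions

print(sum_of_subsets([5, 10, 12, 13, 15, 18], 30))
# [[5, 10, 15], [5, 12, 13], [12, 18]]
```

Because the weights are sorted, `s + w[k] > m` safely cuts off the whole subtree: every later weight is at least as large.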

Sum of Subsets Problem-Example

3.2.4 Graph coloring:
 Let G be a graph and m be a positive integer.
 The problem is to find whether the nodes of G can be colored in such a way that no two adjacent nodes have the
same color while only m colors are used; the smallest such m is called the chromatic number.
 If d is the maximum degree of the given graph G, then it can be colored with d+1 colors.
 Degree means the number of edges connected to a node.

Graph coloring- m coloring algorithm

Graph coloring- state space tree

Graph coloring- generating color algorithm

Graph coloring- another example

3.2.5 Hamiltonian Cycles

Hamiltonian Cycles-Generating a next vertex Algorithm

Hamiltonian Cycles-Finding all Hamiltonian Cycles Algorithm

3.2.6 Knapsack Problem

Knapsack Problem- Bounding Function Algorithm

Backtracking Knapsack Problem- Algorithm

Knapsack Problem- Example

****************

UNIT-4 DESIGN AND ANALYSIS OF ALGORITHM

Branch and Bound: The method, Travelling salesperson, 0/1 Knapsack problem.
Lower Bound Theory: Comparison trees, Lower bounds through reductions – Multiplying triangular matrices,
inverting a lower triangular matrix, computing the transitive closure

4.1 Branch and Bound (B & B)- The method:


 B & B is a general algorithmic method for finding optimal solutions of various problems.
 In B & B, a state space tree is built and all the children of the E-node are generated before any other live node
can become the E-node.
 An E-node is the live node currently being expanded, i.e. whose children are being generated.
 A live node is a node that has been generated but not all of whose children have been generated yet.
 B & B is used only for optimization problem.
 B & B needs two additional values when compared to backtracking.
1. A bound value of objective function for every node of state space tree.
2. Value of best solution is compared to node’s bound.
 If a node's bound is not better than the value of the best solution so far, the node is terminated (pruned).
 Lower bound is for minimization problems.
 Upper bound is for maximization problems.
 In Branching, we define tree structure from set of candidates in a recursive manner.
 In Bounding, we calculate lower bound & upper bound of each node in the tree.
 If a node's lower bound exceeds the current upper bound, the node is discarded from the search  Pruning.
 LC branch and bound is based on BFS performed with a priority queue instead of a traditional list, so the
highest-priority (least-cost) element is always in first position.
 Bounding functions are useful because they avoid generating sub-trees that contain no answer nodes.
 3 types of search strategies:
1. FIFO (First- In- First- Out) Search or BFS.
2. LIFO (Last- In- First- Out) Search or DFS.
3. Least Count (LC) Search.

Difference between Backtracking & Branch and Bound:

4.1.1. FIFO (First- In- First- Out) Search or BFS:

4.1.2. LIFO (Last- In- First- Out) Search or DFS:

4.1.3. Least Count (LC) Search:

4.2 Travelling Sales Person using B & B

TSP Example:

4.3 0/1 Knapsack problem using B & B

4.4 Lower Bound Theory - Comparison trees

4.5 Multiplying triangular matrices

Triangular Matrix Definition:

4.6 Inverting a Lower Triangular Matrix:

4.7 Computing the Transitive Closure

*******************************

UNIT-5 DESIGN AND ANALYSIS OF ALGORITHM (P, NP, NP-COMPLETE, NP-HARD PROBLEMS)

Introduction to Problems:

Types of Algorithms

 Two types of Algorithms:


1. Deterministic Algorithm: It has a property that result of every operation is uniquely defined.
2. Non Deterministic Algorithm: It terminates unsuccessfully if and only if there exists no set of choices leading to a
success signal.
 To specify such algorithms, we introduce 3 functions:

Cook’s Theorem:

Satisfiability problem

3-CNF Satisfiability problem

**********************

