DAA Min
Algorithm:
➢ An algorithm is a finite set of instructions that accomplishes a particular task.
➢ In addition, all algorithms must satisfy the following criteria:
➢ An algorithm is composed of a finite set of steps, each of which may require one or more operations.
➢ Algorithms produce one or more outputs and have zero or more inputs that are externally supplied.
➢ Each operation must be definite, meaning that it must be perfectly clear what should be done.
➢ They terminate after a finite number of operations.
➢ Algorithms that are definite and effective are also called computational procedures.
➢ A program is the expression of an algorithm in a programming language.
Areas in the Study of Algorithms:
1. How to devise algorithms- Creating an algorithm is an art which may never be fully automated.
2. How to validate algorithms- Once an algorithm is devised, it is necessary to show that it computes the
correct answer for all possible legal inputs. We refer to this process as algorithm validation.
3. How to analyze algorithms- Analysis of algorithms or performance analysis refers to the task of determining
how much computing time and storage an algorithm requires.
4. How to test a program – Testing a program consists of two phases: debugging and profiling.
➢ Debugging is the process of executing programs on sample data sets to determine whether faulty results
occur and, if so, to correct them.
➢ Profiling or performance measurement is the process of executing a correct program on data sets and
measuring the time and space it takes to compute the results.
Algorithm specification:
Recursive Algorithm:
➢ A recursive function is a function that is defined in terms of itself.
➢ Similarly, an algorithm is said to be recursive if the same algorithm is invoked in the body.
➢ An algorithm that calls itself is direct recursive.
➢ Algorithm A is said to be indirect recursive if it calls another algorithm which in turn calls A.
➢ Example: Towers of Hanoi.
Towers of Hanoi:
➢ The disks are of decreasing size and are stacked on the tower in decreasing order of size from bottom to
top.
➢ Objective: Move the disks from tower X to tower Y using tower Z for intermediate storage.
➢ Rules:
1. As the disks are very heavy, they can be moved only one at a time.
2. No disk may ever be placed on top of a smaller disk.
Towers of Hanoi- Recursive Algorithm
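The recursive algorithm can be sketched in Python (the function name and move list are illustrative):

```python
def hanoi(n, x, y, z, moves):
    """Move n disks from tower x to tower y, using tower z for intermediate storage."""
    if n >= 1:
        hanoi(n - 1, x, z, y, moves)   # move the top n-1 disks from x to z
        moves.append((x, y))           # move the largest remaining disk from x to y
        hanoi(n - 1, z, y, x, moves)   # move the n-1 disks from z onto y

moves = []
hanoi(3, 'X', 'Y', 'Z', moves)
print(len(moves))  # 7 moves = 2^3 - 1
```

For n disks the recursion makes 2^n - 1 moves, which is also the time complexity O(2^n).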
Figure: Pole 1 is tower X, Pole 2 is tower Y & Pole 3 is tower Z.
Performance Analysis:
➢ The space complexity of an algorithm is the amount of memory it needs to run to completion.
➢ The time complexity of an algorithm is the amount of computer time it needs to run to completion.
Space Complexity:
Time Complexity:
➢ The time T(P) taken by a program P is the sum of the compile time and the run (or execution) time.
➢ The compile time does not depend on the instance characteristics.
➢ Time Complexity can be calculated with two methods:
1. Recurrence Relations.
2. Step count Method
➢ We may assume that a compiled program will be run several times without recompilation.
Example- Step Count Method:
Asymptotic Notations:
➢ Asymptotic analysis of algorithms is used to compare relative performance.
➢ Time complexity of an algorithm concerns determining an expression of the number of primitive
operations needed as a function of the problem size.
➢ Asymptotic analysis makes use of the following:
1. Big Oh Notation.
2. Big Omega Notation.
3. Big Theta Notation.
4. Little Oh Notation.
5. Little Omega Notation.
Big-Oh Notation:
➢ The big-Oh notation gives an upper bound on the growth rate of a function.
➢ Definition:
Example:
Let us consider f(n) = 3n+2, g(n) = n, c = 4; the big-Oh relation is f(n) <= c*g(n).
3n+2 <= 4n for n >= 2 → it satisfies the relation, so f(n) = O(n).
n = 3 → 3·3+2 <= 4·3 → 11 <= 12 → True
n = 4 → 3·4+2 <= 4·4 → 14 <= 16 → True
Big-Omega Notation:
➢ The big-Omega notation gives a lower bound on the growth rate of a function.
➢ Definition:
Example:
Let us consider f(n) = 3n+2, g(n) = n, c = 3; the big-Omega relation is f(n) >= c*g(n).
3n+2 >= 3n for n >= 1 → it satisfies the relation, so f(n) = Ω(n).
n = 2 → 3·2+2 >= 3·2 → 8 >= 6 → True
n = 3 → 3·3+2 >= 3·3 → 11 >= 9 → True
Big-Theta Notation:
➢ Definition:
*************************
DESIGN AND ANALYSIS OF ALGORITHM
Unit 1.2 Topics: General method- Binary Search- Finding the maximum and minimum- Merge sort- Quick Sort-
Selection- Strassen's matrix multiplication
Computing Time:
Binary Search:
A binary search algorithm is a technique for finding a particular value in a sorted list.
Divide-and-conquer can be used to solve this problem.
Each step divides the given problem P into one new sub-problem of roughly half the size; this division takes only O(1) time.
After one comparison, the remaining instance is solved by applying the same divide-and-conquer scheme again.
If the element is found in the list, the search is successful.
If the element is not found in the list, the search is unsuccessful.
The time complexity of binary search is O(log n).
Recursive Binary Search- Algorithm
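A minimal Python sketch of the recursive algorithm (the parameter names are assumptions of this sketch):

```python
def binary_search(a, low, high, x):
    """Recursively search the sorted list a[low..high] for x; return its index or -1."""
    if low > high:
        return -1                          # unsuccessful search
    mid = (low + high) // 2                # divide: O(1) work per level
    if a[mid] == x:
        return mid                         # successful search
    if x < a[mid]:
        return binary_search(a, low, mid - 1, x)   # search the left half
    return binary_search(a, mid + 1, high, x)      # search the right half
```

Each call discards half the remaining range, giving O(log n) comparisons.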
Binary Search Time Complexity
Finding the Maximum And Minimum:
The divide-and-conquer technique can be used to find the maximum and minimum items in a set of n elements in
the given list.
Finding the Maximum And Minimum- Divide & Conquer:
If the list contains only one element, then
Maximum=Minimum= a[1] element only.
If the list contains two elements, then compare these two elements & find maximum and minimum.
If the list contains more than two elements, then divide the given list into two sub-lists around the middle
position.
Recursively perform this process until minimum & maximum value is found in the given list.
Time Complexity = O(n)
Algorithm:
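The three cases above can be sketched as (the function name is an assumption of this sketch):

```python
def max_min(a, i, j):
    """Return (maximum, minimum) of a[i..j] by divide and conquer."""
    if i == j:                             # one element: it is both max and min
        return a[i], a[i]
    if j == i + 1:                         # two elements: a single comparison
        return (a[i], a[j]) if a[i] > a[j] else (a[j], a[i])
    mid = (i + j) // 2                     # split the list around the middle
    max1, min1 = max_min(a, i, mid)
    max2, min2 = max_min(a, mid + 1, j)
    return max(max1, max2), min(min1, min2)   # combine the two answers
```

The combine step costs two comparisons, for about 3n/2 comparisons overall, i.e. O(n).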
Merge sort:
In Merge Sort, the elements are sorted into nondecreasing order.
Given a sequence of n elements (also called keys) a[1], …, a[n], the general idea is to imagine them split into
two sets a[1], …, a[⌊n/2⌋] and a[⌊n/2⌋+1], …, a[n].
Each set is individually sorted, and the resulting sorted sequences are merged to produce a single sorted
sequence of n elements.
Merge- Algorithm:
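A sketch of merge sort with the merge step inlined (this version returns a new list rather than merging in place, as some texts do):

```python
def merge_sort(a):
    """Sort a into nondecreasing order by splitting, sorting each half, and merging."""
    if len(a) <= 1:
        return a                           # a list of 0 or 1 keys is already sorted
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # merge the two sorted halves
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]      # append whichever tail remains
```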
Merge sort- Time Complexity: O(n log n), from the recurrence T(n) = 2T(n/2) + cn.
Quick Sort:
In quick sort, the division into two sub arrays is made so that the sorted sub arrays do not need to be
merged later.
Three steps are involved here:
1. Partitioning the given array into sub arrays around a pivot element.
2. Interchanging two elements in the array.
3. Recursively sorting the resulting sub arrays.
Average time complexity = O(n log n); worst case O(n²)
Quick Sort Algorithm:
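A Python sketch using the first element as pivot (a common textbook choice; other pivot rules work too):

```python
def quick_sort(a, low, high):
    """Sort a[low..high] in place by partitioning around a pivot, then recursing."""
    if low < high:
        pivot = a[low]                     # first element as pivot
        i, j = low, high
        while i < j:
            while i < high and a[i] <= pivot:
                i += 1                     # scan right for an element > pivot
            while a[j] > pivot:
                j -= 1                     # scan left for an element <= pivot
            if i < j:
                a[i], a[j] = a[j], a[i]    # interchange the out-of-place pair
        a[low], a[j] = a[j], a[low]        # place the pivot in its final position j
        quick_sort(a, low, j - 1)          # the sub arrays need no later merge
        quick_sort(a, j + 1, high)
```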
Quick sort- Time Complexity:
Selection:
Selection is used to find the kth smallest element and place it in the kth position of a given array of n elements,
say a[1:n].
Let j be the position in which the partitioning element is placed.
Here we partition the given array into three parts:
1. The sub array containing elements 1 to j-1.
2. If k = j, the element is found in the jth position.
3. The sub array containing the remaining (n-j) elements.
This partitioning process continues until we find the kth smallest element.
Worst-case Time Complexity = O(n²)
Selection- Algorithm
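A quickselect-style sketch of the partitioning process described above (the function name is assumed; k is 1-indexed):

```python
def select_kth(a, k):
    """Return the kth smallest element of a by repeated partitioning."""
    low, high = 0, len(a) - 1
    while True:
        pivot = a[low]                     # partition a[low..high] around a[low]
        i, j = low, high
        while i < j:
            while i < high and a[i] <= pivot:
                i += 1
            while a[j] > pivot:
                j -= 1
            if i < j:
                a[i], a[j] = a[j], a[i]
        a[low], a[j] = a[j], a[low]        # pivot now sits in position j
        if j == k - 1:
            return a[j]                    # k = j: found in the jth position
        if j > k - 1:
            high = j - 1                   # keep only the left sub array
        else:
            low = j + 1                    # keep only the (n-j)-element right part
```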
Selection- Example
Strassen's matrix multiplication:
Strassen's matrix multiplication- Formula:
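The seven products and the four result entries can be written out for the 2×2 base case (the letter names for the products vary between texts):

```python
def strassen_2x2(A, B):
    """Strassen's seven-multiplication scheme for 2x2 matrices."""
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B
    p = (a11 + a22) * (b11 + b22)          # the seven products
    q = (a21 + a22) * b11
    r = a11 * (b12 - b22)
    s = a22 * (b21 - b11)
    t = (a11 + a12) * b22
    u = (a21 - a11) * (b11 + b12)
    v = (a12 - a22) * (b21 + b22)
    c11 = p + s - t + v                    # combine into the product entries
    c12 = r + t
    c21 = q + s
    c22 = p + r - q + u
    return [[c11, c12], [c21, c22]]
```

Applied recursively to n×n matrices (with the scalars replaced by quadrant submatrices), 7 multiplications instead of 8 give the recurrence T(n) = 7T(n/2) + O(n²).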
Strassen's matrix multiplication- Time Complexity: O(n^log₂7) ≈ O(n^2.81)
*****************
DESIGN AND ANALYSIS OF ALGORITHM – GREEDY METHOD
Unit 2.1 Topics: General method- Knapsack Problem- Job scheduling with deadlines- Minimum cost spanning
trees- Optimal storage on tapes- Single source shortest path
Greedy Algorithm:
Knapsack Problem:
Formally the problem can be stated as:
Knapsack Problem-Example:
Types of Knapsack Problem:
Fractional Knapsack- Example Problem:
Item A B C D
Profit 280 100 120 120
Weight 40 10 20 24
Pi/ Wi 7 10 6 5
Arranging the above table in descending order of Pi/Wi:
Item B A C D
Profit 100(P1) 280(P2) 120(P3) 120(P4)
Weight 10 (W1) 40(W2) 20(W3) 24(W4)
Pi/ Wi 10 7 6 5
Consider the knapsack capacity W=60
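With the items in this order, the greedy fill can be checked with a short script (each tuple is (profit, weight)):

```python
# items already ordered by decreasing Pi/Wi: B, A, C, D
items = [(100, 10), (280, 40), (120, 20), (120, 24)]
capacity = 60
total_profit = 0.0
for profit, weight in items:
    if capacity >= weight:                 # take the whole item
        total_profit += profit
        capacity -= weight
    else:                                  # take only the fraction that fits, then stop
        total_profit += profit * capacity / weight
        break
print(total_profit)  # 440.0
```

B and A are taken whole (weight 50), then 10/20 of C, giving profit 100 + 280 + 60 = 440.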
Conditions:
To complete a job, one has to process the job on a machine for one unit of time.
Only one machine is available for processing jobs.
A feasible solution for this problem is a subset J of jobs such that each job in this subset can be completed by its
deadline.
An optimal solution is a feasible solution with maximum value.
Algorithm for Job Scheduling:
Example1:
Let the number of jobs n = 4
(p1, p2, p3, p4) = (100, 10, 15, 27)
(d1, d2, d3, d4) = (2, 1, 2, 1)
deadline = 1 → the job must be done in the first time unit; deadline = 2 → the job may be done in the first or second time unit.
Here the maximum deadline = 2, so at most two jobs can be completed (one per time unit) & no parallel execution of jobs is possible.
Feasible Solution | Processing Sequence | Value | Explanation
(1, 2)            | 2, 1                | 110   | 2's deadline < 1's deadline
(1, 3)            | 1, 3 or 3, 1        | 115   | 1's deadline = 3's deadline
(1, 4)            | 4, 1                | 127   | 4's deadline < 1's deadline (Maximum Profit)
(2, 3)            | 2, 3                | 25    | 2's deadline < 3's deadline
(2, 4)            | impossible          |       | both have deadline = 1, and parallel execution is not allowed
(3, 4)            | 4, 3                | 42    | 4's deadline < 3's deadline
(1)               | 1                   | 100   |
(2)               | 2                   | 10    |
(3)               | 3                   | 15    |
(4)               | 4                   | 27    |
Example2:
1. Prim’s Algorithm:
In this algorithm we grow the spanning tree from a start vertex, at each step adding the cheapest edge that connects a tree vertex to a neighboring (adjacent) non-tree vertex.
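A heap-based sketch of this idea on a small made-up graph (adjacency lists of (vertex, weight) pairs):

```python
import heapq

def prim_mst(adj, start=0):
    """Prim's algorithm: repeatedly take the cheapest edge to a new vertex."""
    visited = {start}
    edges = [(w, start, v) for v, w in adj[start]]
    heapq.heapify(edges)
    total, tree = 0, []
    while edges and len(visited) < len(adj):
        w, u, v = heapq.heappop(edges)     # cheapest edge crossing the cut
        if v in visited:
            continue                       # both ends already in the tree
        visited.add(v)
        total += w
        tree.append((u, v, w))
        for x, wx in adj[v]:
            if x not in visited:
                heapq.heappush(edges, (wx, v, x))
    return total, tree

# small undirected example graph (made up for illustration)
adj = {0: [(1, 1), (2, 4)], 1: [(0, 1), (2, 2), (3, 6)],
       2: [(0, 4), (1, 2), (3, 3)], 3: [(1, 6), (2, 3)]}
print(prim_mst(adj)[0])  # 6, using edges 0-1, 1-2, 2-3
```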
Example:
2. Kruskal Algorithm:
In this algorithm we list the edge costs in ascending order & add edges one by one to the spanning tree,
skipping any edge that would form a cycle.
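A sketch with a simple union-find structure to detect cycles (edge tuples are (cost, u, v); the example graph is made up):

```python
def kruskal_mst(n, edge_list):
    """Kruskal's algorithm: edges in ascending cost, skipping cycle-forming ones."""
    parent = list(range(n))

    def find(x):                           # union-find root with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total, tree = 0, []
    for w, u, v in sorted(edge_list):      # ascending order of cost
        ru, rv = find(u), find(v)
        if ru != rv:                       # different components: no cycle formed
            parent[ru] = rv
            total += w
            tree.append((u, v, w))
    return total, tree

edges = [(1, 0, 1), (2, 1, 2), (3, 2, 3), (4, 0, 2), (6, 1, 3)]
print(kruskal_mst(4, edges)[0])  # 6, the same minimum cost Prim's algorithm finds
```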
Example:
Optimal storage on tapes:
Example:
Let number of inputs=3
(l1, l2, l3)= (5, 10, 3)
For n number of jobs we have n! possible orderings.
For 3 jobs we have 3! = 6 possible orderings.
Ordering I | d(I)                     | Total
1, 2, 3    | 5 + (5+10) + (5+10+3)    | 38
1, 3, 2    | 5 + (5+3) + (5+3+10)     | 31
2, 1, 3    | 10 + (10+5) + (10+5+3)   | 43
2, 3, 1    | 10 + (10+3) + (10+3+5)   | 41
3, 1, 2    | 3 + (3+5) + (3+5+10)     | 29
3, 2, 1    | 3 + (3+10) + (3+10+5)    | 34
The optimal ordering is 3, 1, 2, which has the minimum MRT value of 29.
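The table can be reproduced by brute force over all n! orderings, computing d(I) as a running prefix sum:

```python
from itertools import permutations

lengths = [5, 10, 3]                       # (l1, l2, l3) from the example

def mrt_sum(order):
    """d(I): retrieving each program reads all programs stored before it, plus itself."""
    total, prefix = 0, 0
    for l in order:
        prefix += l
        total += prefix
    return total

best = min(permutations(lengths), key=mrt_sum)
print(best, mrt_sum(best))  # (3, 5, 10) 29
```

The greedy rule (store programs in nondecreasing order of length) produces the same optimal ordering without enumerating all n! possibilities.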
Single-Source Shortest Paths:
Graphs can be used to represent the highway structure of a state or country with vertices representing cities and
edges representing sections of highway.
The edges can then be assigned weights which may be either the distance between the two cities connected by the
edge or the average time to drive along that section of highway.
Single-Source Shortest Paths- Algorithm:
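A standard sketch of the greedy single-source shortest-path method (Dijkstra's algorithm), on a small hypothetical graph given as adjacency lists of (vertex, weight) pairs:

```python
import heapq

def dijkstra(adj, source):
    """Dijkstra's algorithm: shortest distances from source to every vertex."""
    dist = {v: float('inf') for v in adj}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)         # closest not-yet-finalized vertex
        if d > dist[u]:
            continue                       # stale heap entry
        for v, w in adj[u]:
            if d + w < dist[v]:            # relax edge (u, v)
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist

# hypothetical weighted digraph for illustration
adj = {1: [(2, 4), (3, 1)], 2: [(4, 1)], 3: [(2, 2), (4, 5)], 4: []}
print(dijkstra(adj, 1))  # {1: 0, 2: 3, 3: 1, 4: 4}
```

Edge weights must be non-negative for the greedy choice to be safe.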
Example1:
For this example, the shortest path from source to destination (vertex 1 to vertex 7) has length 42.
Example2:
*****************
DESIGN AND ANALYSIS OF ALGORITHM – DYNAMIC PROGRAMMING
Unit 2.2 Topics: General Method, Multistage graphs, All-pairs shortest paths, Optimal binary search trees, 0/1
knapsack, The traveling sales person problem.
In Dynamic Programming, an optimal sequence of decisions is obtained by making explicit appeal to The
Principle of Optimality.
Principle of Optimality states that an optimal sequence of decisions has the property that whatever the
initial state and decisions are, the remaining decisions must constitute an optimal decision sequence with
regard to the state resulting from the first decision.
Steps in Dynamic Programming:
1. Characterize the structure of an optimal solution.
2. Recursively define the value of an optimal solution.
3. Compute the value of an optimal solution, typically bottom-up.
4. Construct an optimal solution from the computed information.
Difference between Greedy Method & Dynamic Programming:
Multistage graphs:
Multistage graphs- Forward Approach- Example:
Multistage graphs- Backward Approach:
All-pairs shortest paths:
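The usual dynamic-programming formulation A^k[i][j] = min(A^(k-1)[i][j], A^(k-1)[i][k] + A^(k-1)[k][j]) can be sketched as follows (the 3-vertex cost matrix is a common textbook instance):

```python
def floyd_warshall(cost):
    """All-pairs shortest paths: allow intermediate vertices 0..k-1 at stage k."""
    n = len(cost)
    A = [row[:] for row in cost]           # A^0 is the cost adjacency matrix
    for k in range(n):                     # grow the set of allowed intermediates
        for i in range(n):
            for j in range(n):
                A[i][j] = min(A[i][j], A[i][k] + A[k][j])
    return A

INF = float('inf')
cost = [[0, 4, 11],
        [6, 0, 2],
        [3, INF, 0]]
print(floyd_warshall(cost))  # [[0, 4, 6], [5, 0, 2], [3, 7, 0]]
```

The three nested loops give O(n³) time with only O(n²) space.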
Example:
Optimal binary search trees:
A binary search tree is a tree where the key values are ordered so that all the keys in the left sub tree are less
than the key in the node, and all the keys in the right sub tree are greater. Clearly
LST <= ROOT <= RST
LST = Left Sub Tree & RST = Right Sub Tree
Optimal binary search trees- Algorithm:
0/1 knapsack:
Knapsack is a bag which has a capacity M.
In fractional knapsack, we place the items one by one by considering Pi/ Wi.
In 0/1 Knapsack, we cannot consider any fractions.
In Dynamic Programming, we solve the knapsack problem by choosing the subset of items with maximum
total profit whose total weight does not exceed the knapsack capacity.
0 means the item is not placed in the knapsack.
1 means the item is placed in the knapsack.
The solution for Dynamic Programming is:
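A one-dimensional sketch of the standard 0/1 knapsack recurrence (this compresses the usual two-dimensional table over items and capacities; the names are assumptions of this sketch):

```python
def knapsack_01(profits, weights, capacity):
    """0/1 knapsack DP: table[c] = best profit achievable with capacity c."""
    table = [0] * (capacity + 1)
    for p, w in zip(profits, weights):
        for c in range(capacity, w - 1, -1):   # descend so each item is used at most once
            table[c] = max(table[c],           # 0: leave the item out
                           table[c - w] + p)   # 1: place the item in the knapsack
    return table[capacity]
```

For example, with profits (1, 2, 5, 6), weights (2, 3, 4, 5) and capacity 8, the best choice is items 2 and 4 for profit 8.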
The traveling sales person problem:
Let G=(V,E) be a directed graph with edge costs Cij.
The variable Cij is defined such that
Cij > 0 for all i & j, and
Cij = ∞ if (i, j) ∉ E.
Let |V|= n and assume that n>1.
A tour of G is a directed simple cycle that includes every vertex in V.
The cost of a tour is the sum of the cost of the edges on the tour.
The traveling sales person problem is to find a tour of minimum cost.
Notations:
g(i, S) = length of the shortest path starting at vertex i, going through all vertices in S, and terminating at
vertex 1.
g(1, V-{1}) = length of an optimal sales person tour.
The Principle of Optimality gives the recurrence
g(i, S) = min over j ∈ S of { Cij + g(j, S − {j}) }, with g(i, ∅) = Ci1.
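A memoized sketch of the Held-Karp recurrence g(i, S) = min over j in S of Cij + g(j, S − {j}) (vertex 1 is index 0 here; the 4-city cost matrix is a common textbook instance):

```python
from functools import lru_cache

def tsp(cost):
    """Dynamic-programming TSP: minimum cost of a tour starting and ending at vertex 0."""
    n = len(cost)

    @lru_cache(maxsize=None)
    def g(i, S):
        if not S:
            return cost[i][0]              # no vertices left: return to the start
        # try every next vertex j in S, then solve the smaller subproblem
        return min(cost[i][j] + g(j, S - frozenset([j])) for j in S)

    return g(0, frozenset(range(1, n)))

cost = [[0, 10, 15, 20],
        [5, 0, 9, 10],
        [6, 13, 0, 12],
        [8, 8, 9, 0]]
print(tsp(cost))  # 35, via the tour 1 -> 2 -> 4 -> 3 -> 1
```

Memoizing over subsets gives O(n² 2ⁿ) time, far better than the (n-1)! tours of brute force.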
Example:
*******************
DESIGN AND ANALYSIS OF ALGORITHM
Unit 3.1 Topics: Basic Traversal And Search Techniques- Techniques for Binary Trees- Techniques for Graphs-
Connected Components and Spanning Trees- Bi-connected components and DFS
Algorithm for Post-order traversal
Breadth First Search and Traversal (level by level traversing)
In breadth first search we start at a vertex v and mark it as having been reached (visited).
The vertex v is at this time said to be unexplored.
A vertex is said to have been explored by an algorithm when the algorithm has visited all vertices adjacent from it.
The newly visited vertices haven't been explored and are put onto the end of a list of unexplored vertices.
The first vertex on this list is the next to be explored.
Exploration continues until no unexplored vertex is left. The list of unexplored vertices operates as a queue and can
be represented using any of the standard queue representations.
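The queue-based process can be sketched as follows (the adjacency dict is a made-up graph chosen to match the BFS order in the example):

```python
from collections import deque

def bfs(adj, start):
    """Breadth-first search: visit vertices level by level using a queue."""
    visited = {start}                      # start is reached (visited) but unexplored
    order, queue = [], deque([start])
    while queue:
        u = queue.popleft()                # the first unexplored vertex is next
        order.append(u)                    # u is now being explored
        for v in adj.get(u, []):
            if v not in visited:           # newly reached vertices join the queue
                visited.add(v)
                queue.append(v)
    return order

# complete binary tree on 8 vertices, matching the example's BFS order
adj = {1: [2, 3], 2: [4, 5], 3: [6, 7], 4: [8]}
print(bfs(adj, 1))  # [1, 2, 3, 4, 5, 6, 7, 8]
```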
Example for BFS and DFS:
BFS Order: 1, 2, 3, 4, 5, 6, 7, 8
DFS Order: 1, 2, 4, 8, 5, 6, 3, 7
3.1.3 Connected Components and Spanning Trees
A graph is said to be connected if at least one path exists between every pair of vertices in the graph.
Two vertices are in the same connected component if there exists a path between them.
A directed graph is said to be strongly connected if every two nodes in the graph are reachable from each other.
Spanning Trees
A spanning tree of an undirected graph of n nodes is a set of n-1 edges that connects all nodes.
A graph may have many spanning trees.
For finding minimum cost spanning trees we have two algorithms
1. Prim’s Algorithm
2. Kruskal Algorithm
Bi Connected Component
Example:
DFS
Depth First Spanning Trees are used to identify articulation points and biconnected components.
**************
DESIGN AND ANALYSIS OF ALGORITHM
Unit 3.2 Topics: Backtracking- General Method- 8 Queens Problem- Sum of subset problem- Graph coloring-
Hamiltonian Cycles- Knapsack Problem
Recursive Backtracking Algorithm:
Applications of Backtracking:
Backtracking method is applied to solve various problems like:
1. N Queens Problem
2. Sum of Subsets Problem
3. Graph Coloring
4. Hamiltonian Cycles
5. Knapsack Problem
4-Queens Problem –state space tree:
N-Queens Problem- algorithm1: Placing a new queen in kth row & ith column.
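A sketch of the placement test and a backtracking driver around it (x[j] holds the column of the queen in row j; the counting wrapper is an assumption of this sketch):

```python
def place(x, k, i):
    """True if a queen can go in row k, column i, given queens x[1..k-1]."""
    for j in range(1, k):
        # conflict if same column, or same diagonal (equal row/column offsets)
        if x[j] == i or abs(x[j] - i) == abs(j - k):
            return False
    return True

def n_queens(n):
    """Count all placements of n non-attacking queens by backtracking."""
    x = [0] * (n + 1)                      # 1-indexed columns; x[0] unused
    count = 0

    def solve(k):
        nonlocal count
        if k > n:
            count += 1                     # all n queens placed
            return
        for i in range(1, n + 1):
            if place(x, k, i):             # try queen k in column i
                x[k] = i
                solve(k + 1)               # backtracking: x[k] is simply overwritten

    solve(1)
    return count

print(n_queens(4))  # 2 solutions for the 4-queens problem
```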
8-Queens Problem solution:
Sum of Subsets Problem-Algorithm
Sum of Subsets Problem-Example
3.2.4 Graph coloring:
Let G be a graph and m be a positive integer.
The problem is to find whether the nodes of G can be colored in such a way that no two adjacent nodes have the
same color while using at most m colors; the smallest such m is called the chromatic number of G.
If d is the maximum degree of a given graph G, then it can be colored with d + 1 colors.
Degree means number of edges connected to that node.
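A backtracking sketch of m-coloring (adjacency lists keyed 0..n-1; the names are assumptions of this sketch):

```python
def m_coloring(adj, m):
    """Backtracking m-coloring: return a valid color list (colors 1..m) or None."""
    n = len(adj)
    colors = [0] * n                       # 0 = node not yet colored

    def safe(v, c):
        # color c is usable at v if no adjacent node already has it
        return all(colors[u] != c for u in adj[v])

    def color(v):
        if v == n:
            return True                    # every node colored successfully
        for c in range(1, m + 1):
            if safe(v, c):
                colors[v] = c
                if color(v + 1):
                    return True
                colors[v] = 0              # backtrack: undo and try the next color
        return False

    return colors if color(0) else None

# a 4-cycle is 2-colorable
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(m_coloring(adj, 2))  # [1, 2, 1, 2]
```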
Graph coloring- m coloring algorithm
Graph coloring- generating color algorithm
3.2.5 Hamiltonian Cycles
Hamiltonian Cycles-Finding all Hamiltonian Cycles Algorithm
Knapsack Problem- Bounding Function Algorithm
Knapsack Problem- Example
****************
UNIT-4 DESIGN AND ANALYSIS OF ALGORITHM
Branch and Bound: The method, Travelling salesperson, 0/1 Knapsack problem.
Lower Bound Theory: Comparison trees, Lower bounds through reductions – Multiplying triangular matrices,
inverting a lower triangular matrix, computing the transitive closure
4.1.1. FIFO (First- In- First- Out) Search or BFS:
4.1.3. Least Cost (LC) Search:
4.2 Travelling Sales Person using B & B
TSP Example:
4.3 0/1 Knapsack problem using B & B
4.4 Lower Bound Theory - Comparison trees
4.5 Multiplying triangular matrices
4.6 Inverting a Lower Triangular Matrix:
4.7 Computing the Transitive Closure
*******************************
UNIT-5 DESIGN AND ANALYSIS OF ALGORITHM (P, NP, NP-COMPLETE, NP-HARD PROBLEMS)
Introduction to Problems:
Types of Algorithms
Cook’s Theorem:
Satisfiability problem
3-CNF Satisfiability problem
**********************