Algorithm Design
Gnanavel R
BITS Pilani
Pilani|Dubai|Goa|Hyderabad

Dynamic Programming
There are two ways to solve subproblems while caching the results:
Top-down approach: start with the original problem (F(n) in this case) and
recursively solve smaller and smaller cases (F(i)) until we have all the
ingredients for the original problem.
Fibonacci(n) = 1,                                if n = 0
Fibonacci(n) = 1,                                if n = 1
Fibonacci(n) = Fibonacci(n-1) + Fibonacci(n-2),  otherwise
So the first few numbers in this series are: 1, 1, 2, 3, 5, 8, 13, 21... and so on!
Algorithm Fib(n), top-down (recursive):
int fib(int n) {
    if (n < 2)
        return 1;
    return fib(n-1) + fib(n-2);
}

Bottom-up (tabulated):
void fib() {
    fibresult[0] = 0;
    fibresult[1] = 1;
    for (int i = 2; i < n; i++)          /* fills fibresult[0..n-1] */
        fibresult[i] = fibresult[i-1] + fibresult[i-2];
}
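The plain recursive version above recomputes the same F(i) many times; caching
(memoizing) each result is the top-down dynamic-programming fix. A minimal C sketch
(MAXN and the F(0) = F(1) = 1 convention follow the slides; the names are
illustrative, not the slides' code):

#include <stdio.h>
#include <string.h>

#define MAXN 64

long long memo[MAXN];                 /* memo[i] == -1 means "not computed yet" */

/* Top-down Fibonacci with memoization: each F(i) is computed once and cached,
   so the running time drops from exponential to O(n). */
long long fib(int n) {
    if (n < 2)
        return 1;                     /* base cases F(0) = F(1) = 1, as above */
    if (memo[n] != -1)
        return memo[n];               /* reuse the cached subproblem result */
    memo[n] = fib(n - 1) + fib(n - 2);
    return memo[n];
}

int main(void) {
    memset(memo, -1, sizeof memo);
    for (int i = 0; i < 10; i++)
        printf("%lld ", fib(i));      /* prints 1 1 2 3 5 8 13 21 34 55 */
    printf("\n");
    return 0;
}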
Algorithm Fib(n), bottom-up:
int fib(int n) {
    if (n <= 1) return n;
    F[0] = 0; F[1] = 1;
    for (i = 2; i <= n; i++)
        F[i] = F[i-2] + F[i-1];
    return F[n];
}
Dynamic Programming
Use Dynamic Programming
m[i, j] = 0                                                              if i = j
m[i, j] = min over i <= k < j of { m[i, k] + m[k+1, j] + p_(i-1) p_k p_j }  if i < j
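A bottom-up C sketch of this recurrence (illustrative; p[] holds the matrix
dimensions and m[i][j] is the minimum number of scalar multiplications for the
chain A_i..A_j):

#include <stdio.h>
#include <limits.h>

#define MAXN 32

/* p[0..n] holds matrix dimensions: A_i is p[i-1] x p[i]. */
long long matrix_chain(const int p[], int n) {
    static long long m[MAXN][MAXN];
    for (int i = 1; i <= n; i++)
        m[i][i] = 0;                          /* single matrix: no multiplication */
    for (int len = 2; len <= n; len++) {       /* chain length */
        for (int i = 1; i + len - 1 <= n; i++) {
            int j = i + len - 1;
            m[i][j] = LLONG_MAX;
            for (int k = i; k < j; k++) {      /* split point, i <= k < j */
                long long cost = m[i][k] + m[k+1][j]
                               + (long long)p[i-1] * p[k] * p[j];
                if (cost < m[i][j])
                    m[i][j] = cost;
            }
        }
    }
    return m[1][n];
}

int main(void) {
    int p[] = {10, 30, 5, 60};                 /* A1: 10x30, A2: 30x5, A3: 5x60 */
    printf("%lld\n", matrix_chain(p, 3));      /* prints 4500 */
    return 0;
}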
Overlapping Subproblems
– If a recursive algorithm revisits the same subproblems over and over, the problem
has overlapping subproblems.
Matrix multiplication:
– Theta(n^2) subproblems (1 <= i <= j <= n)
– At most n-1 choices per subproblem
Theta(n^3) overall
It means that the best subset of S_k that has total weight w is either:
1) the best subset of S_(k-1) that has total weight w, or
2) the best subset of S_(k-1) that has total weight w - w_k, plus item k.
Recursive Formula
V[k, w] = V[k-1, w]                                    if w_k > w
V[k, w] = max{ V[k-1, w], V[k-1, w - w_k] + b_k }      otherwise
for w = 0 to W
V[0,w] = 0
for i = 1 to n
V[i,0] = 0
for i = 1 to n
for w = 0 to W
if wi <= w // item i can be part of the solution
if bi + V[i-1,w-wi] > V[i-1,w]
V[i,w] = bi + V[i-1,w- wi]
else
V[i,w] = V[i-1,w]
else V[i,w] = V[i-1,w] // wi > w
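A direct C translation of the table-filling pseudocode above (a sketch; W, n and the
array sizes are illustrative; weights and benefits are 1-indexed to match V[i,w]):

#include <stdio.h>

#define MAXN 100
#define MAXW 1000

int V[MAXN + 1][MAXW + 1];

/* 0/1 knapsack: w[1..n] weights, b[1..n] benefits, capacity W.
   Returns the maximum total benefit, V[n][W]. */
int knapsack(int n, int W, const int w[], const int b[]) {
    for (int cw = 0; cw <= W; cw++) V[0][cw] = 0;    /* no items: value 0 */
    for (int i = 0; i <= n; i++)    V[i][0] = 0;     /* capacity 0: value 0 */
    for (int i = 1; i <= n; i++) {
        for (int cw = 0; cw <= W; cw++) {
            if (w[i] <= cw && b[i] + V[i-1][cw - w[i]] > V[i-1][cw])
                V[i][cw] = b[i] + V[i-1][cw - w[i]]; /* item i is in the best subset */
            else
                V[i][cw] = V[i-1][cw];               /* item i is left out */
        }
    }
    return V[n][W];
}

int main(void) {
    /* the example used below in the slides: n = 4, W = 5 */
    int w[] = {0, 2, 3, 4, 5};
    int b[] = {0, 3, 4, 5, 6};
    printf("%d\n", knapsack(4, 5, w, b));            /* prints 7 */
    return 0;
}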
Running time
for w = 0 to W                      O(W)
    V[0,w] = 0
for i = 1 to n                      O(n)
    V[i,0] = 0
for i = 1 to n                      repeated n times
    for w = 0 to W                  O(W)
        < the rest of the code >
What is the running time of this algorithm?  O(n*W)
Remember that the brute-force algorithm takes O(2^n).
Example
n = 4 (# of elements)
W = 5 (max weight)
Elements (weight, benefit):
(2,3), (3,4), (4,5), (5,6)
Example (filling the table)
Initialization:
    for w = 0 to W:  V[0,w] = 0    (row 0 is all zeros)
    for i = 1 to n:  V[i,0] = 0    (column 0 is all zeros)

Items:  1: (2,3)   2: (3,4)   3: (4,5)   4: (5,6)

Row i = 1 (w1 = 2, b1 = 3): for w = 1, w - w1 < 0, so V[1,1] = V[0,1] = 0;
for w = 2..5, b1 + V[0, w-2] = 3 > V[0,w], so V[1,w] = 3.

Row i = 2 (w2 = 3, b2 = 4): for w = 1, 2 the item does not fit, so the row copies
row 1; V[2,3] = max{3, 4 + V[1,0]} = 4; V[2,4] = max{3, 4 + V[1,1]} = 4;
V[2,5] = max{3, 4 + V[1,2]} = 7.

Row i = 3 (w3 = 4, b3 = 5): for w = 1..3 copy row 2; V[3,4] = max{4, 5 + V[2,0]} = 5;
V[3,5] = max{7, 5 + V[2,1]} = 7.

Row i = 4 (w4 = 5, b4 = 6): for w = 1..4 copy row 3; V[4,5] = max{7, 6 + V[3,0]} = 7.

Final table:
i\W   0   1   2   3   4   5
0     0   0   0   0   0   0
1     0   0   3   3   3   3
2     0   0   3   4   4   7
3     0   0   3   4   5   7
4     0   0   3   4   5   7

Each cell is filled with the same rule:
if wi <= w                       // item i can be part of the solution
    if bi + V[i-1, w-wi] > V[i-1, w]
        V[i,w] = bi + V[i-1, w-wi]
    else
        V[i,w] = V[i-1, w]
else V[i,w] = V[i-1, w]          // wi > w
Comments
• This algorithm only finds the maximum possible value that can be carried in the
  knapsack, i.e., the value in V[n,W].
• To know which items make up this maximum value, an addition to this algorithm is
  necessary.
How to find actual Knapsack Items
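One common addition, shown here as a hedged C sketch reusing the V table and arrays
from the earlier knapsack code: walk the table backwards from V[n][W]; whenever
V[i][w] differs from V[i-1][w], item i was taken and w decreases by w[i].

/* Recover the chosen items from the filled V table (sketch; assumes knapsack()
   above has already been run with the same w[], b[], n, W). */
void report_items(int n, int W, const int w[]) {
    int cw = W;
    for (int i = n; i >= 1; i--) {
        if (V[i][cw] != V[i-1][cw]) {   /* value changed, so item i is in the subset */
            printf("item %d (weight %d)\n", i, w[i]);
            cw -= w[i];
        }
    }
}
/* For the n = 4, W = 5 example this reports items 2 and 1 (total benefit 7). */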
Huffman Coding
Text Compression
Question: Is it possible to find a coding scheme in which fewer bits are used?
Answer: Variable-length codes
• Characters that occur more frequently should be encoded using short bit strings,
  and rarely occurring characters should be encoded using long bit strings.

                    a     b     c     d     e     f
Frequency           45    13    12    16    9     5
Fixed length        000   001   010   011   100   101
Variable length     0     101   100   111   1101  1100
• Important
In a variable-length code, some method must be used to determine where the bits for
each character start and end.
For example, if e : 0, a : 1, t : 01, then 0101 could correspond to "eat", "tea" or
"eaea".
A prefix code is a binary code such that no code word is the prefix of another code
word.
Example prefix code:  a : 00   b : 010   c : 011   d : 10   e : 11
– With a fixed-length encoding, 1 char = 1 byte, be it 'e' or 'x'.
The (Real) Basic Algorithm
1. Scan text to be compressed and tally occurrence of all
characters.
2. Sort or prioritize characters based on number of
occurrences in text.
3. Build Huffman code tree based on prioritized list.
4. Perform a traversal of tree to determine all code words.
5. Scan text again and create new file using the Huffman
codes.
The Huffman Coding Algorithm
• It runs in O(n + d log d) time, where n is the size of X and d is the number of
  distinct characters of X.
• A heap-based priority queue is used as an auxiliary structure.

Algorithm HuffmanEncoding(X)
    Input: string X of size n
    Output: optimal encoding trie for X
    C <- distinctCharacters(X)
    computeFrequencies(C, X)
    Q <- new empty heap
    for all c in C
        T <- new single-node tree storing c
        Q.insert(getFrequency(c), T)
    while Q.size() > 1
        f1 <- Q.minKey()
        T1 <- Q.removeMin()
        f2 <- Q.minKey()
        T2 <- Q.removeMin()
        T <- join(T1, T2)
        Q.insert(f1 + f2, T)
    return Q.removeMin()
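A compact C sketch of the same idea (illustrative, not the slides' code): it
repeatedly extracts the two lowest-frequency trees from a plain array used as the
priority queue and joins them; for the small alphabets here the O(d^2) selection is
fine.

#include <stdio.h>
#include <stdlib.h>

typedef struct Node {
    char ch;                          /* meaningful only at leaves */
    long freq;
    struct Node *left, *right;
} Node;

static Node *new_node(char ch, long freq, Node *l, Node *r) {
    Node *n = malloc(sizeof *n);
    n->ch = ch; n->freq = freq; n->left = l; n->right = r;
    return n;
}

/* Build the Huffman tree for the d characters in ch[] with counts freq[]. */
Node *huffman_build(const char ch[], const long freq[], int d) {
    Node *q[256];
    int size = d;
    for (int i = 0; i < d; i++)
        q[i] = new_node(ch[i], freq[i], NULL, NULL);
    while (size > 1) {
        int a = 0;                    /* first minimum */
        for (int i = 1; i < size; i++) if (q[i]->freq < q[a]->freq) a = i;
        Node *t1 = q[a]; q[a] = q[--size];
        int b = 0;                    /* second minimum */
        for (int i = 1; i < size; i++) if (q[i]->freq < q[b]->freq) b = i;
        Node *t2 = q[b];
        /* join them under a new internal node with the summed frequency */
        q[b] = new_node(0, t1->freq + t2->freq, t1, t2);
    }
    return q[0];
}

/* Print the code word for every leaf by walking the tree (0 = left, 1 = right). */
void print_codes(const Node *t, char *buf, int depth) {
    if (!t->left && !t->right) {
        buf[depth] = '\0';
        printf("%c : %s\n", t->ch, buf);
        return;
    }
    buf[depth] = '0'; print_codes(t->left,  buf, depth + 1);
    buf[depth] = '1'; print_codes(t->right, buf, depth + 1);
}

int main(void) {
    /* frequencies from the a..f table above */
    const char ch[]   = {'a', 'b', 'c', 'd', 'e', 'f'};
    const long freq[] = {45, 13, 12, 16, 9, 5};
    char buf[64];
    print_codes(huffman_build(ch, freq, 6), buf, 0);
    return 0;
}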
Example: X = abracadabra
Frequencies:  a: 5   b: 2   c: 1   d: 1   r: 2
[Figure: successive states of the forest as the two lowest-frequency trees are
merged, ending in a single trie of total weight 11 with 'a' on one side and the
remaining characters below internal nodes of weight 2, 4 and 6.]
Huffman Tree Example
Characters in "Eerie eyes seen near lake.": E, e, r, i, space, y, s, n, a, l, k, .
Building a Tree
Prioritize characters; use binary tree nodes:

public class HuffNode
{
    public char myChar;
    public int myFrequency;
    public HuffNode myLeft, myRight;
}
priorityQueue myQueue;

The queue after inserting all nodes:
E  i  y  l  k  .  r  s  n  a  sp  e
1  1  1  1  1  1  2  2  2  2  4   8
Building a Tree
While priority queue contains two or more nodes
– Create new node
– Dequeue node and make it left subtree
– Dequeue next node and make it right subtree
– Frequency of new node equals sum of frequency of left and right children
– Enqueue new node back into queue
Building a Tree (step by step)
[Figures: the two lowest-frequency nodes are repeatedly dequeued, joined under a new
parent whose frequency is their sum, and the parent is enqueued again. Starting from
E i y l k . r s n a sp e (frequencies 1 1 1 1 1 1 2 2 2 2 4 8), the merges produce
internal nodes of weight 2 (E+i), 2 (y+l), 2 (k+.), 4 (r+s), 4 (n+a), 4 (2+2),
6 (2+sp), 8 (4+4), 10 (4+6), 16 (8+e), and finally the root of weight 26.]
After enqueueing this node there is only one node left in the priority queue.
Dequeue the single node left in the queue: this is the Huffman tree.
Encoding the File: Traverse Tree for Codes
Char    Code
E       0000
i       0001
y       0010
l       0011
k       0100
.       0101
space   011
e       10
r       1100
s       1101
n       1110
a       1111
[Figure: the completed Huffman tree of weight 26, from which these codes are read off.]
Encoding the File
Rescan the text and encode it using the new code words (code table above).
Text: "Eerie eyes seen near lake."
Encoded bits:
000010110000011001110001010110110100
111110101111110001100111111010010010
1
• Why is there no need for a separator character?
  Because this is a prefix code: no code word is a prefix of another, so each code
  word is recognized as soon as its last bit is read.
Decoding the File
How does receiver know what the codes are?
Tree constructed for each text file.
– Considers frequency for each file
– Big hit on compression, especially for smaller files
Tree predetermined
– based on statistical analysis of text files or file types
Data transmission is bit based versus byte based
Decoding the File
Once the receiver has the tree, it scans the incoming bit stream:
    0 -> go left, 1 -> go right; on reaching a leaf, output its character and
    restart at the root.
Bit stream to decode: 101000110111101111011 11110000110101
[Figure: the same weight-26 Huffman tree is used for decoding.]
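A minimal C sketch of that walk (illustrative; it assumes a node layout like HuffNode
above, with NULL children marking a leaf):

#include <stdio.h>

typedef struct HuffNode {
    char myChar;
    struct HuffNode *myLeft, *myRight;    /* both NULL at a leaf */
} HuffNode;

/* Decode a string of '0'/'1' characters by walking the Huffman tree:
   0 = go left, 1 = go right; emit the character at each leaf and restart. */
void huff_decode(const HuffNode *root, const char *bits) {
    const HuffNode *cur = root;
    for (const char *p = bits; *p; p++) {
        cur = (*p == '0') ? cur->myLeft : cur->myRight;
        if (cur->myLeft == NULL && cur->myRight == NULL) {
            putchar(cur->myChar);
            cur = root;                    /* leaf reached: restart at the root */
        }
    }
    putchar('\n');
}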
Algorithm Design
Gnanavel R
BITS Pilani
Pilani|Dubai|Goa|Hyderabad

Optimization Problem
The Greedy Method
Minimum Spanning Trees
Kruskal's Algorithm
A priority queue stores the edges outside the cloud
  – Key: weight
  – Element: edge
At the end of the algorithm
  – We are left with one cloud that encompasses the MST
  – A tree T which is our MST
Partition-Based Implementation
A partition-based version of Kruskal's algorithm performs cloud merges as unions and
tests as finds.

Algorithm Kruskal(G):
    Input: A weighted graph G.
    Output: An MST T for G.
    Let C be a partition of the vertices of G, where each vertex forms a separate cluster.
    Let Q be a priority queue storing the edges of G, sorted by their weights.
    Let T be an initially empty tree.
    while Q is not empty do
        (u,v) <- Q.removeMinElement()
        if C.find(u) != C.find(v) then
            Add (u,v) to T
            C.union(u,v)        // merge the two clusters
    return T

Running time: O((n + m) log n)
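A self-contained C sketch of this partition-based Kruskal (a union-find with path
compression, an edge list sorted with qsort; the names and the tiny test graph are
illustrative):

#include <stdio.h>
#include <stdlib.h>

typedef struct { int u, v, w; } Edge;

static int parent[1000];

static int find(int x) {                       /* find with path compression */
    while (parent[x] != x) {
        parent[x] = parent[parent[x]];
        x = parent[x];
    }
    return x;
}

static int cmp_edge(const void *a, const void *b) {
    return ((const Edge *)a)->w - ((const Edge *)b)->w;
}

/* Kruskal: scan edges by increasing weight, add an edge iff its endpoints are in
   different clusters. Returns the total MST weight. */
long kruskal(int n, Edge e[], int m) {
    long total = 0;
    for (int i = 0; i < n; i++) parent[i] = i; /* each vertex is its own cluster */
    qsort(e, m, sizeof(Edge), cmp_edge);
    for (int i = 0, added = 0; i < m && added < n - 1; i++) {
        int ru = find(e[i].u), rv = find(e[i].v);
        if (ru != rv) {                        /* different clusters: take the edge */
            parent[ru] = rv;                   /* union */
            total += e[i].w;
            added++;
        }
    }
    return total;
}

int main(void) {
    Edge e[] = {{0,1,4}, {0,2,1}, {1,2,2}, {1,3,5}, {2,3,8}};
    printf("MST weight = %ld\n", kruskal(4, e, 5));  /* prints 8 (edges 1, 2, 5) */
    return 0;
}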
Kruskal Example
[Figures: a weighted graph on the airports BOS, PVD, ORD, JFK, SFO, BWI, DFW, LAX
and MIA, with edge weights such as 144, 184, 187, 621, 740, 802, 849, 867, 946,
1090, 1121, 1235, 1258, 1391, 1464, 1846, 2342 and 2704. Successive slides show
Kruskal's algorithm accepting edges in increasing order of weight, skipping any edge
whose endpoints are already in the same cloud, until the MST is complete.]
Prim-Jarnik’s Algorithm
Similar to Dijkstra’s algorithm (for a connected graph)
We pick an arbitrary vertex s and we grow the MST as a
cloud of vertices, starting from s
We store with each vertex v a label d(v) = the smallest
weight of an edge connecting v to a vertex in the cloud
At each step:
◼ We add to the cloud the vertex
u outside the cloud with the
smallest distance label
◼ We update the labels of the
vertices adjacent to u
Prim-Jarnik's Algorithm (cont.)
A priority queue stores the vertices outside the cloud
  – Key: distance
  – Element: vertex
Locator-based methods
  – insert(k,e) returns a locator
  – replaceKey(l,k) changes the key of an item
We store three labels with each vertex:
  – Distance
  – Parent edge in MST
  – Locator in priority queue

Algorithm PrimJarnikMST(G)
    Q <- new heap-based priority queue
    s <- a vertex of G
    for all v in G.vertices()
        if v = s
            setDistance(v, 0)
        else
            setDistance(v, +infinity)
        setParent(v, null)
        l <- Q.insert(getDistance(v), v)
        setLocator(v, l)
    while !Q.isEmpty()
        u <- Q.removeMin()
        for all e in G.incidentEdges(u)
            z <- G.opposite(u, e)
            r <- weight(e)
            if r < getDistance(z)
                setDistance(z, r)
                setParent(z, e)
                Q.replaceKey(getLocator(z), r)
Example
[Figures: Prim-Jarnik's algorithm run on a small weighted graph with vertices
A, B, C, D, E, F. Starting from A with d(A) = 0, each step moves the vertex with the
smallest distance label into the cloud and updates the labels of its neighbours,
until all vertices are in the MST.]
Shortest Path Problem
Given a weighted graph and two vertices u and v, we want to find a
path of minimum total weight between u and v.
– Length of a path is the sum of the weights of its edges.
Applications
– Internet packet routing
– Flight reservations
– Driving directions
[Figure: flight network with vertices PVD, ORD, SFO, LGA, HNL, LAX, DFW, MIA.]
Shortest Path Properties
Property 1:
A subpath of a shortest path is itself a shortest path
Property 2:
There is a tree of shortest paths from a start vertex to all the other vertices
Example: tree of shortest paths from a start vertex
[Figure: the same flight network with the shortest-path tree highlighted.]
Dijkstra's Algorithm
The distance of a vertex v from a vertex s is the length of a shortest path between
s and v.
Dijkstra's algorithm computes the distances of all the vertices from a given start
vertex s.
Assumptions:
  – the graph is connected
  – the edges are undirected
  – the edge weights are nonnegative
We grow a "cloud" of vertices, beginning with s and eventually covering all the
vertices.
We store with each vertex v a label d(v) representing the distance of v from s in
the subgraph consisting of the cloud and its adjacent vertices.
At each step:
  – We add to the cloud the vertex u outside the cloud with the smallest distance
    label, d(u)
  – We update the labels of the vertices adjacent to u
Edge Relaxation
Consider an edge e = (u,z) such that
  – u is the vertex most recently added to the cloud
  – z is not in the cloud
The relaxation of edge e updates distance d(z) as follows:
    d(z) <- min{ d(z), d(u) + weight(e) }
[Figure: with d(u) = 50 and d(z) = 75, relaxing e lowers d(z) to 60.]
Dijkstra's Algorithm
Dijkstra(G)
    for each v in V
        d[v] = infinity
    d[s] = 0;  S = {};  Q = V
    while (Q is not empty)
        u = ExtractMin(Q)
        S = S U {u}
        for each v in u->Adj[]
            if (d[v] > d[u] + w(u,v))
                d[v] = d[u] + w(u,v)   // relaxation step; really a call to Q->DecreaseKey()
(after David Luebke)
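A simple O(n^2) C sketch of the same algorithm over an adjacency matrix
(illustrative; INF marks "no edge"):

#include <stdio.h>

#define N   6
#define INF 1000000000

/* Dijkstra over an adjacency matrix g[][] (g[u][v] = INF if no edge).
   Fills d[] with shortest-path distances from s. */
void dijkstra(const int g[N][N], int s, int d[N]) {
    int in_cloud[N] = {0};
    for (int v = 0; v < N; v++) d[v] = INF;
    d[s] = 0;
    for (int step = 0; step < N; step++) {
        int u = -1;
        for (int v = 0; v < N; v++)         /* pick the closest vertex not in the cloud */
            if (!in_cloud[v] && (u == -1 || d[v] < d[u])) u = v;
        if (d[u] == INF) break;             /* remaining vertices are unreachable */
        in_cloud[u] = 1;
        for (int v = 0; v < N; v++)         /* relax all edges (u, v) */
            if (g[u][v] != INF && d[u] + g[u][v] < d[v])
                d[v] = d[u] + g[u][v];
    }
}

int main(void) {
    int g[N][N];
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++) g[i][j] = (i == j) ? 0 : INF;
    g[0][1] = g[1][0] = 7;  g[0][2] = g[2][0] = 2;
    g[2][1] = g[1][2] = 3;  g[1][3] = g[3][1] = 1;
    int d[N];
    dijkstra(g, 0, d);
    printf("d[3] = %d\n", d[3]);            /* 0->2->1->3 = 2 + 3 + 1 = 6 */
    return 0;
}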
Example
[Figures: Dijkstra's algorithm run on a small weighted graph with vertices
A, B, C, D, E, F, starting from A with d(A) = 0. Each snapshot shows the distance
labels shrinking as edges are relaxed, ending with the final shortest-path
distances.]
Why It Doesn't Work for Negative-Weight Edges
Dijkstra's algorithm is based on the greedy method: it adds vertices by increasing
distance.
  – If a node with a negative incident edge were to be added late to the cloud, it
    could mess up the distances of vertices already in the cloud.
[Figure: a small graph with a negative edge of weight -8 illustrating the problem.]
The Fractional Knapsack Problem
  – Objective: maximize   sum over i in S of  b_i (x_i / w_i)
  – Constraint:           sum over i in S of  x_i  <=  W
Example
Given: A set S of n items, with each item i having
  – b_i: a positive benefit
  – w_i: a positive weight
Goal: Choose amounts of items with maximum total benefit but with total weight at
most W ("knapsack" capacity 10 ml).

Items:             1      2      3      4      5
Weight:            4 ml   8 ml   2 ml   6 ml   1 ml
Benefit:           $12    $32    $40    $30    $50
Value ($ per ml):  3      4      20     5      50

Solution (greedy by value):
  • 1 ml of item 5
  • 2 ml of item 3
  • 6 ml of item 4
  • 1 ml of item 2
The Fractional Knapsack Algorithm
Algorithm fractionalKnapsack(S, W)
    Input: set S of items with benefit b_i and weight w_i; max. weight W
    Output: amount x_i of each item i to maximize benefit with weight at most W
    for each item i in S
        x_i <- 0
        v_i <- b_i / w_i          {value per unit of weight}
    w <- 0                        {total weight}
    while w < W
        remove from S the item i with the highest v_i
        x_i <- min{ w_i, W - w }
        w <- w + min{ w_i, W - w }
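A C sketch of the greedy rule (illustrative; the items are pre-sorted with qsort by
value per unit weight instead of repeatedly removing the item with highest v_i), run
here on the 7-object example that follows:

#include <stdio.h>
#include <stdlib.h>

typedef struct { int profit, weight; } Item;

static int by_ratio_desc(const void *a, const void *b) {
    const Item *x = a, *y = b;
    /* x goes first when x->profit/x->weight > y->profit/y->weight */
    return y->profit * x->weight - x->profit * y->weight;
}

/* Fractional knapsack: take items greedily by profit-per-weight ratio,
   splitting the last item if it does not fit entirely. */
double fractional_knapsack(Item it[], int n, int W) {
    double total = 0.0;
    int room = W;
    qsort(it, n, sizeof(Item), by_ratio_desc);
    for (int i = 0; i < n && room > 0; i++) {
        if (it[i].weight <= room) {            /* take the whole item */
            total += it[i].profit;
            room  -= it[i].weight;
        } else {                               /* take the fraction that fits */
            total += (double)it[i].profit * room / it[i].weight;
            room = 0;
        }
    }
    return total;
}

int main(void) {
    /* the 7-object example below: W = 15 */
    Item it[] = {{5,1}, {10,3}, {15,5}, {7,4}, {8,1}, {9,3}, {4,2}};
    printf("max profit = %.2f\n", fractional_knapsack(it, 7, 15));  /* 51.00 */
    return 0;
}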
Example
Objects:  1   2   3   4   5   6   7
Profit:   5   10  15  7   8   9   4
Weight:   1   3   5   4   1   3   2
Total weight (capacity) W = 15
Number of objects n = 7
Example
A graph G = (V, E) with
  V = {a, b, c, d, e}
  E = {(a,b), (a,c), (a,d), (b,e), (c,d), (c,e), (d,e)}
[Figure: the graph drawn with a and b at the top, c in the middle, d and e below.]
Graph-Example
• Example:
  – A vertex represents an airport and stores the three-letter airport code
  – An edge represents a flight route between two airports and stores the mileage
    of the route
  [Figure: flight network with vertices PVD, ORD, SFO, LGA, HNL, LAX, DFW, MIA.]
• Databases
  – Entity-relationship diagram (e.g., people Joe, John, Tom, Paul, David connected
    by relationships)
• Directed edge
  – ordered pair of vertices (u,v)
  – first vertex u is the origin
  – second vertex v is the destination
  – e.g., a flight (ORD -> PVD, flight AA 1206)
• Undirected edge
  – unordered pair of vertices (u,v)
  – e.g., a flight route (ORD - PVD, 849 miles)
• Directed graph (digraph)
  – all the edges are directed
  – e.g., flight network
• Undirected graph
  – all the edges are undirected
  – e.g., route network
Terminology:
• Degree of a Vertex
  The degree of a vertex is the number of edges incident to that vertex.
• For a directed graph,
  • the in-degree of a vertex v is the number of edges that have v as the head
  • the out-degree of a vertex v is the number of edges that have v as the tail
• If d_i is the degree of vertex i in a graph G with n vertices and e edges, then
      e = ( sum from i = 0 to n-1 of d_i ) / 2
  Why? Since the two adjacent vertices each count the adjoining edge, it is counted
  twice.
[Figure: a small directed graph G3 with in- and out-degrees labelled, e.g.
 vertex 1: in 1, out 2; vertex 2: in 1, out 0.]
Terminology (cont.)
• Path
  – sequence of alternating vertices and edges
  – begins with a vertex and ends with a vertex
  – each edge is preceded and followed by its endpoints
• Simple path
  – path such that all its vertices and edges are distinct
• Examples
  – P1 = (V, b, X, h, Z) is a simple path
  – P2 = (U, c, W, e, X, g, Y, f, W, d, V) is a path that is not simple
• Cycle
  – circular sequence of alternating vertices and edges
  – each edge is preceded and followed by its endpoints
• Simple cycle
  – cycle such that all its vertices and edges are distinct
• Examples
  – C1 = (V, b, X, g, Y, f, W, c, U, a) is a simple cycle
  – C2 = (U, c, W, e, X, g, Y, f, W, d, V, a) is a cycle that is not simple
[Figure: a graph with vertices U, V, W, X, Y, Z and edges a..h illustrating P1, P2,
C1 and C2.]
Notation: n = number of vertices, m = number of edges, deg(v) = degree of vertex v.
Property 1
    sum over all v of deg(v) = 2m
    Proof: each edge is counted twice.
Property 2
    In an undirected graph with no self-loops and no multiple edges,
    m <= n(n - 1)/2
    Proof: each vertex has degree at most n - 1.
Example: n = 4, m = 6, deg(v) = 3.
Even More Terminology
[Figure: some of the subgraphs of G3.]
• Adjacency Matrix
• Adjacency Lists

Adjacency matrix of the 7-vertex example graph:
     1  2  3  4  5  6  7
  1  0  1  0  0  1  1  0
  2  1  0  1  0  0  0  1
  3  0  1  0  1  0  0  0
  4  0  0  1  0  1  0  1
  5  1  0  0  1  0  1  1
  6  1  0  0  0  1  0  0
  7  0  1  0  1  1  0  0
Merits of Adjacency Matrix
• From the adjacency matrix, it is easy to determine whether two vertices are
  connected.
• The degree of a vertex i is  sum from j = 0 to n-1 of adj_mat[i][j].
[Figure: the adjacency-list representation of the same 7-vertex graph, one linked
list of neighbours per vertex.]
Pros and Cons of Adjacency Lists
• Pros
  – Saves space (memory): the representation takes about as many memory words as
    there are vertices and edges.
• Cons
  – It can take up to O(n) time to determine whether a pair of vertices (i,j) forms
    an edge: one would have to search the linked list L[i], which takes time
    proportional to the length of L[i].
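A minimal C sketch of an adjacency-list representation (illustrative names; each
vertex keeps a singly linked list of its neighbours):

#include <stdio.h>
#include <stdlib.h>

#define NV 7

typedef struct AdjNode {
    int to;
    struct AdjNode *next;
} AdjNode;

AdjNode *adj[NV];                      /* adj[i] = head of the list of i's neighbours */

void add_edge(int u, int v) {          /* undirected edge: add to both lists */
    AdjNode *a = malloc(sizeof *a); a->to = v; a->next = adj[u]; adj[u] = a;
    AdjNode *b = malloc(sizeof *b); b->to = u; b->next = adj[v]; adj[v] = b;
}

int has_edge(int u, int v) {           /* O(deg(u)) scan of L[u] */
    for (AdjNode *p = adj[u]; p; p = p->next)
        if (p->to == v) return 1;
    return 0;
}

int main(void) {
    add_edge(0, 1); add_edge(0, 4); add_edge(1, 2);
    printf("%d %d\n", has_edge(0, 4), has_edge(2, 4));   /* prints 1 0 */
    return 0;
}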
Adjacency Matrix: Pros and Cons
advantages
– fast to tell whether edge exists between any two vertices i and j (and
to get its weight)
disadvantages
– consumes a lot of memory on sparse graphs (ones with few edges)
– redundant information for undirected graphs
Main Methods of the Graph ADT
Vertices and edges
  – are positions
  – store elements
Accessor methods
  – aVertex()
  – incidentEdges(v)
  – endVertices(e)
  – isDirected(e)
  – origin(e)
  – destination(e)
  – opposite(v, e)
  – areAdjacent(v, w)
Update methods
  – insertVertex(o)
  – insertEdge(v, w, o)
  – insertDirectedEdge(v, w, o)
  – removeVertex(v)
  – removeEdge(e)
Generic methods
  – numVertices()
  – numEdges()
  – vertices()
  – edges()
Graph traversals
• Graph traversal means visiting every vertex and edge exactly once in a
  well-defined order.
• During a traversal, it is important to track which vertices have been visited.
Fragment of a queue-based (BFS) traversal:
    mark s as visited
    while (Q is not empty)
        // remove the vertex whose neighbours will be visited now
        v = Q.dequeue()
Breadth-First Search
Breadth-first search (BFS) is a general technique for traversing a graph.
A BFS traversal of a graph G
  – Visits all the vertices and edges of G
  – Determines whether G is connected
  – Computes the connected components of G
  – Computes a spanning forest of G
BFS on a graph with n vertices and m edges takes O(n + m) time.
BFS can be further extended to solve other graph problems:
  – Find and report a path with the minimum number of edges between two given
    vertices
  – Find a simple cycle, if there is one
BFS Algorithm
The algorithm uses a mechanism for setting and getting "labels" of vertices and
edges.

Algorithm BFS(G)
    Input: graph G
    Output: labeling of the edges and partition of the vertices of G
    for all u in G.vertices()
        setLabel(u, UNEXPLORED)
    for all e in G.edges()
        setLabel(e, UNEXPLORED)
    for all v in G.vertices()
        if getLabel(v) = UNEXPLORED
            BFS(G, v)

Algorithm BFS(G, s)
    L0 <- new empty sequence
    L0.insertLast(s)
    setLabel(s, VISITED)
    i <- 0
    while !Li.isEmpty()
        Li+1 <- new empty sequence
        for all v in Li.elements()
            for all e in G.incidentEdges(v)
                if getLabel(e) = UNEXPLORED
                    w <- opposite(v, e)
                    if getLabel(w) = UNEXPLORED
                        setLabel(e, DISCOVERY)
                        setLabel(w, VISITED)
                        Li+1.insertLast(w)
                    else
                        setLabel(e, CROSS)
        i <- i + 1
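A queue-based C sketch of BFS over an adjacency matrix (illustrative; it records the
visit order rather than edge labels):

#include <stdio.h>

#define NV 7

/* Simple BFS from s over an adjacency matrix g[][]; prints vertices
   in the order they are discovered, level by level. */
void bfs(const int g[NV][NV], int s) {
    int visited[NV] = {0}, queue[NV], head = 0, tail = 0;
    visited[s] = 1;
    queue[tail++] = s;
    while (head < tail) {
        int v = queue[head++];              /* dequeue */
        printf("%d ", v);
        for (int w = 0; w < NV; w++)
            if (g[v][w] && !visited[w]) {   /* discovery edge (v, w) */
                visited[w] = 1;
                queue[tail++] = w;          /* enqueue */
            }
    }
    printf("\n");
}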
Example
[Figures: BFS run on a graph with vertices A, B, C, D, E, F starting at A. Level L0
contains A, level L1 contains B, C, D, and level L2 contains E, F; solid edges are
discovery edges and dashed edges are cross edges.]
Properties
Notation: Gs is the connected component of s.
Property 1
    BFS(G, s) visits all the vertices and edges of Gs.
Property 2
    The discovery edges labeled by BFS(G, s) form a spanning tree Ts of Gs.
[Figure: the example graph and its BFS tree with levels L0 = {A}, L1 = {B, C, D},
L2 = {E, F}.]
Analysis
Setting/getting a vertex/edge label takes O(1) time.
Each vertex is labeled twice
  – once as UNEXPLORED
  – once as VISITED
Each edge is labeled twice
  – once as UNEXPLORED
  – once as DISCOVERY or CROSS
Each vertex is inserted once into a sequence Li.
Method incidentEdges is called once for each vertex.
BFS runs in O(n + m) time provided the graph is represented by the adjacency-list
structure.
  – Recall that the sum over v of deg(v) = 2m.
Applications
We can specialize the BFS traversal of a graph G
to solve the following problems in O(n + m) time
– Compute the connected components of G
– Compute a spanning forest of G
– Find a simple cycle in G, or report that G is a forest
– Given two vertices of G, find a path in G between
them with the minimum number of edges, or report
that no such path exists
Depth First Search (DFS)
Here, the word backtrack means that when you are moving
forward and there are no more nodes along the current path,
you move backwards on the same path to find nodes to
traverse.
All the nodes will be visited on the current path till all the
unvisited nodes have been traversed after which the next path
will be selected.
Use of a stack (DFS can be implemented with an explicit stack in place of recursion).
DFS Algorithm
The algorithm uses a mechanism for setting and getting "labels" of vertices and
edges.

Algorithm DFS(G)
    Input: graph G
    Output: labeling of the edges of G as discovery edges and back edges
    for all u in G.vertices()
        setLabel(u, UNEXPLORED)
    for all e in G.edges()
        setLabel(e, UNEXPLORED)
    for all v in G.vertices()
        if getLabel(v) = UNEXPLORED
            DFS(G, v)

Algorithm DFS(G, v)
    Input: graph G and a start vertex v of G
    Output: labeling of the edges of G in the connected component of v
            as discovery edges and back edges
    setLabel(v, VISITED)
    for all e in G.incidentEdges(v)
        if getLabel(e) = UNEXPLORED
            w <- G.opposite(v, e)
            if getLabel(w) = UNEXPLORED
                setLabel(e, DISCOVERY)
                DFS(G, w)
            else
                setLabel(e, BACK)
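A recursive C sketch of DFS over the same kind of adjacency matrix (illustrative;
it prints vertices in discovery order):

#include <stdio.h>

#define NV 7

int visited[NV];

/* Recursive DFS from v over an adjacency matrix g[][]:
   visit v, then recurse into each unexplored neighbour. */
void dfs(const int g[NV][NV], int v) {
    visited[v] = 1;
    printf("%d ", v);
    for (int w = 0; w < NV; w++)
        if (g[v][w] && !visited[w])      /* (v, w) becomes a discovery edge */
            dfs(g, w);
}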
Example
[Figures: DFS run on a graph with vertices A, B, C, D, E starting at A; solid edges
are discovery edges and dashed edges are back edges.]
Properties of DFS
Property 1
DFS(G, v) visits all the vertices and edges in the
connected component of v
Property 2
The discovery edges labeled by DFS(G, v) form a
spanning tree of the connected component of v
[Figure: the DFS tree of the example graph on A, B, C, D, E.]
Analysis of DFS
Setting/getting a vertex/edge label takes O(1) time
Each vertex is labeled twice
– once as UNEXPLORED
– once as VISITED
Each edge is labeled twice
– once as UNEXPLORED
– once as DISCOVERY or BACK
Method incidentEdges is called once for each vertex
DFS runs in O(n + m) time provided the graph is
represented by the adjacency list structure
  – Recall that the sum over v of deg(v) = 2m.
Graph Traversals
DFS with Timestamp
DFS(G)
1. for each vertex u ∈ G.V
2. u.color = WHITE
3. u.pi = NIL
4. time = 0
5. for each vertex u ∈ G.V
6. if u.color == WHITE
7. DFS-VISIT(G,u)
DFS-VISIT(G,u)
1. time = time + 1
2. u.d = time
3. u.color = GRAY
4. for each v ∈ G.Adj[u]
5. if v.color == WHITE
6. v.pi = u
7. DFS-VISIT(G,v)
8. u.color = BLACK
9. time = time + 1
10. u.f = time
DFS vs. BFS
Applications: biconnected components (DFS).
[Figure: the DFS tree and the BFS levels (L0, L1, L2) of the same example graph,
side by side.]
DFS vs. BFS (cont.)
Back edge (v,w) [DFS]
  – w is an ancestor of v in the tree of discovery edges
Cross edge (v,w) [BFS]
  – w is in the same level as v or in the next level in the tree of discovery edges
[Figure: the DFS and BFS trees of the example graph, with a back edge and a cross
edge highlighted.]
Strong connectivity in O(n(n+m))
Strong Connectivity Algorithm
Pick a vertex v in G.
Perform a DFS from v in G.
  – If there is a vertex w not visited, print "no".
Let G' be G with all its edges reversed.
Perform a DFS from v in G'.
  – If there is a vertex w not visited, print "no"; otherwise print "yes".
Running time: O(n + m).
[Figure: a small directed graph with vertices a..g and its reversal.]
BFS
DFS
TOPOLOGICAL SORTING
Algorithm Design
Gnanavel R
BITS Pilani
Pilani|Dubai|Goa|Hyderabad
Divide-and-Conquer
Divide-and-conquer is a general algorithm design paradigm:
  – Divide: divide the input data S into two or more disjoint subsets S1, S2, ...
  – Recur: solve the subproblems recursively
  – Conquer: combine the solutions for S1, S2, ..., into a solution for S
Binary search compares the target value to the middle element of the array. If they
are not equal, the half in which the target cannot lie is eliminated and the search
continues on the remaining half.
Merge Sort
MergeSort(A, left, right) {
if (left < right) {
mid = floor((left + right) / 2);
MergeSort(A, left, mid);
MergeSort(A, mid+1, right);
Merge(A, left, mid, right);
}
}
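The Merge step is not shown on the slide; a C sketch that completes the MergeSort
above (illustrative; it uses a temporary buffer):

#include <stdlib.h>
#include <string.h>

/* Merge the two sorted halves A[left..mid] and A[mid+1..right] in place,
   using a temporary buffer. */
void Merge(int A[], int left, int mid, int right) {
    int n = right - left + 1;
    int *tmp = malloc(n * sizeof(int));
    int i = left, j = mid + 1, k = 0;
    while (i <= mid && j <= right)          /* take the smaller front element */
        tmp[k++] = (A[i] <= A[j]) ? A[i++] : A[j++];
    while (i <= mid)   tmp[k++] = A[i++];   /* copy any leftovers */
    while (j <= right) tmp[k++] = A[j++];
    memcpy(A + left, tmp, n * sizeof(int));
    free(tmp);
}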
Merging Two Sorted Sequences
The conquer step of merge-sort consists of merging two sorted sequences A and B into
a sorted sequence S containing the union of the elements of A and B.
Merging two sorted sequences, each with n/2 elements and implemented by means of a
doubly linked list, takes O(n) time.

Algorithm merge(A, B)
    Input: sequences A and B with n/2 elements each
    Output: sorted sequence of A union B
    S <- empty sequence
    while !A.isEmpty() and !B.isEmpty()
        if A.first().element() < B.first().element()
            S.insertLast(A.remove(A.first()))
        else
            S.insertLast(B.remove(B.first()))
    while !A.isEmpty()
        S.insertLast(A.remove(A.first()))
    while !B.isEmpty()
        S.insertLast(B.remove(B.first()))
    return S
Execution example (merge-sort tree):
7 2 9 4 3 8 6 1  ->  1 2 3 4 6 7 8 9
    7 2 9 4 -> 2 4 7 9          3 8 6 1 -> 1 3 6 8
        7 2 -> 2 7    9 4 -> 4 9    3 8 -> 3 8    6 1 -> 1 6
T(n) = b                 if n < 2
T(n) = 2T(n/2) + bn      if n >= 2
We can therefore analyze the running time of merge-sort by finding a solution to the
above equation, that is, a solution that has T(n) only on the left-hand side.

T(n) = 2T(n/2) + bn
     = 2(2T(n/2^2) + b(n/2)) + bn
     = 2^2 T(n/2^2) + 2bn
     = 2^3 T(n/2^3) + 3bn
     = 2^4 T(n/2^4) + 4bn
     = ...
     = 2^i T(n/2^i) + i*bn
Note that the base case T(n) = b occurs when 2^i = n, that is, i = log n. So
T(n) = bn + bn log n
Thus, T(n) is O(n log n).
Analysis of Merge-Sort
The height h of the merge-sort tree is O(log n)
  – at each recursive call we divide the sequence in half.
The overall amount of work done at the nodes of depth i is O(n)
  – we partition and merge 2^i sequences of size n/2^i
  – we make 2^(i+1) recursive calls
Thus, the total running time of merge-sort is O(n log n).
Quick-Sort
Quick-sort is a randomized sorting algorithm based on the divide-and-conquer
paradigm:
  – Divide: pick a random element x (called the pivot) and partition S into
    • L: elements less than x
    • E: elements equal to x
    • G: elements greater than x
  – Recur: sort L and G
  – Conquer: join L, E and G
Quicksort
Another divide-and-conquer algorithm
  – The array A[p..r] is partitioned into two non-empty subarrays A[p..q] and
    A[q+1..r]
Example: 7 4 9 6 2 -> 2 4 6 7 9
             4 2 -> 2 4        7 9 -> 7 9
               2 -> 2            9 -> 9
Execution Example
[Figures: a quick-sort execution trace on 7 2 9 4 3 7 6 1, showing pivot selection,
partition, the recursive calls on the left and right parts, the base cases, and the
final joins producing 1 2 3 4 6 7 7 9.]
Quicksort Code
Quicksort(A, p, r)
{
if (p < r)
{
q = Partition(A, p, r);
Quicksort(A, p, q);
Quicksort(A, q+1, r);
}
}
David Luebke
Partition
Clearly, all the action takes place in the partition() function
  – Rearranges the subarray in place
  – End result:
    • Two subarrays
    • All values in the first subarray <= all values in the second
  – Returns the index of the "pivot" element separating the two subarrays
Partition Code (Hoare partition)
Partition(A, p, r)
    x = A[p];
    i = p - 1;
    j = r + 1;
    while (TRUE)
        repeat
            j--;
        until A[j] <= x;
        repeat
            i++;
        until A[i] >= x;
        if (i < j)
            Swap(A, i, j);
        else
            return j;

Illustrate on A = {5, 3, 2, 6, 4, 1, 3, 7}.
What is the running time of partition()? It is O(n): each element is examined a
constant number of times.

In-place partition example (pivot = 6):
3 2 5 1 0 7 3 5 9 2 7 9 8 9 7 6 9
Worst-case Running Time
The worst case for quick-sort occurs when the pivot is the unique minimum or maximum
element.
One of L and G has size n - 1 and the other has size 0.
The running time is proportional to the sum n + (n - 1) + ... + 2 + 1.
Thus, the worst-case running time of quick-sort is O(n^2).
depth    time
0        n
1        n - 1
...      ...
n - 1    1
Expected Running Time
Consider a recursive call of quick-sort on a sequence of size s
  – Good call: the sizes of L and G are each less than 3s/4
  – Bad call: one of L and G has size greater than 3s/4
[Figure: a good call splitting 7 2 9 4 3 7 6 1 into 2 4 3 1 and 7 9 7, and a bad
call with pivot 1 leaving one side empty.]
In the worst case the recurrence works out to T(n) = Theta(n^2). (after David Luebke)
Integer Multiplication
So T(n) = 3T(n/2) + n, which implies T(n) is O(n^(log2 3)) by the Master Theorem.
Thus, T(n) is O(n^1.585).
Example 3: Merge sort
MERGE-SORT A[1 . . n]
1. If n = 1, done.
2. Recursively sort A[1 . . n/2] and A[n/2+1 . . n].
3. "Merge" the 2 sorted lists.
Merging two sorted arrays
[Figures: step-by-step merge of the two sorted columns (20 13 7 2) and (12 11 9 1).
At each step the smaller of the two front elements is removed and appended to the
output, producing 1 2 7 9 11 12; merging a total of n elements takes Theta(n) time.]
Analyzing merge sort
MERGE-SORT A[1 . . n]                                      T(n)
1. If n = 1, done.                                         Theta(1)
2. Recursively sort A[1 . . n/2] and A[n/2+1 . . n].       2T(n/2)
3. "Merge" the 2 sorted lists.                             Theta(n)
Should be T(ceil(n/2)) + T(floor(n/2)), but it turns out not to matter
asymptotically.

Recurrence for merge sort
T(n) = Theta(1)               if n = 1
T(n) = 2T(n/2) + Theta(n)     if n > 1
Recursion tree
Solve T(n) = 2T(n/2) + cn, where c > 0 is constant.
[Figures: the recursion tree is expanded level by level: the root costs cn, its two
children cost cn/2 each, the next level has four nodes of cost cn/4, and so on down
to Theta(1) leaves.]
  – height h = lg n
  – each level contributes cn
  – #leaves = n, contributing Theta(n)
Total = Theta(n lg n)
Conclusions
Recursive Algorithms
    Divide
    Recur
    Conquer
Recursive Functions
void Test(int n)
{
    if (n > 0)
    {
        printf("%d", n);
        Test(n - 1);
    }
}

T(n) = T(n-1) + 1    if n > 0
T(n) = 1             otherwise
T(n) = T(n-1) + 1,  if n > 0;  T(n) = 1, n = 0      =>  O(n)
T(n) = T(n-1) + n,  if n > 0;  T(n) = 1, n = 0      =>  O(n^2)
T(n) = T(n/2) + 1,  if n > 1;  T(n) = 1, n = 1      =>  O(log n)
T(n) = T(n/2) + n,  if n > 1;  T(n) = 1, n = 1      =>  O(n)
Solve by
1) Recursion Tree
2) Substitution Method
3) Master Theorem

Recursion-tree view of T(n) = 2T(n/2) + bn:
    depth 1: 2 nodes of size n/2, level cost bn
    depth i: 2^i nodes of size n/2^i, level cost bn
    ...
Total time = bn + bn log n  (last level plus all previous levels)
Dividing functions
T(n) = T(n/2) + 1,  if n > 1;  T(n) = 1, n = 1      =>  O(log n)
T(n) = T(n/2) + n,  if n > 1;  T(n) = 1, n = 1      =>  O(n)
The Master Method
The form:
    T(n) = c                  if n < d
    T(n) = aT(n/b) + f(n)     if n >= d
The Master Theorem:
    1. if f(n) is O(n^(log_b a - eps)), then T(n) is Theta(n^(log_b a))
    2. if f(n) is Theta(n^(log_b a) log^k n), then T(n) is Theta(n^(log_b a) log^(k+1) n)
    3. if f(n) is Omega(n^(log_b a + eps)), then T(n) is Theta(f(n)),
       provided a*f(n/b) <= delta*f(n) for some delta < 1.

Example 2: T(n) = 2T(n/2) + n log n
    Solution: log_b a = 1, so case 2 says T(n) is Theta(n log^2 n).

Example 3: T(n) = T(n/3) + n log n
    Solution: log_b a = 0, so case 3 says T(n) is Theta(n log n).

Example 4: T(n) = 8T(n/2) + n^2
    Solution: log_b a = 3 and n^2 is O(n^(3 - eps)), so case 1 says T(n) is Theta(n^3).

Example 5: T(n) = 9T(n/3) + n^3
    Solution: log_b a = 2 and n^3 is Omega(n^(2 + eps)), so case 3 says T(n) is Theta(n^3).

Example 6: T(n) = T(n/2) + 1 (binary search)
    Solution: log_b a = 0, so case 2 says T(n) is Theta(log n).
Exercise [Ex-1]
Complexity Classes
A complexity class is the set of all of the computational problems which
can be solved using a certain amount of a certain computational
resource.
P
The complexity class P is the set of decision problems that can be
solved by a deterministic machine in polynomial time.
This class corresponds to an intuitive idea of the problems which can
be effectively solved in the worst cases.
NP
The complexity class NP is the set of decision problems that can be
solved by a nondeterministic machine in polynomial time.
This class contains many problems that people would like to be able to
solve effectively.
All the problems in this class have the property that their solutions can
be checked effectively.
Decision Problem
There are many problems for which the answer is a Yes or a No. These types of
problems are known as decision problems.
NP contains many problems that people would like to be able to solve effectively,
and all of them have the property that their solutions can be checked effectively.
Examples include:
  – the Boolean satisfiability problem (SAT)
  – the Hamiltonian path problem (a special case of TSP)
  – the Vertex cover problem
A problem belongs to class P if it is easy to find a solution for the problem. A
problem belongs to NP if it is easy to check a solution that may have been very
tedious to find.
Definition of NP-Completeness
A language B is NP-complete if it satisfies two conditions:
  – B is in NP
  – Every A in NP is polynomial-time reducible to B.

NP-hard
If a polynomial-time algorithm exists for any of these problems, all problems in NP
would be polynomial-time solvable. Problems that are both NP-hard and in NP are
called NP-complete.
Monte-Carlo algorithms
❖ Monte-Carlo algorithms run in a fixed (bounded) running time, but they do not
guarantee correct results. One way to control the runtime is by limiting the number
of iterations.
Branch and Bound
10
11
We’re using the Add() function in the pseudocode, which calculates the cost of a
particular node and adds it to the list of active nodes.
In the search space tree, each node carries some information, such as its cost, the
total number of jobs, and the total number of workers.
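The pseudocode itself is shown on the slides only as an image; as a rough, hypothetical sketch in C (the struct fields and the Add() signature below are assumptions, not the slides' code), a node and the Add() step for a job-assignment search might look like this:

#include <stdlib.h>

/* Hypothetical node of the search space tree for job assignment. */
struct Node {
    int worker;        /* number of workers assigned so far      */
    unsigned jobs;     /* bitmask of jobs already taken          */
    int cost;          /* cost of the partial assignment so far  */
};

/* Add(): compute the cost of a child node and append it to the list of
   active (live) nodes; a real solver would usually keep this list as a
   min-priority queue ordered by a lower bound on the total cost. */
void Add(struct Node *active, int *count,
         struct Node parent, int job, int assign_cost) {
    struct Node child = parent;
    child.worker += 1;               /* one more worker has a job       */
    child.jobs   |= 1u << job;       /* mark this job as taken          */
    child.cost   += assign_cost;     /* cost of the extended assignment */
    active[(*count)++] = child;      /* add to the list of active nodes */
}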
Reference: https://towardsdatascience.com/the-branch-and-bound-algorithm-a7ae4d227a69
Sum of Subsets
The subset sum problem is the problem of finding a subset whose elements sum to a
given number.
Example: in the set {2, 9, 10, 1, 99, 3}, the subset {1, 3} sums to the target 4.
Input:
This algorithm takes a set of numbers, and a sum value.
The Set: {10, 7, 5, 18, 12, 20, 15}
The sum Value: 35
Output:
All possible subsets of the given set whose elements sum to the given value.
{10, 7, 18}
{10, 5, 20}
{5, 18, 12}
{20, 15}
Let S = {S1, …, Sn} be a set of n positive integers; we must find a subset whose sum
equals a given positive integer d. It is convenient to sort the elements in ascending
order, so that S1 ≤ S2 ≤ … ≤ Sn.
Algorithm:
– If the current subset has sum d, stop and report that subset as a solution.
– If the current subset is not feasible, or the end of the set has been reached,
  backtrack through the subset until we find the next suitable value to try.
– If all elements have been visited without finding a suitable subset and no further
  backtracking is possible, stop: there is no solution.
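A minimal C sketch of the backtracking just described, run here on the example instance below (the pruning tests are the usual ones for a sorted set; variable names are illustrative):

#include <stdio.h>

/* Backtracking for subset sum: s[] is sorted in ascending order, d is the
   target sum, chosen[] marks the elements in the current subset. */
int n, d, s[32], chosen[32];

void subset_sum(int i, int sum, int remaining) {
    if (sum == d) {                          /* feasible subset found: print it */
        for (int k = 0; k < i; k++)
            if (chosen[k]) printf("%d ", s[k]);
        printf("\n");
        return;
    }
    if (i == n || sum + s[i] > d || sum + remaining < d)
        return;                              /* dead end: backtrack */
    chosen[i] = 1;                           /* include s[i] */
    subset_sum(i + 1, sum + s[i], remaining - s[i]);
    chosen[i] = 0;                           /* exclude s[i] */
    subset_sum(i + 1, sum, remaining - s[i]);
}

int main(void) {
    int w[] = {5, 10, 12, 13, 15, 18};       /* the example instance, M = 30 */
    int total = 0;
    n = 6; d = 30;
    for (int i = 0; i < n; i++) { s[i] = w[i]; total += w[i]; }
    subset_sum(0, 0, total);
    return 0;
}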
Example: solve the following instance and draw a portion of the state space tree:
M = 30, W = {5, 10, 12, 13, 15, 18}.
Graph Coloring (m-coloring problem)
Input:
A graph represented in 2D array format of size V * V where V is
the number of vertices in graph and the 2D array is the
adjacency matrix representation and value graph[i][j] is 1 if
there is a direct edge from i to j, otherwise the value is 0.
An integer m that denotes the maximum number of colors which
can be used in graph coloring
Output:
Return array color of size V that has numbers from 1 to m. Note
that color[i] represents the color assigned to the ith vertex.
Return false if the graph cannot be colored with m colors.
Solution:
Naive Approach:
The brute force approach would be to generate all possible
combinations (or configurations) of colors.
After generating a configuration, check if the adjacent
vertices have the same colour or not. If the conditions
are met, add the combination to the result and break the
loop.
Since each vertex can be colored with any of the m colors, the total
number of possible color configurations is m^V, so the complexity of
the brute-force approach is exponential.
Using Backtracking:
By using the backtracking method, the main idea is to
assign colors one by one to different vertices right from
the first vertex (vertex 0).
Before color assignment, check if the adjacent vertices
have same or different color by considering already
assigned colors to the adjacent vertices.
If the color assignment does not violate any constraints,
then we mark that color as part of the result. If color
assignment is not possible then backtrack and return
false.
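A compact C sketch of this backtracking colouring (the 4-vertex graph and m = 3 below are an assumed example, not the slides' instance):

#include <stdio.h>
#include <stdbool.h>

#define V 4   /* number of vertices in this sketch */

/* Check whether vertex v may take color c: no adjacent vertex already has c. */
bool safe(int graph[V][V], int color[V], int v, int c) {
    for (int u = 0; u < V; u++)
        if (graph[v][u] && color[u] == c)
            return false;
    return true;
}

/* Color vertices v, v+1, ..., V-1 with colors 1..m, backtracking whenever
   no color fits the current vertex. */
bool color_graph(int graph[V][V], int m, int color[V], int v) {
    if (v == V) return true;                  /* all vertices colored */
    for (int c = 1; c <= m; c++) {
        if (safe(graph, color, v, c)) {
            color[v] = c;                     /* tentative assignment */
            if (color_graph(graph, m, color, v + 1))
                return true;
            color[v] = 0;                     /* undo: backtrack */
        }
    }
    return false;                             /* no color works here */
}

int main(void) {
    int graph[V][V] = {{0,1,1,1},{1,0,1,0},{1,1,0,1},{1,0,1,0}};
    int color[V] = {0};
    if (color_graph(graph, 3, color, 0))
        for (int v = 0; v < V; v++) printf("vertex %d -> color %d\n", v, color[v]);
    else
        printf("cannot be colored with 3 colors\n");
    return 0;
}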
Naive Algorithm
Generate all possible configurations of vertices and print a
configuration that satisfies the given constraints. There
will be n! (n factorial) configurations.
Backtracking
It is a brute force approach that tries out all the possible solutions.
The term backtracking implies - if the current solution is not suitable, then eliminate
that and backtrack (go back) and check for other solutions.
Backtracking
Backtracking can be understood as searching a tree for a particular "goal" leaf node.
To "explore" node N:
1. If N is a goal node, return "success"
2. If N is a leaf node, return "failure"
3. For each child C of N,
Explore C
If C was successful, return "success"
4. Return "failure"
The second queen should not be in the first row or the second column (same diagonal as
the first queen). It must go in the second row, in the third or fourth column; place it
in the third column first.
    row 1:  Q1  .   .   .
    row 2:  .   .   Q2  .
The third queen then has no safe column in row 3, so we backtrack, move the second
queen to the fourth column, and place the third queen in the second column.
    row 1:  Q1  .   .   .
    row 2:  .   .   .   Q2
    row 3:  .   Q3  .   .
Now the fourth queen should be placed in the fourth row; the only remaining column is
the third, but there is a diagonal attack from queen 3. So we backtrack: remove queen 3
and try its next column, which is not possible; move back to queen 2 and try its next
column, which is also not possible; finally go back to queen 1 and move it to the next
column (column 2), from which the search continues to the solution (2, 4, 1, 3).
Step 2: Given n queens, read n from the user; let k = 1, 2, …, n denote the queen number.
Step 3: Start a loop that checks whether the k-th queen can be placed in some column of the k-th row.
Step 4: To check whether the queen can be placed, verify that none of the previously placed queens is in the same column or on the same diagonal.
Step 5: If the queen cannot be placed, backtrack to the previous queens until a feasible position is found.
Step 6: Repeat steps 3–5 until all the queens are placed.
Step 7: The column numbers of the queens are stored in an array and printed as an n-tuple solution.
Step 8: Stop.
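An illustrative C version of these steps (a sketch; array and function names are my own, and it stops at the first solution):

#include <stdio.h>
#include <stdlib.h>

/* col[k] holds the column of the queen in row k (rows and columns 0..n-1). */
int n, col[16];

/* Can the queen of row k go in column c? Check earlier rows for the same
   column or the same diagonal. */
int place_ok(int k, int c) {
    for (int i = 0; i < k; i++)
        if (col[i] == c || abs(col[i] - c) == k - i)
            return 0;
    return 1;
}

/* Place queens row by row, backtracking when a row has no safe column. */
int solve(int k) {
    if (k == n) {                              /* all queens placed */
        for (int i = 0; i < n; i++) printf("%d ", col[i] + 1);
        printf("\n");
        return 1;
    }
    for (int c = 0; c < n; c++)
        if (place_ok(k, c)) {
            col[k] = c;
            if (solve(k + 1)) return 1;        /* report first solution */
        }
    return 0;                                  /* backtrack */
}

int main(void) {
    n = 4;
    if (!solve(0)) printf("no solution\n");    /* prints 2 4 1 3 for n = 4 */
    return 0;
}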
Single-source shortest paths
• Two classic algorithms to solve single-source shortest path
problem
– Bellman-Ford algorithm
• A dynamic programming algorithm
• Works when some weights are negative
– Dijkstra’s algorithm
• A greedy algorithm
• Faster than Bellman-Ford
• Works when weights are all non-negative
Bellman-Ford algorithm
Observation:
• If there is a negative cycle reachable from the source, there is no solution
  – going around such a cycle again always produces a lower-weight path
• If there is no negative cycle, a shortest path has at most |V|-1 edges
Idea:
• Solve it using dynamic programming
• For all paths with at most 0 edges, find all the shortest paths
• For all paths with at most 1 edge, find all the shortest paths
• …
• For all paths with at most |V|-1 edges, find all the shortest paths
Bellman-Ford algorithm
Bellman-Ford(G, s)
  // Initialize 0-edge shortest paths
  for each v in G.V {
      if (v == s) d[s,v] = 0; else d[s,v] = ∞;   // 0-edge shortest distance from s to v
      π[s,v] = NIL;                              // predecessor of v on the shortest path
  }
  // Bottom-up, construct the 0-to-(|V|-1)-edge shortest paths
  repeat |G.V| - 1 times {
      for each edge (u, v) in G.E {
          if (d[s,v] > d[s,u] + w(u,v)) {        // relax edge (u, v)
              d[s,v] = d[s,u] + w(u,v);
              π[s,v] = u;
          }
      }
  }
  // Test for a negative cycle
  for each edge (u, v) in G.E {
      if (d[s,v] > d[s,u] + w(u,v)) return false; // negative cycle: no solution
  }
  return true;

Running time: T(n) = O(V·E), which is O(V^3) when E = O(V^2).
Bellman-Ford algorithm – worked example
[Figure: a five-vertex weighted digraph with source vertex 1; the slides trace the
distance estimates after the passes for paths of at most 1, 2, 3 and 4 edges.]
Final result (shortest distances from vertex 1): to vertex 2 it is 2, to vertex 3 it is 4,
to vertex 4 it is 7, and to vertex 5 it is -2.
What is the shortest path from 1 to 5?  1, 4, 3, 2, 5.
What is the weight of this path?  -2.
What is the shortest path from 1 to 2, 3, and 4?  Follow the predecessor (π) values
back from each vertex to the source.
Transitive closure
The transitive closure of a digraph with vertices numbered 1 to n is the n×n boolean
matrix T in which T[i, j] = 1 if there is a nontrivial directed path from vertex i to
vertex j, and 0 otherwise.
Warshall's algorithm computes a sequence of matrices R(0), R(1), …, R(n); the final
matrix R(n) is the transitive closure.
Example: a four-vertex digraph with edges 1→2, 2→3, 3→1 and 3→4; its adjacency
matrix A is taken as R(0).

           1 2 3 4
R(0) =  1  0 1 0 0
        2  0 0 1 0
        3  1 0 0 1
        4  0 0 0 0

           1 2 3 4
R(1) =  1  0 1 0 0
        2  0 0 1 0
        3  1 1 0 1
        4  0 0 0 0

           1 2 3 4
R(2) =  1  0 1 1 0
        2  0 0 1 0
        3  1 1 1 1
        4  0 0 0 0

           1 2 3 4
R(3) =  1  1 1 1 1
        2  1 1 1 1
        3  1 1 1 1
        4  0 0 0 0

           1 2 3 4
R(4) =  1  1 1 1 1
        2  1 1 1 1
        3  1 1 1 1
        4  0 0 0 0
Warshall's algorithm:
R(0) ← A
for k ← 1 to n do
    for i ← 1 to n do
        for j ← 1 to n do
            R(k)[i, j] ← R(k-1)[i, j] or (R(k-1)[i, k] and R(k-1)[k, j])
return R(n)
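A direct C translation of this pseudocode (a compact sketch that updates one matrix in place rather than keeping all n + 1 matrices):

#include <stdio.h>

#define N 4

/* Warshall's algorithm: turn the adjacency matrix r into its transitive
   closure in place; r[i][j] becomes 1 iff there is a directed path i -> j. */
void warshall(int r[N][N]) {
    for (int k = 0; k < N; k++)
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                r[i][j] = r[i][j] || (r[i][k] && r[k][j]);
}

int main(void) {
    /* the example digraph: edges 1->2, 2->3, 3->1, 3->4 (0-based here) */
    int r[N][N] = {{0,1,0,0},{0,0,1,0},{1,0,0,1},{0,0,0,0}};
    warshall(r);
    for (int i = 0; i < N; i++) {
        for (int j = 0; j < N; j++) printf("%d ", r[i][j]);
        printf("\n");
    }
    return 0;
}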
[Figure: an example DAG of daily activities – nap, play, write c.s. program, work out,
make cookies for professors, sleep, dream about graphs – used to motivate topological
sorting of a directed acyclic graph.]
Topological Sort Algorithm
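The algorithm itself appears on the slides as a figure; as an illustrative sketch (the small DAG below is an assumed example), a DFS-based topological sort in C:

#include <stdio.h>

#define N 6

int adj[N][N] = {            /* small example DAG, 0-based vertices */
    {0,1,1,0,0,0},
    {0,0,0,1,0,0},
    {0,0,0,1,1,0},
    {0,0,0,0,0,1},
    {0,0,0,0,0,1},
    {0,0,0,0,0,0}
};
int visited[N], order[N], pos;

/* DFS: a vertex is recorded only after every vertex it points to has been
   finished, so reversing the finish order gives a valid topological order. */
void dfs(int u) {
    visited[u] = 1;
    for (int v = 0; v < N; v++)
        if (adj[u][v] && !visited[v]) dfs(v);
    order[pos++] = u;
}

int main(void) {
    for (int u = 0; u < N; u++)
        if (!visited[u]) dfs(u);
    for (int i = N - 1; i >= 0; i--)     /* print in reverse finish order */
        printf("%d ", order[i]);
    printf("\n");
    return 0;
}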
Topological Sorting Example
[Figure: a nine-vertex DAG whose vertices are listed in a topologically sorted order.]
Problem statement:
• You are given a rod of length n and you need to cut the rod in such a way that you
  can sell the pieces for maximum profit. You are also given a price table that gives
  what a piece of rod of each length is worth.
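A bottom-up dynamic-programming sketch in C for this problem (the price table used here is an assumed example, not the slides' table):

#include <stdio.h>

/* Rod cutting: r[j] is the best revenue obtainable from a rod of length j,
   price[i] is the price of a piece of length i. */
int rod_cutting(int price[], int n) {
    int r[n + 1];
    r[0] = 0;
    for (int j = 1; j <= n; j++) {
        int best = 0;
        for (int i = 1; i <= j; i++)          /* length of the first piece */
            if (price[i] + r[j - i] > best)
                best = price[i] + r[j - i];
        r[j] = best;
    }
    return r[n];
}

int main(void) {
    int price[] = {0, 1, 5, 8, 9, 10, 17, 17, 20};   /* prices for lengths 0..8 */
    printf("max revenue for length 8: %d\n", rod_cutting(price, 8));
    return 0;
}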
Computing gcd(m, n)
Solutions:
1. Euclid’s algorithm
2. Consecutive integer checking algorithm
3. Middle-school procedure
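A minimal C version of the first of these, Euclid's algorithm, based on gcd(m, n) = gcd(n, m mod n) with gcd(m, 0) = m:

#include <stdio.h>

/* Euclid's algorithm: repeatedly replace (m, n) by (n, m mod n). */
int gcd(int m, int n) {
    while (n != 0) {
        int r = m % n;
        m = n;
        n = r;
    }
    return m;
}

int main(void) {
    printf("gcd(60, 24) = %d\n", gcd(60, 24));   /* prints 12 */
    return 0;
}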
Measurements
Performance Analysis (machine independent)
– space complexity: storage requirement
– time complexity: computing time
Performance Measurement (machine dependent)
Time Complexity
T(P) = compile time C + run time T_P(I)
– The compile time C is independent of the instance characteristics.
– The run (execution) time T_P depends on the instance; for example,
  T_P(n) = c_a·ADD(n) + c_s·SUB(n) + c_l·LDA(n) + c_st·STA(n).
Definition
A program step is a syntactically or semantically meaningful program segment whose
execution time is independent of the instance characteristics.
Example: each of the statements
  abc = a + b + b * c + (a + b - c) / (a + b) + 4.0
  abc = a + b + c
is regarded as one program step; the notion is machine independent.
Iterative summing of a list of numbers
Recursive function to sum a list of numbers
*Figure 1.3: Step count table for recursive summing function (p.27)
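The summing functions themselves are shown on the slides as figures; illustrative C versions are sketched below (the step counts in the comments follow the usual step-count model and are approximate):

#include <stdio.h>

/* Iterative summing of a list of numbers: about 2n + 3 steps
   (1 initialization, n + 1 loop tests, n additions, 1 return). */
float sum(float list[], int n) {
    float tempsum = 0;
    for (int i = 0; i < n; i++)
        tempsum += list[i];
    return tempsum;
}

/* Recursive function to sum a list of numbers: about 2n + 2 steps. */
float rsum(float list[], int n) {
    if (n > 0)
        return rsum(list, n - 1) + list[n - 1];
    return 0;
}

int main(void) {
    float a[] = {1, 2, 3, 4, 5};
    printf("%.1f %.1f\n", sum(a, 5), rsum(a, 5));
    return 0;
}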
Matrix Addition
*Figure 1.4: Step count table for matrix addition (p.27)
Total step count: 2·rows·cols + 2·rows + 1
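A sketch of the matrix addition routine the step-count table refers to; the comment notes one common way the total 2·rows·cols + 2·rows + 1 is accounted for:

#include <stdio.h>

/* Matrix addition. Step-count accounting: the outer loop test runs rows + 1
   times, the inner loop test runs rows*(cols + 1) times, the assignment runs
   rows*cols times, plus one step for returning. */
void add(int rows, int cols, int a[rows][cols], int b[rows][cols], int c[rows][cols]) {
    for (int i = 0; i < rows; i++)
        for (int j = 0; j < cols; j++)
            c[i][j] = a[i][j] + b[i][j];
}

int main(void) {
    int a[2][2] = {{1, 2}, {3, 4}}, b[2][2] = {{5, 6}, {7, 8}}, c[2][2];
    add(2, 2, a, b, c);
    printf("%d %d %d %d\n", c[0][0], c[0][1], c[1][0], c[1][1]);
    return 0;
}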
Why study algorithms and performance?
• Algorithms help us to understand scalability.
• Performance often draws the line between what is feasible and what is impossible.
• Speed is fun!
Asymptotic notation
O(n), Ω(n), Θ(n)
Example
O(n^2) = {2n^2 + 10n + 2000, 125n^2 − 233n − 250, 10n + 25, 150, …}
Example (insertion sort):
Input:  8 2 4 9 3 6
Output: 2 3 4 6 8 9
On each pass the next key is inserted into the already-sorted prefix:
  8 2 4 9 3 6
  2 8 4 9 3 6
  2 4 8 9 3 6
  2 4 8 9 3 6
  2 3 4 8 9 6
  2 3 4 6 8 9   done
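An illustrative C version of the insertion sort being traced:

#include <stdio.h>

/* Insertion sort: grow a sorted prefix a[0..j-1]; on each pass take
   key = a[j] and shift larger prefix elements right to make room. */
void insertion_sort(int a[], int n) {
    for (int j = 1; j < n; j++) {
        int key = a[j];
        int i = j - 1;
        while (i >= 0 && a[i] > key) {
            a[i + 1] = a[i];
            i--;
        }
        a[i + 1] = key;
    }
}

int main(void) {
    int a[] = {8, 2, 4, 9, 3, 6};
    insertion_sort(a, 6);
    for (int i = 0; i < 6; i++) printf("%d ", a[i]);   /* 2 3 4 6 8 9 */
    printf("\n");
    return 0;
}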
BIG IDEA: asymptotic analysis
• O(1): constant
• O(n): linear
• O(n^2): quadratic
• O(n^3): cubic
• O(2^n): exponential
• O(log n): logarithmic
• O(n log n): log-linear
*Figure 1.7: Function values (p.38)
*Figure 1.8: Plot of function values (p.39)
*Figure 1.9: Times on a 1-billion-instructions-per-second computer (p.40)