Algorithm and Complexity

Chapter 3 discusses various algorithm design strategies including brute-force, greedy algorithms, divide and conquer, backtracking, and branch-and-bound. It also covers graph traversal techniques such as Depth-First Search (DFS) and Breadth-First Search (BFS), detailing their algorithms and applications. Examples illustrate how to perform DFS and BFS on graphs, emphasizing their utility in finding paths, detecting cycles, and organizing tasks.


Chapter 3: Algorithm Design Strategies (Part II)

Contents
● Brute-force algorithms
● Greedy algorithms
○ Activity-selection problem
○ Huffman coding
○ Minimum spanning tree algorithms - Kruskal's, Prim's
○ Shortest path algorithm - Dijkstra's
○ Flow networks - Ford-Fulkerson algorithm
● Divide and Conquer
● Backtracking
○ N-queen problem
● Branch-and-bound
○ 0/1 Knapsack problem

Traversal techniques
Graph traversal
Process of visiting each vertex in a graph

Given a graph, G = (V, E), and a vertex, v ∈ V(G), visit all vertices in G that
are reachable from v

2 ways of doing this:

1. Depth-first search (DFS)


2. Breadth-first search (BFS)
Depth-first search (DFS)
Process all descendants of a vertex before we move to an adjacent vertex.

DFS in a graph is similar to DFS in a tree. Since graphs may contain cycles
unlike trees, we may come to the same node again. To avoid processing a
node more than once, we keep track of visited nodes.

Uses a stack data structure to perform the search.


Depth-first search (DFS)
Basic idea:

1. Start by putting any one of the graph's vertices (starting vertex) on top of a
stack.
2. Pop the topmost item from the stack and add it to the visited list.
3. Push the popped vertex's unvisited neighbors onto the top of the stack.
4. Keep repeating steps 2 and 3 until the stack is empty.
DFS

(Comic: https://xkcd.com/761/)
Depth-first search (DFS)
Algorithm: (Recursive) DFS(G, s)
Input: A graph, G, and a starting vertex, s
Output: A sequence of processed vertices

Steps:

1. mark(s); // Mark s as visited
2. ∀(s, v) ∈ E(G) such that v is unmarked
   a. DFS(G, v)
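A minimal Python sketch of the recursive DFS above (the function name and the adjacency list below are illustrative, not from the slides; the assumed edges are chosen to reproduce the visiting order of the walkthrough later in this section):

```python
# A minimal sketch of recursive DFS (illustrative names).

def dfs_recursive(graph, s, visited=None):
    """Return the vertices reachable from s, in the order they are visited."""
    if visited is None:
        visited = []
    visited.append(s)                        # mark(s)
    for v in graph[s]:                       # for every edge (s, v)
        if v not in visited:                 # recurse only on unmarked vertices
            dfs_recursive(graph, v, visited)
    return visited

# Hypothetical adjacency list consistent with the walkthrough that follows:
graph = {'A': ['B', 'C'], 'B': ['A', 'C', 'D'], 'C': ['A', 'B', 'E'],
         'D': ['B'], 'E': ['C']}
print(dfs_recursive(graph, 'A'))             # ['A', 'B', 'C', 'E', 'D']
```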
Depth-first search (DFS)
Algorithm: (Iterative) DFS(G, s)
Input: A graph, G, and a starting vertex, s
Output: A sequence of processed vertices

Steps:

1. mark(s);                  // Mark s as visited
2. L ≔ {s}                   // Push s into the stack
3. while L ≠ ∅ do
   a. u ≔ last(L)            // Top of the stack
   b. if ∃(u, v) such that v is unmarked then   // Find neighbors of u
      i. choose v of the smallest index;
      ii. mark(v); L ≔ L ∪ {v}
   c. else
      i. L ≔ L ∖ {u}         // Pop from the stack
   d. endif
4. endwhile
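A matching Python sketch of the iterative version (illustrative names). It mirrors the slides' convention: a vertex is marked when it is pushed, and is popped only once it has no unmarked neighbours left.

```python
def dfs_iterative(graph, s):
    """Stack-based DFS following the pseudocode above."""
    visited = [s]                  # 1. mark(s)
    stack = [s]                    # 2. L := {s}
    while stack:                   # 3. while L is not empty
        u = stack[-1]              #    u := last(L), the top of the stack
        # unmarked neighbours of u, smallest index first
        unmarked = sorted(v for v in graph[u] if v not in visited)
        if unmarked:
            v = unmarked[0]        #    choose v of the smallest index
            visited.append(v)      #    mark(v)
            stack.append(v)        #    L := L ∪ {v}
        else:
            stack.pop()            #    L := L \ {u}
    return visited
```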
Depth-first search (DFS)
Example: Perform DFS on the graph shown in the slides (vertices A, B, C, D, E), starting from A.

The slides step through the algorithm one iteration at a time; the stack and the visited list evolve as follows:

Step | Action                                    | Stack (bottom → top) | Visited vertices
1    | mark A and push it                        | A                    | A
2    | A has unvisited neighbour B: mark, push   | A B                  | A B
3    | B has unvisited neighbour C: mark, push   | A B C                | A B C
4    | C has unvisited neighbour E: mark, push   | A B C E              | A B C E
5    | E has no unvisited neighbours: pop E      | A B C                | A B C E
6    | C has no unvisited neighbours: pop C      | A B                  | A B C E
7    | B has unvisited neighbour D: mark, push   | A B D                | A B C E D
8    | D has no unvisited neighbours: pop D      | A B                  | A B C E D
9    | B has no unvisited neighbours: pop B      | A                    | A B C E D
10   | A has no unvisited neighbours: pop A      | (empty)              | A B C E D

The stack is now empty, so the search terminates with the vertices processed in the order A, B, C, E, D.
Applications of DFS
● Finding a minimum spanning tree for unweighted graphs
● Detecting a cycle in the graph
● Finding a path from one node to another
● Topological ordering: determining the order of compilation tasks, resolving symbol dependencies in linkers, etc.
● Solving problems with only one solution, such as a maze
● etc.
Breadth-first search (BFS)
Process all adjacent vertices of a vertex before going to the next level.

Uses a queue data structure to perform the search.


Breadth-first search (BFS)
Basic idea:

1. Start by putting any one of the graph's vertices (starting vertex) at the back of a
queue.
2. Dequeue the queue (take the vertex at the front of the queue) and add it to
the visited list.
3. Enqueue the dequeued vertex’s unvisited neighbours to the back of the
queue.
4. Keep repeating steps 2 and 3 until the queue is empty.
Breadth-first search (BFS)
Algorithm: BFS(G, s)
Input: A graph, G, and a starting vertex, s
Output: A sequence of processed vertices

Steps:

1. mark(s);                  // Mark s as visited
2. L ≔ {s}                   // Enqueue s
3. while L ≠ ∅ do
   a. u ≔ first(L)           // Front of the queue
   b. if ∃(u, v) such that v is unmarked then   // Find neighbors of u
      i. choose v of the smallest index;
      ii. mark(v); L ≔ L ∪ {v}
   c. else
      i. L ≔ L ∖ {u}         // Dequeue u
   d. endif
4. endwhile
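A Python sketch of the BFS above (illustrative names), using a FIFO queue. As in the pseudocode, a vertex is marked when it is enqueued and is dequeued only once all of its neighbours have been marked.

```python
from collections import deque

def bfs(graph, s):
    visited = [s]                  # 1. mark(s)
    queue = deque([s])             # 2. L := {s}
    while queue:                   # 3. while L is not empty
        u = queue[0]               #    u := first(L), the front of the queue
        unmarked = sorted(v for v in graph[u] if v not in visited)
        if unmarked:
            v = unmarked[0]        #    choose v of the smallest index
            visited.append(v)      #    mark(v)
            queue.append(v)        #    L := L ∪ {v}
        else:
            queue.popleft()        #    dequeue u
    return visited
```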
Breadth-first search (BFS)
Example: Perform BFS on the graph shown in the slides (vertices A, B, C, D, E), starting from A.

The slides step through the algorithm one iteration at a time; the queue and the visited list evolve as follows:

Step | Action                                       | Queue (front → back) | Visited vertices
1    | mark A and enqueue it                        | A                    | A
2    | A has unvisited neighbour B: mark, enqueue   | A B                  | A B
3    | A has unvisited neighbour C: mark, enqueue   | A B C                | A B C
4    | A has no unvisited neighbours: dequeue A     | B C                  | A B C
5    | B has unvisited neighbour D: mark, enqueue   | B C D                | A B C D
6    | B has no unvisited neighbours: dequeue B     | C D                  | A B C D
7    | C has unvisited neighbour E: mark, enqueue   | C D E                | A B C D E
8    | C has no unvisited neighbours: dequeue C     | D E                  | A B C D E
9    | D has no unvisited neighbours: dequeue D     | E                    | A B C D E
10   | E has no unvisited neighbours: dequeue E     | (empty)              | A B C D E

The queue is now empty, so the search terminates with the vertices processed in the order A, B, C, D, E.
Applications of BFS
● Finding a minimum spanning tree for unweighted graphs
● Web crawlers: begin from a starting page, follow all the links on that page, and repeat the process for every newly discovered page
● Social networks: Find people within a given distance ‘k’ from a person
● Finding the shortest path to another node
● GPS navigation systems: Finding the direction to reach from one place to
another
● etc.
Minimum spanning tree
Spanning tree
A spanning tree of a connected graph G is a tree that consists solely of edges in G and that includes all of the vertices in G. (The slide illustrates this with a graph G on vertices A–E and one of its spanning trees.)

Our solution to generate a spanning tree must satisfy the following constraints:

1. We must use only edges within the graph
2. We must use exactly n-1 edges
3. We may not use edges that would produce a cycle
Spanning tree
A single graph can have many spanning trees. (The slide shows a graph G on vertices A–D together with several different spanning trees of G.)
Spanning tree
A spanning tree can be generated using a DFS or a BFS. The spanning tree is
formed from those edges traversed during the search.
● If a breadth-first search is used, the resulting spanning tree is called a breadth-first spanning tree.
● If a depth-first search is used, it is called a depth-first spanning tree.

For a disconnected / disjoint graph, a spanning forest is defined.
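As a small illustration (assumed names, not from the slides), the tree edges of a breadth-first spanning tree can be collected during the search itself: the edge on which each vertex is first discovered becomes a tree edge.

```python
from collections import deque

def bfs_spanning_tree(graph, root):
    """Return the (parent, child) edges of a breadth-first spanning tree."""
    tree_edges = []
    visited = {root}
    queue = deque([root])
    while queue:
        u = queue.popleft()
        for v in graph[u]:
            if v not in visited:           # first time v is discovered
                visited.add(v)
                tree_edges.append((u, v))  # (u, v) becomes a tree edge
                queue.append(v)
    return tree_edges                      # |V| - 1 edges for a connected graph
```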


Minimum spanning tree (MST)
A minimum spanning tree of a weighted graph is a spanning tree of least weight,
i.e. a spanning tree in which the total weight of the edges is guaranteed to be the
minimum of all possible trees in the graph.

If the edge weights in the network/graph are all distinct, the MST is unique.

If some weights are duplicated, there may be more than one MST.

Application: Network design, Muddy city problem


Muddy city problem
● A city with no paved road
● The mayor of the city decided to
pave some of the streets with
the following two conditions:
1. Enough streets must be paved so
that it is possible for everyone to
travel from their house to anyone
else’s house only along paved
roads, and
2. The paving should cost as little as
possible.
Muddy city problem
Solution: Minimum spanning tree
Growing a MST
Problem:
Given a connected, undirected graph G = (V, E) with a weight function w:E → R, find
a minimum spanning tree for G

A greedy strategy:
Grow the minimum spanning tree one edge at a time

● Kruskal’s algorithm
● Prim’s algorithm
Growing a MST

Kruskal’s algorithm: Finds a safe edge by finding, of all the edges that connect any
two trees in the forest, an edge of least weight.

Prim's algorithm: Adds to the tree A a light edge that connects A to an isolated vertex (a vertex on which no edge of A is incident)
Disjoint set
A group of sets where no item can be in more than one set.

A disjoint-set data structure maintains a collection S = {S1, S2, …, Sk} of disjoint


dynamic sets.

Example:

S= {{a}, {b}, {c,d}, {e, f, g, h}} is a disjoint-set.


Basic disjoint set operations
Make_set (x)
Creates a new set whose only member (and thus representative) is x. Since the sets
are disjoint, we require that x not already be in some other set.

Union (x, y)
Unites the dynamic sets that contain x and y, say Sx and Sy, into a new set that is
the union of these two sets.

Find_set(x)
Returns a pointer to the representative of the (unique) set containing x.
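A compact Python sketch of a disjoint-set (union-find) structure supporting the three operations above, using path compression and union by rank (an illustrative implementation, not from the slides):

```python
class DisjointSet:
    def __init__(self):
        self.parent = {}
        self.rank = {}

    def make_set(self, x):
        self.parent[x] = x      # x is its own representative
        self.rank[x] = 0

    def find_set(self, x):
        if self.parent[x] != x:
            self.parent[x] = self.find_set(self.parent[x])   # path compression
        return self.parent[x]

    def union(self, x, y):
        rx, ry = self.find_set(x), self.find_set(y)
        if rx == ry:
            return
        if self.rank[rx] < self.rank[ry]:      # union by rank
            rx, ry = ry, rx
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1
```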
Kruskal’s algorithm (using disjoint sets)
Kruskal's algorithm maintains the current forest as a collection of disjoint sets of vertices. Edges are examined in nondecreasing order of weight; an edge (u, v) is added to the MST only if Find-Set(u) ≠ Find-Set(v), i.e. it connects two different trees, and the two sets are then merged with Union(u, v). The Make-set, Union and Find-Set operations described above are exactly what the algorithm needs.
Example
Find a minimum spanning tree of the following graph (the weighted graph on vertices a–i shown in the figure; its edge weights can be read off the sorted list below).

Line 1: A = {}

Lines 2-3: Make-Set for every vertex: {a} {b} {c} {d} {e} {f} {g} {h} {i}

Line 4: sort the edges of G.E into nondecreasing order by weight w:

1 (h,g)
2 (g,f)
2 (i,c)
4 (a,b)
4 (c,f)
6 (i,g)
7 (h,i)
7 (c,d)
8 (b,c)
8 (a,h)
9 (d,e)
10 (e,f)
11 (b,h)
14 (d,f)
Example (trace of Kruskal's algorithm)

w  | Edge (u,v) | Disjoint sets after processing the edge   | A (MST edges so far)
1  | (h,g)      | {a} {b} {c} {d} {e} {f} {g,h} {i}         | {(h,g)}
2  | (g,f)      | {a} {b} {c} {d} {e} {f,g,h} {i}           | {(h,g),(g,f)}
2  | (i,c)      | {a} {b} {c,i} {d} {e} {f,g,h}             | {(h,g),(g,f),(c,i)}
4  | (a,b)      | {a,b} {c,i} {d} {e} {f,g,h}               | {(h,g),(g,f),(c,i),(a,b)}
4  | (c,f)      | {a,b} {c,f,g,h,i} {d} {e}                 | {(h,g),(g,f),(c,i),(a,b),(c,f)}
6  | (i,g)      | rejected: Find-Set(i) == Find-Set(g)      | unchanged
7  | (h,i)      | rejected: Find-Set(h) == Find-Set(i)      | unchanged
7  | (c,d)      | {a,b} {c,d,f,g,h,i} {e}                   | {(h,g),(g,f),(c,i),(a,b),(c,f),(c,d)}
8  | (b,c)      | {a,b,c,d,f,g,h,i} {e}                     | {(h,g),(g,f),(c,i),(a,b),(c,f),(c,d),(b,c)}
8  | (a,h)      | rejected: Find-Set(a) == Find-Set(h)      | unchanged
9  | (d,e)      | {a,b,c,d,e,f,g,h,i}                       | {(h,g),(g,f),(c,i),(a,b),(c,f),(c,d),(b,c),(d,e)}
10 | (e,f)      | rejected: Find-Set(e) == Find-Set(f)      | unchanged
11 | (b,h)      | rejected: Find-Set(b) == Find-Set(h)      | unchanged
14 | (d,f)      | rejected: Find-Set(d) == Find-Set(f)      | unchanged
Example
The MST is the tree containing the edges in A:

A = {(h,g),(g,f),(c,i),(a,b),(c,f),(c,d),(b,c),(d,e)}

The figure highlights these eight edges in the original graph; their total weight is 1 + 2 + 2 + 4 + 4 + 7 + 8 + 9 = 37.
Analysis of Kruskal's algorithm
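The line numbers below refer to the standard Kruskal pseudocode, which appears as an image on the original slide. A minimal Python sketch of that pseudocode (reusing the DisjointSet sketch from earlier; names are illustrative), with comments marking the corresponding lines:

```python
def kruskal(vertices, edges):
    """edges: list of (weight, u, v) tuples. Returns the MST edge set A."""
    A = []                                    # line 1: A = {}
    ds = DisjointSet()
    for v in vertices:                        # lines 2-3: Make-Set(v) for every vertex
        ds.make_set(v)
    edges = sorted(edges)                     # line 4: sort E into nondecreasing order by weight
    for w, u, v in edges:                     # line 5: consider the edges in that order
        if ds.find_set(u) != ds.find_set(v):  # line 6: does (u, v) connect two different trees?
            A.append((u, v))                  # line 7: add the safe edge to A
            ds.union(u, v)                    # line 8: merge the two trees
    return A
```

On the nine-vertex example above, this collects the same eight edges as the trace table; the sort on line 4 dominates, which is where the O(E log E) bound below comes from.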
Line 1: O(1)

Line 2 - 3: O( |V| )

Line 4: Sorting: O( |E| log |E| )

Line 5: O( |E| )

Line 6 - 8: Depends on the implementation of the Find-Set and Union operations. With union by rank and path compression they are effectively constant time (amortized), so we treat them as O(1).
Overall complexity = O(1 + V + E log E + E)
= O (E log E)
Prim’s algorithm
Grows a single tree and adds a light edge (edge with the lowest weight) in
each iteration

Steps:

1. Start by picking any vertex to be the root of the tree.


2. While the tree does not contain all vertices in the graph, find the shortest (lightest) edge leaving the tree and add it to the tree
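The worked examples and the analysis later in this section refer to vertex keys, a min-priority queue Q, adjacency lists G.Adj[u], and the line numbers of the standard Prim pseudocode, which appears as an image on the slides. A minimal Python sketch of that pseudocode (illustrative names; a binary heap via heapq, with decrease-key emulated by lazy insertion):

```python
import heapq

def prim(graph, r):
    """graph: dict mapping each vertex to a dict {neighbour: edge weight}.
    Returns parent, where the edges (parent[v], v) form an MST rooted at r
    (assumes a connected, undirected graph)."""
    key = {u: float('inf') for u in graph}        # lines 1-2: u.key = ∞
    parent = {u: None for u in graph}             # line 3: u.π = NIL
    key[r] = 0                                    # line 4: r.key = 0
    in_queue = set(graph)                         # line 5: Q = G.V
    heap = [(0, r)]                               # binary min-heap on key values
    while in_queue:                               # line 6: while Q ≠ ∅
        k, u = heapq.heappop(heap)                # line 7: u = Extract-Min(Q)
        if u not in in_queue or k != key[u]:
            continue                              # stale heap entry, skip it
        in_queue.remove(u)
        for v, w in graph[u].items():             # line 8: for each v ∈ G.Adj[u]
            if v in in_queue and w < key[v]:      # line 9: v ∈ Q and w(u, v) < v.key
                parent[v] = u                     # line 10: v.π = u
                key[v] = w                        # line 11: v.key = w(u, v)
                heapq.heappush(heap, (w, v))      # decrease-key via lazy insertion
    return parent
```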
Prim's algorithm
Example: Find a minimum spanning tree of the graph shown on the slides (vertices A–F).

Step 1: Pick any vertex to be the root of the tree; let's say A will be the root.

Step 2: Repeatedly find the shortest edge leaving the tree and add it to the tree. The slides show this step five times, adding one edge (and one new vertex) per iteration, until all six vertices are in the tree; the final slide shows the resulting spanning tree.
Prim's algorithm
Example
Find a minimum spanning tree of the following graph: the same weighted graph on vertices a–i used in the Kruskal example.

Initially every vertex has key ∞ except the root a, whose key is 0, and the min-priority queue holds all vertices: Q = {a, b, c, d, e, f, g, h, i}.

The slides then extract one vertex u per iteration and update the keys of its neighbours still in Q whenever w(u, v) < v.key:

u extracted | G.Adj[u]     | Key updates
a           | {b, h}       | b.key = 4, h.key = 8
b           | {a, c, h}    | c.key = 8 (a is no longer in Q; w(b,h) = 11 does not improve h.key)
h           | {a, b, g, i} | g.key = 1, i.key = 7
g           | {f, h, i}    | f.key = 2, i.key = 6
f           | {c, d, e, g} | c.key = 4, d.key = 14, e.key = 10
c           | {b, d, f, i} | d.key = 7, i.key = 2
i           | {b, c, g, h} | no updates
d           | {c, e, f}    | e.key = 9
e           | {d, f}       | no updates; Q is now empty
Example
Every vertex has been extracted from Q, so the algorithm terminates. Each vertex other than the root a joins the tree through the edge that set its final key, so the minimum spanning tree consists of the edges (a,b), (a,h), (h,g), (g,f), (f,c), (c,i), (c,d) and (d,e), with total weight 4 + 8 + 1 + 2 + 4 + 2 + 7 + 9 = 37. The final slide shows this minimum spanning tree highlighted in the graph.
Analysis of Prim's algorithm

Line 1 - 3: O( |V| )

Line 4: O(1)

Line 5 - 11: Depends on how we implement the min-priority queue Q
Analysis of Prim's algorithm
● Depends on how we implement the min-priority queue Q
● If we implement Q as a binary min-heap

Line 1 - 3: O( |V| )

Line 4: O(1)

Line 5: O( |V| ) to build the heap

Line 6: The while loop executes |V| times.

Line 7: Extract-Min takes O(lg V), so the total time for all calls to Extract-Min is O(V lg V).

Line 8: The for loop executes O(E) times in total, since the sum of the lengths of all adjacency lists is 2|E|.

Line 11: It decreases the key of a node in the min-heap, so the heap needs to be adjusted, which a binary min-heap supports in O(lg V) time; over all O(E) iterations this is O(E lg V).
Analysis of Prim’s algorithm
● Depends on how we implement the min-priority queue Q
● If we implement Q as a binary min-heap
Line 1 - 3: O( |V| )

Line 4: O(1)

Line 5: O(|V|)
Line 6: O(|V|).

Line 7: O(V lg V)

Line 8: O(E)

Line 11: O(E lg V)

Total time for Prim’s algo = O(V lg V + E lg V) = O(E lg V)


Analysis of Prim’s algorithm
● Depends on how we implement the min-priority queue Q
● If we implement Q as a Fibonacci heap
Line 1 - 3: O( |V| )

Line 4-5: O(1)

Line 6: O(|V|).

Line 7: O(V log V)

Line 8: O(E)

Line 11: O(E)

Total time for Prim’s algo = O(V lg V + E)


Shortest path algorithms
Shortest path algorithm
Finds the shortest path between two vertices in a graph

Applications:
● Finding the shortest path from one location to another in Google Maps,
MapQuest, OpenStreetMap, (KTM Public Route) etc.
● Used by Telephone networks, Cellular networks for routing/connection in
communication
● IP routing
● Word ladder problem
Single-source shortest path problem
The problem of finding shortest paths from a source vertex v to all other vertices in
the graph.

Optimal substructure of a shortest path

Shortest-path algorithms typically rely on the property that a shortest path between
two vertices contains other shortest paths within it.
Dijkstra's algorithm
A solution to the single-source shortest path problem in graph theory.

● Input: Weighted graph G = (V, E) and source vertex v ∈ V, such that all edge
weights are nonnegative
● Output: Lengths of shortest paths (or the shortest paths themselves) from a
given source vertex v ∈ V to all other vertices
Dijkstra’s shortest path algorithm
Steps:
1. Insert the first vertex into the tree
2. From every vertex already in the tree, examine the total path length to all adjacent vertices not in the tree. Select the edge with the minimum total path weight and insert it into the tree
3. Repeat step 2 until all vertices are in the tree
Dijkstra’s shortest path algorithm

# Q is a min-priority queue
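The pseudocode on the original slide is an image; a minimal Python sketch of Dijkstra's algorithm using heapq as the min-priority queue Q (illustrative names; the example graph below has the edge weights read off the walkthrough that follows):

```python
import heapq

def dijkstra(graph, s):
    """graph: dict mapping each vertex to a dict {neighbour: edge weight >= 0}.
    Returns d, the shortest-path distance from s to every vertex."""
    d = {v: float('inf') for v in graph}
    d[s] = 0
    pq = [(0, s)]                       # Q, keyed on the current distance d[v]
    done = set()
    while pq:
        _, u = heapq.heappop(pq)        # extract the vertex with the smallest d
        if u in done:
            continue                    # skip stale queue entries
        done.add(u)
        for v, w in graph[u].items():   # relax every edge leaving u
            if d[u] + w < d[v]:
                d[v] = d[u] + w
                heapq.heappush(pq, (d[v], v))
    return d

# The example graph used in the walkthrough below:
g = {'s': {'t': 10, 'y': 5}, 't': {'x': 1, 'y': 2}, 'y': {'t': 3, 'x': 9, 'z': 2},
     'x': {'z': 4}, 'z': {'x': 6, 's': 7}}
print(dijkstra(g, 's'))   # {'s': 0, 't': 8, 'y': 5, 'x': 9, 'z': 7}
```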
Dijkstra's shortest path algorithm: Example
Find the shortest path distance from s to all other vertices in the graph shown on the slides (vertices s, t, x, y, z, with the edge weights in the figure).

Initially d(s) = 0 and d(t) = d(x) = d(y) = d(z) = ∞, with Q = {s, t, x, y, z}. The slides then process one vertex v per step, updating d(u) for every neighbour u whenever d(v) + e(v, u) improves it:

v processed | Neighbours of v | Distance updates
s           | {t, y}          | d(t) = 10, d(y) = 5
y           | {t, x, z}       | d(t) = 8, d(x) = 14, d(z) = 7
z           | {x, s}          | d(x) = 13; d(s) is not improved
t           | {x, y}          | d(x) = 9; d(y) is not improved
x           | -               | no further updates; Q is now empty

Final distances from s: d(s) = 0, d(t) = 8, d(x) = 9, d(y) = 5, d(z) = 7.
Running time
● Depends on the implementation
● The simplest implementation stores the vertices in an array or linked list. This gives a running time of O( |V|² + |E| )
● For sparse graphs (graphs with many nodes but few edges), the algorithm can be implemented more efficiently by storing the graph as an adjacency list and using a binary heap as the min-priority queue. This gives a running time of O( ( |E| + |V| ) log |V| )
A* search algorithm
A* (pronounced ‘A-star’) is a search algorithm that finds the shortest path between
some nodes S and T in a graph

It is a generalization of Dijkstra's algorithm that cuts down on the size of the


subgraph that must be explored using a heuristic function

Suppose we want to get to node T, and we are currently at node v. Informally, a heuristic function h(v) is a function that 'estimates' how far v is from T
A* search algorithm

Dijkstra’s algorithm is a special case of A*, when we set h(v) = 0 for all v
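A minimal Python sketch of A* under this idea (illustrative names, not a definitive implementation): it is Dijkstra's algorithm with the priority of a vertex changed from d(v) to d(v) + h(v).

```python
import heapq

def a_star(graph, h, s, t):
    """graph: dict {u: {v: weight}}; h(v): heuristic estimate of the distance
    from v to t. An admissible/consistent h gives the optimal answer; with
    h(v) = 0 for every v this is exactly Dijkstra's algorithm."""
    d = {v: float('inf') for v in graph}
    d[s] = 0
    pq = [(h(s), s)]                     # priority = d(v) + h(v)
    done = set()
    while pq:
        _, u = heapq.heappop(pq)
        if u in done:
            continue                     # stale entry
        done.add(u)
        if u == t:
            return d[t]                  # t settled: d[t] is final (consistent h)
        for v, w in graph[u].items():
            if d[u] + w < d[v]:
                d[v] = d[u] + w
                heapq.heappush(pq, (d[v] + h(v), v))
    return d[t]                          # inf if t is unreachable
```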
Flow networks
Flow network
● A directed connected graph G = (V, E), where each edge e is associated with
its capacity c(e) > 0.
● If E contains an edge (u, v), then there is no edge (v, u) in the reverse direction
● We distinguish two vertices: a source s and a sink t (s ≠ t)
(The slide shows an example flow network on vertices s, a, b, c, d, t, with a capacity labelled on every edge.)
Flow network
A flow in G is a real-valued function f : V x V → R that satisfies the following two properties:

Capacity constraint: The flow on edge e does not exceed its capacity c(e).
That is, for all u, v ∈ V, we require 0 ≤ f(u, v) ≤ c(u, v)

Flow conservation: For every node u ≠ s, t (nodes other than s and t), the incoming flow is equal to the outgoing flow.
That is, for all u ∈ V - {s, t}, we require Σ_{v ∈ V} f(v, u) = Σ_{v ∈ V} f(u, v)
Network flow problem (maximum-flow)
Given: A flow network G with source s and sink t.

Problem: Find a flow of maximum value


Network flow problem
Alternative formulation: Minimum cut

● Remove some edges from the graph such that after removing them, there is no path from s to t
● The cost of removing an edge e is equal to its capacity c(e)
● The minimum cut problem is to find a cut of minimum total cost
Terminologies
Residual capacity of an edge

(The slide shows an edge a → b labelled 6 / 12, i.e. flow 6 and capacity 12.)

Formally, given a flow network G = (V, E) with capacity c and flow f, the residual capacity cf(u, v), where u, v ∈ V, is defined by

cf(u, v) = c(u, v) - f(u, v)   if (u, v) ∈ E
cf(u, v) = f(v, u)             if (v, u) ∈ E
cf(u, v) = 0                   otherwise
Terminologies
A residual graph/network contains all the edges that have strictly positive residual capacity.

Formally, given a flow network G = (V, E) and a flow f, the residual network of G induced by f is Gf = (V, Ef), where

Ef = { (u, v) ∈ V x V : cf(u, v) > 0 }

The edges in Ef are either edges in E or their reversals.
Terminologies
Given a flow network G and a flow f, the residual network Gf is constructed by placing

1. Edges with residual capacity cf(u, v) = c(u, v) - f(u, v)
2. Edges with residual capacity cf(v, u) = f(u, v) (representing a possible decrease of the flow)

Example: an edge a → b carrying flow 5 with capacity 12 in the flow network corresponds, in the induced residual network, to a forward edge a → b with residual capacity 7 and a reverse edge b → a with residual capacity 5.
Terminologies
Augmenting path

Given a flow network G = (V, E) and a flow f, an augmenting path p is a simple path
from s to t in the residual network Gf.

By the definition of the residual network, we may increase the flow on an edge (u,v)
of an augmenting path by up to cf(u, v) without violating the capacity constraint on
whichever of (u, v) and (v, u) is in the original flow network G.

If there are augmenting paths, then the flow is not maximum yet.
Terminologies
Cuts of flow networks

A cut (S, T) of a flow network G = (V, E) is a partition of V into S and T = V - S such that s ∈ S and t ∈ T.

If f is a flow, then the net flow f(S, T) across the cut (S, T) is

f(S, T) = Σ_{u ∈ S} Σ_{v ∈ T} f(u, v) - Σ_{u ∈ S} Σ_{v ∈ T} f(v, u)

The capacity of the cut (S, T) is

c(S, T) = Σ_{u ∈ S} Σ_{v ∈ T} c(u, v)

A minimum cut of a network is a cut whose capacity is minimum over all cuts of the network.
Terminologies
Cuts of flow networks

Example (see the figure): here the cut is ({s, v1, v2}, {v3, v4, t}).

Net flow = f(v1, v3) + f(v2, v4) - f(v3, v2)
         = 12 + 11 - 4
         = 19

Capacity = c(v1, v3) + c(v2, v4) = 12 + 14 = 26
Max-flow min-cut theorem
In any flow network, the value of a maximum flow from s to t is equal to the capacity of a minimum cut separating s from t.
Ford-Fulkerson method
Main idea:
Find valid flow paths until there is none left, and add them up.

Steps:

1. Set ftotal= 0
2. Repeat until there is no path from s to t
a. Run DFS from s to find a flow path to t
b. Let f be the minimum capacity value on the path (aka bottleneck capacity)
c. Add f to ftotal
d. For each edge u → v on the path
i. Decrease c(u → v) by f
ii. Increase c(v → u) by f
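A Python sketch of the Ford-Fulkerson method as described above (illustrative names; the capacities in the usage example are read off the example slides). Capacities are kept in a residual-capacity dict that is updated in place, exactly as in steps d.i and d.ii, so after the call it is the residual network of the max flow.

```python
def ford_fulkerson(cap, s, t):
    """cap: dict of dicts, cap[u][v] = capacity of edge u -> v; every vertex
    must appear as a key (possibly with an empty dict). cap is mutated into
    the residual network. Returns the value of a maximum flow from s to t."""

    def dfs_path(u, visited):
        if u == t:
            return [t]
        visited.add(u)
        for v, c in cap[u].items():
            if c > 0 and v not in visited:      # only edges with residual capacity
                path = dfs_path(v, visited)
                if path:
                    return [u] + path
        return None

    total = 0                                   # 1. f_total = 0
    while True:
        path = dfs_path(s, set())               # 2.a DFS from s to find a flow path to t
        if path is None:
            break                               # no augmenting path left
        f = min(cap[u][v] for u, v in zip(path, path[1:]))   # 2.b bottleneck capacity
        total += f                              # 2.c add f to f_total
        for u, v in zip(path, path[1:]):        # 2.d update residual capacities
            cap[u][v] -= f
            cap[v][u] = cap[v].get(u, 0) + f
    return total

# Capacities as read from the example slides below:
cap = {'s': {'a': 16, 'c': 13}, 'a': {'b': 12}, 'b': {'c': 9, 't': 20},
       'c': {'a': 4, 'd': 14}, 'd': {'b': 7, 't': 4}, 't': {}}
print(ford_fulkerson(cap, 's', 't'))   # 23
```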
Example
Consider the flow network from the slides (source s, sink t, intermediate vertices a, b, c, d). The possible augmenting paths and their bottleneck (minimum residual) capacities are:

Augmenting path          | Min. capacity
s → a → b → t            | 12
s → c → d → t            | 4
s → c → a → b → t        | 4
s → c → d → b → t        | 7
s → a → b → c → d → t    | 4

If we choose the augmenting path s → a → b → t (bottleneck 12), the induced residual graph and the corresponding flow network are updated accordingly, and the remaining augmenting paths are s → c → d → t (4) and s → c → d → b → t (7).

Selected augmenting paths so far:   Flow
s → a → b → t                       12

If we next choose s → c → d → b → t (bottleneck 7), only s → c → d → t (4) remains.

Selected augmenting paths so far:   Flow
s → a → b → t                       12
s → c → d → b → t                   7

Finally we choose s → c → d → t (bottleneck 4). After this there is no path from s to t in the residual graph, so the flow is maximum.

Selected augmenting paths:          Flow
s → a → b → t                       12
s → c → d → b → t                   7
s → c → d → t                       4
Total flow                          23

The slides show the residual graph and the flow on each edge after every step; the final flow network carries 12/12 on a → b, 12/16 on s → a, 19/20 on b → t, 7/7 on d → b, 11/13 on s → c, 11/14 on c → d and 4/4 on d → t.
Computing minimum cut
● To compute minimum cut, we use the residual graph induced by the max flow
● Mark all nodes reachable from s
○ Call this set of reachable nodes A
● Now separate these nodes from the others
○ Cut edges from A to V-A
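A sketch of this procedure (illustrative names): after running the Ford-Fulkerson sketch above, the mutated capacity dict is the residual network, and an untouched copy of the input capacities gives the original edges.

```python
def min_cut(original, residual, s):
    """Return the list of original edges crossing the minimum cut (A, V - A)."""
    A, stack = {s}, [s]                  # mark all nodes reachable from s
    while stack:
        u = stack.pop()
        for v, c in residual[u].items():
            if c > 0 and v not in A:
                A.add(v)
                stack.append(v)
    # the cut edges go from A to V - A in the original network
    return [(u, v) for u in A for v, c in original[u].items()
            if c > 0 and v not in A]
```

On the example network this should give A = {s, a, c, d} and the cut edges (a, b), (d, b), (d, t), whose total capacity 12 + 7 + 4 = 23 matches the max flow, as the max-flow min-cut theorem guarantees.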
Computing minimum cut
Example: Compute the minimum cut in the flow network of the previous example (source s, sink t, intermediate vertices a, b, c, d).
Computing minimum cut: Example
In the residual graph induced by the max flow, the set of nodes reachable from s is

A = {s, a, c, d}   (the slide lists the reachable nodes a, c, d; s itself belongs to A as well)
V - A = {b, t}
Computing minimum cut: Example
The cut edges go from A to V - A: (a, b), (d, b) and (d, t). Their total capacity is 12 + 7 + 4 = 23, which equals the value of the max flow.
Analysis of Ford-Fulkerson
● The running time depends on how we find augmenting paths
● If we choose them poorly, the algorithm might take a very large number of iterations (and, with irrational capacities, might not even terminate)
● For example, consider the following network: edges s → A and s → B of capacity 100, edges A → t and B → t of capacity 100, and an edge A → B of capacity 1
Analysis of Ford-Fulkerson
If we alternately choose the augmenting paths s → A → B → t and s → B → A → t (the second one uses the reverse residual edge B → A), each augmentation pushes only 1 unit of flow, so reaching the maximum flow of 200 takes 200 iterations, even though choosing s → A → t and s → B → t directly would need only two.
Analysis of Ford-Fulkerson
If we always find the augmenting path using BFS, i.e. we always augment along a shortest path (the Edmonds-Karp variant), the algorithm runs in polynomial time regardless of the capacities.
Analysis of Ford-Fulkerson

If we perform DFS/BFS to find augmenting paths, it will take O(V+E) time to find an
augmenting path.

The while loop of lines 2-4 executes at most | f | times, since each iteration adds at least 1 unit of flow (assuming integer capacities).

Thus, total time of Ford-Fulkerson algorithm is O (|f| (V + E)).


Applications
● Airline scheduling
○ Managing flight crews by reusing them over multiple flights
● Network connectivity
○ Routing as many packets as possible on a given network
○ Finding min number of connections (edges) whose removal disconnects t from s
● Project selection
○ Choosing a feasible subset of projects to maximize revenue
● Bipartite matching etc.
○ Finding an assignment of jobs to applicants such that as many applicants as possible get jobs.
