
Graphs

Introduction
• A graph is an abstract data structure that implements the graph concept from mathematics. A graph is, basically, a collection of vertices (also called nodes) and edges that connect these vertices. A graph is often viewed as a generalization of the tree structure: instead of a purely parent-to-child relationship between tree nodes, any kind of complex relationship between the nodes can be represented.
Why are graphs useful?
• Graphs are widely used to model any situation where entities or things are
related to each other in pairs; for example, the following information can be
represented by graphs:
• Family trees in which the member nodes have an edge from parent to each of
their children.
• Transportation networks in which nodes are airports, intersections, ports, etc.
The edges can be airline flights, one-way roads, shipping routes, etc.
Definition

• A graph G is defined as an ordered set (V, E), where V(G) represents the set of vertices and E(G) represents the set of edges that connect these vertices.
A graph G = (V, E)
V = set of vertices
E = set of edges = subset of V × V

(Figure: an undirected graph on vertices A, B, C, D, E.)

• V(G) = {A, B, C, D, E}
• E(G) = {(A, B), (B, C), (A, D), (B, D), (D, E), (C, E)}
• The graph has 5 vertices (nodes) and 6 edges.
Graph Terminology

• Adjacent nodes or neighbors: For every edge e = (u, v) that connects nodes u and v, the nodes u and v are the end-points and are said to be adjacent nodes or neighbors.
• Degree of a node: The degree of a node u, deg(u), is the total number of edges containing the node u. If deg(u) = 0, then u does not belong to any edge, and such a node is known as an isolated node.
Graph Terminology

• Path: A path P of length n from a node u to a node v, written P = (v0, v1, v2, …, vn), is a sequence of (n+1) nodes where u = v0, v = vn, and vi-1 is adjacent to vi for i = 1, 2, 3, …, n.
• Closed path: A path P is known as a closed path if it begins and ends at the same node, that is, if v0 = vn.
• Simple path: A path P is known as a simple path if all the nodes in the path are distinct, with the exception that v0 may be equal to vn. If v0 = vn, then the path is called a closed simple path.
• Cycle: A closed simple path of length 3 or more is known as a cycle. A cycle of length k is called a k-cycle.
Graph Variations
● Variations:
■ Connected graph: a graph is connected if it has a path from every vertex to every other vertex.
■ Complete graph: A graph G is said to be complete if all its nodes are fully connected, that is, there is an edge between every pair of distinct nodes. A complete graph has n(n-1)/2 edges, where n is the number of nodes in G.
■ In an undirected graph:
○ Edge (u,v) = edge (v,u)
○ No self-loops
■ In a directed graph:
○ Edge (u,v) goes from vertex u to vertex v, notated u→v
Graph Variations

● More variations:
■ A weighted graph associates weights with either
the edges or the vertices
○ E.g., a road map: edges might be weighted with distance
■ A multigraph allows multiple edges between the
same vertices
○ E.g., the call graph in a program (a function can get
called from multiple points in another function)
Graph Terminology

(Figure: (a) a multigraph with edges e1 through e7, (b) a tree, and (c) a weighted graph with edge weights on vertices A through F.)
Directed Graph

• A directed graph G, also known as a digraph, is a graph in which every edge has a direction assigned to it. An edge of a directed graph is given as an ordered pair (u, v) of nodes in G. For an edge (u, v):

• The edge begins at u and terminates at v.

• u is known as the origin or initial point of e; correspondingly, v is known as the destination or terminal point of e.

• u is the predecessor of v; correspondingly, v is the successor of u. The nodes u and v are adjacent to each other.
Terminology of a Directed Graph

• Out-degree of a node: The out-degree of a node u, written outdeg(u), is the number of edges that originate at u.
• In-degree of a node: The in-degree of a node u, written indeg(u), is the number of edges that terminate at u.
• Degree of a node: The degree of a node, written deg(u), is the sum of its in-degree and out-degree. Therefore, deg(u) = indeg(u) + outdeg(u).
• Source: A node u is known as a source if it has a positive out-degree but an in-degree of 0.
• Sink: A node u is known as a sink if it has a positive in-degree but a zero out-degree.
• Reachability: A node v is said to be reachable from node u if and only if there exists a (directed) path from node u to node v.
Terminology of a Directed Graph

• Strongly connected digraph: A digraph is said to be strongly connected if and only if there exists a path between every pair of nodes in G. That is, if there is a path from node u to v, then there must also be a path from node v to u.
• Unilaterally connected digraph: A digraph is said to be unilaterally connected if for any pair of nodes u, v in G there is a path from u to v or a path from v to u (not necessarily both).
• Parallel/multiple edges: Distinct edges which connect the same end-points are called multiple edges. That is, e = (u, v) and e’ = (u, v) are known as multiple edges of G.
• Simple directed graph: A directed graph G is said to be a simple directed graph if and only if it has no parallel edges. However, a simple directed graph may contain cycles, with the exception that it cannot have more than one loop at a given node.
Graphs

● We will typically express running times in terms of |E| and |V| (often dropping the |’s)
■ If |E| ≈ |V|² the graph is dense
■ If |E| ≈ |V| the graph is sparse
● If you know you are dealing with dense or
sparse graphs, different data structures may
make sense
Representing Graphs

● Assume V = {1, 2, …, n}
● An adjacency matrix represents the graph as an n × n matrix A:
■ A[i, j] = 1 if edge (i, j) ∈ E (or the weight of the edge)
          = 0 if edge (i, j) ∉ E
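
As a concrete illustration (not part of the original slides), here is a minimal Python sketch that builds such a matrix; the 0-based vertex numbering and the example edge list are assumptions made for the example.

# Build an n x n adjacency matrix for a directed graph.
# Vertices are assumed to be numbered 0..n-1; 'edges' is a list of (i, j) pairs.
def adjacency_matrix(n, edges):
    A = [[0] * n for _ in range(n)]
    for (i, j) in edges:
        A[i][j] = 1          # store the edge weight here instead for a weighted graph
    return A

# Example: 4 vertices, edges (0,1), (0,2), (1,2), (3,2)
print(adjacency_matrix(4, [(0, 1), (0, 2), (1, 2), (3, 2)]))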
Graphs: Adjacency Matrix

● Example: a directed graph on vertices 1, 2, 3, 4 with edges (1,2), (1,3), (2,3) and (4,3) (figure). Its adjacency matrix is:

A    1   2   3   4
1    0   1   1   0
2    0   0   1   0
3    0   0   0   0
4    0   0   1   0
Graphs: Adjacency Matrix

● How much storage does the adjacency matrix require?
● A: O(V²)
● Undirected graph → matrix is symmetric
● No self-loops → don’t need the diagonal
Graphs: Adjacency Matrix

● The adjacency matrix is a dense representation
■ Usually too much storage for large graphs
■ But can be very efficient for small graphs
● Most large interesting graphs are sparse
■ For this reason the adjacency list is often a more appropriate representation
Graphs: Adjacency List

● Adjacency list: for each vertex v ∈ V, store a list of vertices adjacent to v
● Example (the same directed graph on vertices 1, 2, 3, 4):
■ Adj[1] = {2,3}
■ Adj[2] = {3}
■ Adj[3] = {}
■ Adj[4] = {3}
● Variation: can also keep a list of edges coming into each vertex
Graphs: Adjacency List
● So: Adjacency lists take O(V+E) storage
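
A minimal Python sketch of the adjacency-list representation (an illustration only; the dict-of-lists layout is one common choice, not the only one):

from collections import defaultdict

# Build an adjacency list: each vertex maps to the list of its neighbours.
def adjacency_list(edges):
    adj = defaultdict(list)
    for (u, v) in edges:
        adj[u].append(v)      # for an undirected graph, also append u to adj[v]
    return adj

# Example matching the slide: Adj[1] = {2,3}, Adj[2] = {3}, Adj[3] = {}, Adj[4] = {3}
adj = adjacency_list([(1, 2), (1, 3), (2, 3), (4, 3)])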
Graph Searching or Traversal

● Given: a graph G = (V, E), directed or undirected
● Goal: methodically explore every vertex and every edge
● Ultimately: build a tree on the graph
■ Pick a vertex as the root
■ Choose certain edges to produce a tree
■ Note: might also build a forest if the graph is not connected
Breadth-First Search

● “Explore” a graph, turning it into a tree
■ One vertex at a time
■ Expand the frontier of explored vertices across the breadth of the frontier
● Builds a tree over the graph
■ Pick a source vertex to be the root
■ Find (“discover”) its children, then their children, etc.
Breadth-First Search
● Will associate vertex “colors” to guide the
algorithm
■ White vertices have not been discovered
○ All vertices start out white
■ Grey vertices are discovered but not fully explored
○ They may be adjacent to white vertices
■ Black vertices are discovered and fully explored
○ They are adjacent only to black and grey vertices
● Explore vertices by scanning adjacency list of
grey vertices
Breadth-First Search
BFS(G, s) {
    initialize vertices;                  // all white, d = ∞, p = NIL
    Q = {s};                              // Q is a queue; initialize to s
    while (Q not empty) {
        u = Remove(Q);
        for each v ∈ adj[u] {
            if (color[v] == WHITE) {
                color[v] = GREY;
                d[v] = d[u] + 1;          // d[v]: distance (in edges) from s to v
                p[v] = u;                 // p[v]: parent of v in the breadth-first tree
                Enqueue(Q, v);
            }
        }
        color[u] = BLACK;
    }
}
Note: The notation color[v] indicates the value of the color attribute of node v. Similar notation is used throughout these slides on graphs.
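
The pseudocode above can be turned into the following Python sketch (an illustration, not the original course code); it assumes the graph is given as a dict that maps every vertex, including sinks, to a list of its neighbours.

from collections import deque

def bfs(adj, s):
    # d[v] = number of edges on a shortest path from s to v
    # p[v] = predecessor of v in the breadth-first tree
    color = {v: 'WHITE' for v in adj}
    d = {v: float('inf') for v in adj}
    p = {v: None for v in adj}
    color[s] = 'GREY'
    d[s] = 0
    Q = deque([s])
    while Q:
        u = Q.popleft()
        for v in adj[u]:
            if color[v] == 'WHITE':   # v is discovered for the first time
                color[v] = 'GREY'
                d[v] = d[u] + 1
                p[v] = u
                Q.append(v)
        color[u] = 'BLACK'            # u is now fully explored
    return d, p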
Breadth-First Search: Example

(Figure, animation: BFS run from source s on a graph with vertices r, s, t, u in the top row and v, w, x, y in the bottom row. All distances start at ∞ except d[s] = 0, and are filled in as vertices are discovered.)

Queue contents after each step:
Q: s
Q: w r        (d[w] = d[r] = 1)
Q: r t x      (d[t] = d[x] = 2)
Q: t x v      (d[v] = 2)
Q: x v u      (d[u] = 3)
Q: v u y      (d[y] = 3)
Q: u y
Q: y
Q: Ø

Final distances: d[s] = 0; d[r] = d[w] = 1; d[t] = d[v] = d[x] = 2; d[u] = d[y] = 3.
Breadth-First Search: Properties

● BFS calculates the shortest-path distance from the source node
■ Shortest-path distance δ(s,v) = minimum number of edges from s to v, or ∞ if v is not reachable from s
● BFS builds a breadth-first tree, in which paths to the root represent shortest paths in G
Depth-First Search

● Depth-first search is another strategy for exploring a graph
■ Explore “deeper” in the graph whenever possible
■ Edges are explored out of the most recently discovered vertex v that still has unexplored edges
■ When all of v’s edges have been explored, backtrack to the vertex from which v was discovered
Depth-First Search

● Vertices are initially colored white
● Then colored grey when discovered
● Then black when finished
Depth-First Search: The Code
DFS(G)
{
    for each vertex u ∈ V[G]
    {
        color[u] = WHITE;
    }
    time = 0;
    for each vertex u ∈ V[G]
    {
        if (color[u] == WHITE)
            DFS_Visit(u);
    }
}

DFS_Visit(u)
{
    color[u] = GREY;
    time = time + 1;
    d[u] = time;
    for each v ∈ Adj[u]
    {
        if (color[v] == WHITE)
            DFS_Visit(v);
    }
    color[u] = BLACK;
    time = time + 1;
    f[u] = time;
}
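
A runnable Python version of the same procedure (a sketch for illustration; the dict-of-lists graph representation is assumed):

def dfs(adj):
    # adj maps every vertex to a list of its neighbours.
    # d[u] / f[u] are the discovery and finishing times.
    color = {u: 'WHITE' for u in adj}
    d, f = {}, {}
    time = 0

    def dfs_visit(u):
        nonlocal time
        color[u] = 'GREY'
        time += 1
        d[u] = time
        for v in adj[u]:
            if color[v] == 'WHITE':
                dfs_visit(v)
        color[u] = 'BLACK'
        time += 1
        f[u] = time

    for u in adj:
        if color[u] == 'WHITE':
            dfs_visit(u)
    return d, f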
DFS Example

(Figure, animation: DFS run from a source vertex on an eight-vertex graph. Each vertex is labelled with its discovery time d and finishing time f, shown as d | f; the times 1 through 16 are assigned in the order in which vertices are discovered and finished, starting with the source vertex at time 1.)
Minimum Spanning Trees
Spanning Tree

● A spanning tree of a graph G is a set of |V| - 1 edges that connect all vertices of the graph. Thus a spanning tree for G is a subgraph of G, T = (V’, E’), with the following properties:
■ V’ = V
■ T is connected
■ T is acyclic
Problem: Laying Telephone Wire

(Figure: a central office and a set of customer locations to be wired.)

Wiring: Naïve Approach

(Figure: a naïve wiring from the central office; expensive!)

Wiring: Better Approach

(Figure: a better wiring; minimize the total length of wire connecting the customers.)

Minimum Spanning Tree

● Problem: given a connected, undirected, weighted graph, find a spanning tree using edges that minimize the total weight
(Figure: an example graph with edge weights 2, 3, 4, 5, 6, 8, 9, 10, 14, 15.)
Minimum Spanning Tree

● Which edges form the minimum spanning tree (MST) of the graph below?
(Figure: the same edge weights on a graph with vertices A, B, C, D, E, F, G, H.)

Minimum Spanning Tree

● Answer:
(Figure: the MST edges of that graph are highlighted.)
Generic MST

A = ∅
while A does not form a spanning tree
    do find an edge (u, v) that is safe for A
       A = A ∪ {(u, v)}
return A
Kruskal’s Algorithm
Kruskal()
{
    T = ∅;
    sort E by increasing edge weight w
    for each (u,v) ∈ E (in sorted order)
        if (adding (u,v) does not form a cycle)
            T = T ∪ {{u,v}};
}
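
A short Python sketch of Kruskal’s algorithm (for illustration only); the cycle test uses a simple union-find structure, which the slides do not specify, so that part is an assumption.

def kruskal(vertices, edges):
    # edges is a list of (weight, u, v) tuples; returns the list of MST edges.
    parent = {v: v for v in vertices}

    def find(x):                      # union-find root with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    T = []
    for w, u, v in sorted(edges):     # consider edges by increasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                  # adding (u, v) does not form a cycle
            parent[ru] = rv           # union the two components
            T.append((u, v, w))
    return T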
Kruskal’s Algorithm

Run the algorithm:
(Figure, animation: an example graph with edge weights 1, 2, 5, 8, 9, 13, 14, 17, 19, 21, 25. The edges are examined in increasing order of weight, and each edge is added to T unless it would create a cycle with the edges already chosen.)
Prim’s Algorithm
MST-Prim(G, w, r)
    Q = V[G];
    for each u ∈ Q
        key[u] = ∞;
    key[r] = 0;
    p[r] = NULL;
    while (Q not empty)
        u = ExtractMin(Q);
        for each v ∈ Adj[u]
            if (v ∈ Q and w(u,v) < key[v])
                p[v] = u;
                key[v] = w(u,v);
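
For reference, a Python sketch of MST-Prim (an illustration, not the original course code); a binary heap with lazy deletion stands in for the ExtractMin / DecreaseKey queue, and the dict-of-lists weighted-graph representation is assumed.

import heapq

def mst_prim(adj, r):
    # adj maps u to a list of (v, w) pairs for an undirected weighted graph.
    # Returns parent pointers p defining the minimum spanning tree rooted at r.
    key = {u: float('inf') for u in adj}
    p = {u: None for u in adj}
    in_tree = set()
    key[r] = 0
    pq = [(0, r)]                          # (key, vertex) min-heap plays the role of Q
    while pq:
        k, u = heapq.heappop(pq)
        if u in in_tree:
            continue                       # stale entry: u was already extracted
        in_tree.add(u)
        for v, w in adj[u]:
            if v not in in_tree and w < key[v]:
                key[v] = w                 # lazy DecreaseKey: push a better entry
                p[v] = u
                heapq.heappush(pq, (w, v))
    return p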
Prim’s Algorithm

Run on the example graph: pick a start vertex r, set key[r] = 0 and every other key to ∞.
(Figure, animation: red vertices have been removed from Q and red arrows indicate parent pointers. At each step the vertex u with the minimum key is extracted from Q, and for every neighbour v of u still in Q the key and parent pointer of v are updated whenever w(u,v) < key[v].)
Simplified Algorithm

1. Initialize a tree with a single vertex, chosen arbitrarily from the graph.
2. Grow the tree by one edge: of the edges that connect the tree to vertices not yet in the tree, find the minimum-weight edge and transfer it to the tree.
3. Repeat step 2 (until all vertices are in the tree).
Example

● Use Kruskal’s algorithm to find a minimum spanning tree in the following weighted graph. Use alphabetical order to break ties.
(Figure: a weighted graph on vertices a, b, c, d, e, z.)
Example (contd.)
● Solution: Kruskal’s algorithm will proceed as follows.
■ First we add edge {d, e} of weight 1.
■ Next, we add edge {c, e} of weight 2.
■ Next, we add edge {d, z} of weight 2.
■ Next, we add edge {b, e} of weight 3.
■ And finally, we add edge {a, b} of weight 2.
■ This produces a minimum spanning tree of weight 10; the resulting minimum spanning tree is shown in the figure.
Example – Find MST using Prim’s
Algorithm (r = D)
Difference between Kruskal’s and
Prim’s Algorithm
● Prim’s algorithm initializes with a node, whereas Kruskal’s algorithm initializes with an edge.
● Kruskal’s algorithm builds a minimum spanning tree by adding one edge at a time. The next edge added is always the shortest (minimum-weight) edge that does NOT create a cycle. Prim’s builds a minimum spanning tree by adding one vertex at a time. The next vertex added is always the one nearest to a vertex already in the tree.
● In Prim’s algorithm, a tree is maintained at all stages, while in Kruskal’s algorithm a forest is maintained.
Single-Source Shortest Path

Dijkstra’s Algorithm
Shortest Paths Problems
(Figure: a weighted directed graph on vertices u, v, w, x, y, z, t.)

● Given a weighted directed graph,
<u,v,t,x,z> is a path of weight 29 from u to z.
<u,v,w,x,y,z> is another path from u to z; it has weight 16 and is the shortest path from u to z.
Variants of Shortest Paths
Problems
A. Single-pair shortest path problem
○ Given s and d, find a shortest path from s to d.
B. Single-source shortest paths problem
○ Given s, for each d find a shortest path from s to d. (Dijkstra’s algorithm)
C. All-pairs shortest paths problem
○ For each ordered pair s, d, find a shortest path. (Floyd-Warshall algorithm)
● (A) and (B) seem to have the same asymptotic complexity.
Single-Source Shortest Path

● Problem: given a weighted directed graph G, find the minimum-weight path from a given source vertex s to another vertex v
■ “Shortest path” = minimum-weight path
■ Weight of a path is the sum of its edge weights
■ E.g., a road map: what is the shortest path from Chapel Hill to Charlottesville?
Relaxation

● A key technique in shortest path algorithms is relaxation
■ Idea: for all v, maintain an upper bound d[v] on δ(s,v)

Relax(u,v,w) {
    if (d[v] > d[u]+w) then d[v] = d[u]+w;
}

(Figure: relaxing an edge of weight 2 out of a vertex with d = 5. If d[v] = 9 it is lowered to 7; if d[v] = 6 it is left unchanged.)
Dijkstra’s Algorithm
Dijkstra(G)
    for each v ∈ V
        d[v] = ∞;
    d[s] = 0; S = ∅; Q = V;
    while (Q ≠ ∅)
        u = ExtractMin(Q);
        S = S ∪ {u};
        for each v ∈ Adj[u]
            if (d[v] > d[u] + w(u,v))
                d[v] = d[u] + w(u,v);    // relaxation step
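
A Python sketch of Dijkstra’s algorithm (illustration only); a binary heap with lazy deletion replaces ExtractMin, and the dict-of-lists weighted-graph representation with non-negative weights is assumed.

import heapq

def dijkstra(adj, s):
    # adj maps u to a list of (v, w) pairs; returns shortest-path estimates d.
    d = {v: float('inf') for v in adj}
    d[s] = 0
    S = set()                              # finalized vertices
    pq = [(0, s)]
    while pq:
        du, u = heapq.heappop(pq)
        if u in S:
            continue                       # stale heap entry
        S.add(u)
        for v, w in adj[u]:
            if d[v] > d[u] + w:            # relaxation step
                d[v] = d[u] + w
                heapq.heappush(pq, (d[v], v))
    return d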
Example

(Figure, animation: Dijkstra’s algorithm run from source s on a weighted directed graph with vertices s, u, v, x, y. All estimates start at ∞ except d[s] = 0; at each step the unfinalized vertex with the smallest estimate is extracted and its outgoing edges are relaxed. The final shortest-path estimates are d[x] = 5, d[y] = 7, d[u] = 8, d[v] = 9.)
Example

B
10 2
source A 4 3 D
5 1
C
Ex: run the algorithm
All Pairs Shortest Path

Floyd-Warshall Algorithm
Intermediate Vertices
Without loss of generality, we will assume that V = {1, 2, …, n}, i.e., that the vertices of the graph are numbered from 1 to n.

Given a path p = (v1, v2, …, vm) in the graph, we will call the vertices vk with k in {2, …, m-1} the intermediate vertices of p.
Intermediate Vertices
Consider a shortest path p from i to j such that the intermediate vertices are from the set {1, …, k}.
● If the vertex k is not an intermediate vertex on p, then dij(k) = dij(k-1)
● If the vertex k is an intermediate vertex on p, then dij(k) = dik(k-1) + dkj(k-1)
In either case, the subpaths contain only nodes from {1, …, k-1}.
Shortest Path
Therefore, we can conclude that

dij(k) = min{ dij(k-1), dik(k-1) + dkj(k-1) }

where the intermediate vertices of the paths considered are in {1, 2, …, k}.
Recursive Formulation
If we do not use intermediate nodes, i.e., when k = 0, then
dij(0) = wij
If k > 0, then
dij(k) = min{ dij(k-1), dik(k-1) + dkj(k-1) }
Floyd-Warshall Algorithm

Input: Digraph G = (V, E), where |V| = n, with edge-weight function w : E → W.
We assume that the input is represented by a weight matrix W = (wij), i, j ∈ V, defined by
    wij = 0        if i = j
    wij = w(i, j)  if i ≠ j and (i, j) ∈ E
    wij = ∞        if i ≠ j and (i, j) ∉ E

Output: an n × n matrix of shortest-path lengths δ(i, j) for all i, j ∈ V.
If the graph has n vertices, we return a distance matrix (dij), where dij is the length of the shortest path from i to j.
The Floyd-Warshall Algorithm
Floyd-Warshall(W)
    n = number of rows of W;
    D(0) = W;
    for k = 1 to n do
        for i = 1 to n do
            for j = 1 to n do
                dij(k) = min{ dij(k-1), dik(k-1) + dkj(k-1) };
    return D(n);
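
The pseudocode translates directly into Python (a sketch for illustration; the matrix is assumed to be a list of lists indexed from 0, with float('inf') for missing edges).

def floyd_warshall(W):
    # W[i][j] = 0 if i == j, w(i, j) if the edge exists, infinity otherwise.
    n = len(W)
    D = [row[:] for row in W]              # D(0) = W
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if D[i][k] + D[k][j] < D[i][j]:
                    D[i][j] = D[i][k] + D[k][j]
    return D

# The worked example below: vertices 1, 2, 3 become indices 0, 1, 2.
INF = float('inf')
print(floyd_warshall([[0, 8, 5], [3, 0, INF], [INF, 2, 0]]))
# [[0, 7, 5], [3, 0, 8], [5, 2, 0]]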
Example

(Figure: a digraph on vertices 1, 2, 3 with edges 1→2 of weight 8, 1→3 of weight 5, 2→1 of weight 3, 3→2 of weight 2.)

D(0)   1   2   3
 1     0   8   5
 2     3   0   ∞
 3     ∞   2   0

● At each step k, dij satisfies the criteria:
■ the path begins at i and ends at j
■ intermediate vertices on the path come from the set {1, 2, …, k}
Example

D(0)   1   2   3
 1     0   8   5
 2     3   0   ∞
 3     ∞   2   0

● Step 1: consider all paths which contain 1 as an intermediate vertex
■ 2,1,3 is the only such path
■ dij(1) = min{ dij(0), di1(0) + d1j(0) }
■ d23(1) = min{ d23(0), d21(0) + d13(0) }
Example

D(1)   1   2   3
 1     0   8   5
 2     3   0   8
 3     ∞   2   0

● Step 1 (result): d23(1) = min{ d23(0), d21(0) + d13(0) } = min{ ∞, 3 + 5 } = 8
Example

D(1)   1   2   3
 1     0   8   5
 2     3   0   8
 3     ∞   2   0

● Step 2: consider all paths which contain 2 as an intermediate vertex
■ 3,2,1 is the only such path
■ dij(2) = min{ dij(1), di2(1) + d2j(1) }
■ d31(2) = min{ d31(1), d32(1) + d21(1) }
Example

D(2)   1   2   3
 1     0   8   5
 2     3   0   8
 3     5   2   0

● Step 2 (result): d31(2) = min{ d31(1), d32(1) + d21(1) } = min{ ∞, 2 + 3 } = 5
Example

D(2)   1   2   3
 1     0   8   5
 2     3   0   8
 3     5   2   0

● Step 3: consider all paths which contain 3 as an intermediate vertex
■ 1,3,2 is the only such path
■ dij(3) = min{ dij(2), di3(2) + d3j(2) }
■ d12(3) = min{ d12(2), d13(2) + d32(2) }
Example

D(3)   1   2   3
 1     0   7   5
 2     3   0   8
 3     5   2   0

● Step 3 (result): d12(3) = min{ d12(2), d13(2) + d32(2) } = min{ 8, 5 + 2 } = 7
Example

D(3)   1   2   3
 1     0   7   5
 2     3   0   8
 3     5   2   0

Final Result

● We have obtained the value of an optimal solution: the matrix of shortest-path lengths.
● To obtain the actual solution, i.e. the shortest path between each pair of vertices, use the predecessor matrix.
Predecessor Matrix

● pij(0) = NIL  if i = j or wij = ∞
           i    if i ≠ j and wij < ∞

● pij(k) = pij(k-1)  if dij(k-1) ≤ dik(k-1) + dkj(k-1)
           pkj(k-1)  if dij(k-1) > dik(k-1) + dkj(k-1)
Example

D(1)   1   2   3
 1     0   8   5
 2     3   0   8
 3     ∞   2   0

P(0)   1   2   3        P(1)   1   2   3
 1     N   1   1         1     N   1   1
 2     2   N   N         2     2   N   1
 3     N   3   N         3     N   3   N
Example

D(2)   1   2   3
 1     0   8   5
 2     3   0   8
 3     5   2   0

P(2)   1   2   3
 1     N   1   1
 2     2   N   1
 3     2   3   N
Example

D(3)   1   2   3
 1     0   7   5
 2     3   0   8
 3     5   2   0

P(3)   1   2   3
 1     N   3   1
 2     2   N   1
 3     2   3   N

Use this final predecessor matrix to obtain the optimal result.
Printing the Shortest Path between vertices i & j

print(P, i, j)
{
    if i = j then print i
    else if pij = NIL then print “No Path”
    else {
        print(P, i, pij)
        print j
    }
}
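
The same procedure in Python (a sketch; the predecessor matrix is assumed to be a dict-of-dicts or list-of-lists with None playing the role of NIL):

def print_path(P, i, j):
    # Prints the vertices on a shortest path from i to j, using the final
    # predecessor matrix P produced by Floyd-Warshall.
    if i == j:
        print(i)
    elif P[i][j] is None:
        print("No Path")
    else:
        print_path(P, i, P[i][j])
        print(j)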
Floyd-Warshall Algorithm &
Transitive Closure of a Graph
Transitive closure of a directed graph
● The transitive closure of G is defined as the
graph G* = (V, E*), where E* = { (i, j) : there
is a path from vertex i to j in G}
Transitive closure of a directed graph

Compute tij = 1 if there exists a path from i to j, 0 otherwise.

IDEA: Use Floyd-Warshall, but with (∧, ∨) instead of (min, +):

tij(0) = 0 if i ≠ j and (i, j) ∉ E,
         1 if i = j or (i, j) ∈ E.

tij(k) = tij(k-1) ∨ (tik(k-1) ∧ tkj(k-1)).

Time = Θ(n³).
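
A Python sketch of the same idea (illustration only; vertices are assumed to be numbered 0..n-1 and edges given as a list of ordered pairs):

def transitive_closure(n, edges):
    # Floyd-Warshall with (and, or) in place of (min, +).
    # t[i][j] is True exactly when there is a path from i to j.
    edge_set = set(edges)
    t = [[(i == j) or ((i, j) in edge_set) for j in range(n)]
         for i in range(n)]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                t[i][j] = t[i][j] or (t[i][k] and t[k][j])
    return t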
Example

(Figure: a digraph on vertices 1, 2, 3, 4 with edges 2→3, 2→4, 3→2, 4→1, 4→3.)

T(0)  1 2 3 4      T(1)  1 2 3 4
 1    1 0 0 0       1    1 0 0 0
 2    0 1 1 1       2    0 1 1 1
 3    0 1 1 0       3    0 1 1 0
 4    1 0 1 1       4    1 0 1 1

T(2)  1 2 3 4      T(3)  1 2 3 4      T(4)  1 2 3 4
 1    1 0 0 0       1    1 0 0 0       1    1 0 0 0
 2    0 1 1 1       2    0 1 1 1       2    1 1 1 1
 3    0 1 1 1       3    0 1 1 1       3    1 1 1 1
 4    1 0 1 1       4    1 1 1 1       4    1 1 1 1
