Shortest Path Problem: Habib University
Project Report
Spring 2022
Contents

1 Introduction
  1.1 What is the shortest path problem?
  1.2 Cyclic and acyclic graphs
  1.3 Directed and undirected graphs
  1.4 Negative cycle and negative weighted cycle
  1.5 Algorithms for determining the shortest path
2 Algorithms
  2.1 Breadth First Search (BFS)
    2.1.1 Basic Intuition
    2.1.2 Algorithm
    2.1.3 Complexity - Theoretical Analysis
    2.1.4 Assumptions and Limitations
  2.2 Dijkstra’s Algorithm
    2.2.1 Basic Intuition
    2.2.2 Algorithm
    2.2.3 Complexity - Theoretical Analysis
    2.2.4 Assumptions and Limitations
  2.3 Bellman-Ford Algorithm
    2.3.1 Basic Intuition
    2.3.2 Algorithm
    2.3.3 Complexity - Theoretical Analysis
    2.3.4 Assumptions and Limitations
  2.4 Floyd–Warshall Algorithm
    2.4.1 Basic Intuition
    2.4.2 Algorithm
    2.4.3 Complexity - Theoretical Analysis
    2.4.4 Assumptions and Limitations
  2.5 Fibonacci Heap Algorithm
    2.5.1 Basic Intuition
    2.5.2 Algorithm
    2.5.3 Complexity - Theoretical Analysis
    2.5.4 Assumptions and Limitations
3 Empirical Analysis
  3.1 Results
  3.2 Limitations
4 Theoretical Comparison
5 Task Distribution
Chapter 1
Introduction
Chapter 2
Algorithms
2.1.2 Algorithm
The pseudocode for the BFS algorithm is given below: [2]

procedure BFS(G, root) is
    let Q be a queue
    label root as explored
    Q.enqueue(root)
    while Q is not empty do
        v := Q.dequeue()
        if v is the goal then
            return v
        for all edges from v to w in G.adjacentEdges(v) do
            if w is not labeled as explored then
                label w as explored
                Q.enqueue(w)
import time
import matplotlib.pyplot as plt

# This class represents a directed graph
# using adjacency list representation
class Graph:

    # Constructor
    def __init__(self):
        self.graph = {}

    def addNodes(self, u):
        self.graph[u] = []

    def addEdge(self, u, v):
        self.graph[u].append(v)

    def BFS(self, s):
        visited = [False] * (max(self.graph) + 1)
        queue = []
        queue.append(s)
        visited[s] = True

        while queue:
            v = queue.pop(0)
            for i in self.graph[v]:
                if visited[i] == False:
                    queue.append(i)
                    visited[i] = True


x = []
y = []

for i in range(5, 225, 25):
    g = Graph()

    nodez = []
    for z in range(i):
        nodez.append(z)
        g.addNodes(z)

    for k in range(len(nodez)):
        for j in range(len(nodez)):
            if k != j:
                g.addEdge(nodez[k], nodez[j])

    # NOTE: the timing calls here were lost in extraction and have been
    # restored following the pattern used for the other algorithms.
    start_time = time.time()
    g.BFS(0)
    end_time = time.time()
    x.append(i)
    y.append(end_time - start_time)

print(x)
print(y)

plt.plot(x, y, label="Time Taken Per n Vertices", color="red",
         marker="*")

# y-axis label
plt.ylabel('Time Taken By BFS (in seconds)')
# x-axis label
plt.xlabel('Number of Vertices In The Graph')
# plot title
plt.title('BFS Algorithm - Time Analysis')
# showing legend
plt.legend()

# function to show the plot
plt.show()
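Because list.pop(0) is O(V) and the implementation above only marks vertices as visited, it may be worth noting what a queue-backed BFS that actually records shortest hop counts looks like. The sketch below is illustrative and not part of the project code; bfs_distances and the toy graph are names introduced here.

```python
from collections import deque

def bfs_distances(graph, source):
    """Unweighted single-source shortest paths via BFS.

    graph: dict mapping vertex -> list of neighbours.
    Returns a dict of hop counts from source (unreachable vertices omitted).
    """
    dist = {source: 0}
    queue = deque([source])
    while queue:
        v = queue.popleft()           # O(1), unlike list.pop(0)
        for w in graph.get(v, []):
            if w not in dist:         # first visit gives the shortest hop count
                dist[w] = dist[v] + 1
                queue.append(w)
    return dist

# Example: path 0-1-2-3 plus a shortcut 0-2
g = {0: [1, 2], 1: [2], 2: [3], 3: []}
print(bfs_distances(g, 0))  # {0: 0, 1: 1, 2: 1, 3: 2}
```

Using collections.deque keeps each dequeue constant-time, so the whole traversal stays O(V + E).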
2.2 Dijkstra’s Algorithm
Dijkstra’s algorithm is a method for determining the shortest paths between nodes in a
graph, which can be used to represent, for instance, road networks. It was conceived
in 1956 and published three years later by the computer scientist Edsger W. Dijkstra.
Dijkstra’s algorithm is an SSSP (Single Source Shortest Path) algorithm: it determines
the shortest route from a source node to every other node in the graph. The shortest
path is determined using a greedy strategy.
2.2.2 Algorithm
The pseudocode for Dijkstra algorithm is given below: [3]
function Dijkstra(Graph, source):
    for each vertex v in Graph:
        dist[v] := infinity
        previous[v] := undefined
    dist[source] := 0
    Q := the set of all nodes in Graph
    while Q is not empty:
        u := node in Q with the smallest dist[]
        remove u from Q
        for each neighbour v of u still in Q:
            alt := dist[u] + dist_between(u, v)
            if alt < dist[v]:
                dist[v] := alt
                previous[v] := u
    return previous[]
import sys
import time
import numpy as np
import matplotlib.pyplot as plt


class Graph():

    def __init__(self, vertices):
        self.V = vertices
        self.graph = [[0 for column in range(vertices)]
                      for row in range(vertices)]

    def printSolution(self, dist):
        print("Vertex \tDistance from Source")
        for node in range(self.V):
            print(node, "\t", dist[node])

    # Find the unvisited vertex with the smallest tentative distance
    def minDistance(self, dist, sptSet):
        min_value = sys.maxsize
        min_index = 0
        for v in range(self.V):
            if dist[v] < min_value and sptSet[v] == False:
                min_value = dist[v]
                min_index = v
        return min_index

    def dijkstra(self, src):
        dist = [sys.maxsize] * self.V
        dist[src] = 0
        sptSet = [False] * self.V
        for _ in range(self.V):
            u = self.minDistance(dist, sptSet)
            sptSet[u] = True
            for v in range(self.V):
                if self.graph[u][v] > 0 and sptSet[v] == False and \
                        dist[v] > dist[u] + self.graph[u][v]:
                    dist[v] = dist[u] + self.graph[u][v]
        self.printSolution(dist)


# Driver used to collect the timings plotted below:
# g = Graph(5000)
# g.graph = np.random.randint(5, size=(5000, 5000))
# start_time = time.time()
# g.dijkstra(0)
# end_time = time.time()
# print("Total time =", end_time - start_time)

x = [25, 50, 100, 500, 1000, 2000, 3000, 4000, 5000]
y = [0.013002634048461914, 0.03902769088745117, 0.11808395385742188,
     1.317941665649414, 4.752390623092651, 28.34447145462036,
     69.9988374710083, 124.48891639709473, 146.03596019744873]

plt.plot(x, y, label="Time Taken Per n Vertices", color="red", marker="*")
plt.ylabel('Time Taken By Dijkstra (in seconds)')
plt.xlabel('Number of Vertices In The Graph')
plt.title('Dijkstra Algorithm - Time Analysis')
plt.legend()
plt.show()
2.2.3 Complexity - Theoretical Analysis
Depending on the priority queue used, the running time of Dijkstra’s algorithm is
O((V + E) · log(V)) (binary heap), O(E + V · log(V)) (Fibonacci heap), or
O(V² + E · log(V)).
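To make the binary-heap bound concrete, here is a sketch of Dijkstra built on Python's heapq module. It is illustrative only and independent of the adjacency-matrix implementation above; since heapq has no decrease-key, it uses lazy deletion (re-inserting a vertex whenever its distance improves and skipping stale heap entries).

```python
import heapq

def dijkstra_heap(adj, src):
    """Dijkstra with a binary heap: O((V + E) * log V).

    adj: dict vertex -> list of (neighbour, weight), weights >= 0.
    Returns a dict of shortest distances from src.
    """
    dist = {src: 0}
    pq = [(0, src)]                      # heap of (distance, vertex)
    done = set()
    while pq:
        d, v = heapq.heappop(pq)
        if v in done:
            continue                     # stale entry (lazy deletion)
        done.add(v)
        for u, w in adj.get(v, []):
            nd = d + w
            if nd < dist.get(u, float('inf')):
                dist[u] = nd
                heapq.heappush(pq, (nd, u))
    return dist

adj = {0: [(1, 4), (2, 1)], 1: [(3, 1)], 2: [(1, 2), (3, 5)], 3: []}
print(dijkstra_heap(adj, 0))  # {0: 0, 1: 3, 2: 1, 3: 4}
```

Each vertex is settled once, and every edge relaxation costs at most one O(log V) heap push, which yields the O((V + E) · log V) bound quoted above.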
2.3 Bellman-Ford Algorithm
The Bellman–Ford algorithm is an algorithm that computes shortest paths from a single
source vertex to all of the other vertices in a weighted digraph. It is slower than Dijkstra’s
algorithm for the same problem, but more versatile, as it is capable of handling graphs in
which some of the edge weights are negative numbers. The algorithm was first proposed
by Alfonso Shimbel (1955), but is instead named after Richard Bellman and Lester Ford
Jr., who published it in 1958 and 1956, respectively.
2.3.2 Algorithm
The pseudocode for Bellman-Ford algorithm is given below: [4]
function bellmanFord(G, S)
    for each vertex V in G
        distance[V] <- infinite
        previous[V] <- NULL
    distance[S] <- 0

    for each vertex V in G
        for each edge (U, V) in G
            tempDistance <- distance[U] + edge_weight(U, V)
            if tempDistance < distance[V]
                distance[V] <- tempDistance
                previous[V] <- U

    for each edge (U, V) in G
        if distance[U] + edge_weight(U, V) < distance[V]
            Error: Negative Cycle Exists

    return distance[], previous[]
import time
import numpy as np
import matplotlib.pyplot as plt


class Graph:

    def __init__(self, vertices):
        self.V = vertices
        self.graph = []

    def addEdge(self, u, v, w):
        self.graph.append([u, v, w])

    def printArr(self, dist):
        print("Vertex Distance from Source")
        for i in range(self.V):
            print("{0}\t\t{1}".format(i, dist[i]))

    def BellmanFord(self, src):
        dist = [float("Inf")] * self.V
        dist[src] = 0
        # Relax every edge V - 1 times
        for _ in range(self.V - 1):
            for u, v, w in self.graph:
                if dist[u] != float("Inf") and dist[u] + w < dist[v]:
                    dist[v] = dist[u] + w
        # One extra pass: any further improvement means a negative cycle
        for u, v, w in self.graph:
            if dist[u] != float("Inf") and dist[u] + w < dist[v]:
                print("Graph contains negative weight cycle")
                return
        self.printArr(dist)


g = Graph(500)
matrix = np.random.randint(5, size=(500, 500))
i = 0
for node in matrix:
    x = 0
    for index in node:
        g.addEdge(i, x, index)
        x += 1
    i += 1

start_time = time.time()
g.BellmanFord(0)
end_time = time.time()
print("Total time =", end_time - start_time)

x = [25, 50, 100, 150, 200]
y = [0.15509796142578125, 0.7905609607696533, 5.7084784507751465,
     23.45373249053955, 72.62042880058289]

plt.plot(x, y, label="Time Taken Per n Vertices", color="red", marker="*")
plt.ylabel('Time Taken By BellmanFord Algorithm (in seconds)')
plt.xlabel('Number of Vertices In Graph')
plt.title('BellmanFord Algorithm - Time Analysis')
plt.legend()
plt.show()
2.3.3 Complexity - Theoretical Analysis
The Bellman-Ford algorithm calculates the shortest paths in a weighted digraph from
one source vertex to all other vertices. Bellman-Ford is also simpler than Dijkstra’s
algorithm and is well suited to distributed systems. However, its time complexity is
O(V · E), which is higher than that of Dijkstra’s algorithm.
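The versatility claim is easy to check on a tiny instance. The sketch below mirrors the relaxation structure of the implementation above (function name, graph, and weights are illustrative): it returns correct distances on a graph with a negative edge, where Dijkstra's greedy choice could go wrong, and reports reachable negative cycles.

```python
def bellman_ford(n, edges, src):
    """edges: list of (u, v, w) for a directed graph with n vertices.
    Returns the distance list, or None if a negative cycle is reachable."""
    INF = float('inf')
    dist = [INF] * n
    dist[src] = 0
    for _ in range(n - 1):                # relax all edges V - 1 times
        for u, v, w in edges:
            if dist[u] != INF and dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    for u, v, w in edges:                 # one more pass: still improving?
        if dist[u] != INF and dist[u] + w < dist[v]:
            return None                   # negative cycle detected
    return dist

# Negative edge 1 -> 2 (weight -5): the shortest 0 -> 2 path goes through 1.
print(bellman_ford(3, [(0, 1, 4), (0, 2, 2), (1, 2, -5)], 0))  # [0, 4, -1]
```

A greedy algorithm would settle vertex 2 at distance 2 and never revisit it; Bellman-Ford's repeated relaxation finds the cheaper route through vertex 1.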
2.4 Floyd–Warshall Algorithm
The Floyd-Warshall algorithm finds the shortest paths between all pairs of vertices in a
weighted graph. It works for both directed and undirected weighted graphs, but it does
not work for graphs with negative cycles (where the sum of the edges in a cycle is
negative).
2.4.2 Algorithm
The pseudocode for Floyd-Warshall algorithm is given below:
n = number of vertices
A = matrix of dimension n x n
for k = 1 to n
    for i = 1 to n
        for j = 1 to n
            A^k[i, j] = min(A^(k-1)[i, j], A^(k-1)[i, k] + A^(k-1)[k, j])
return A
import random
import timeit

from matplotlib import pyplot as plt


n = 200

# create a graph: 999 acts as a "no edge" sentinel weight
def make_graph(novertices):
    x = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 999]
    G = []
    for i in range(novertices):
        row = []
        for j in range(novertices):
            row.append(random.choice(x))
        G.append(row)
    return G

# Algorithm implementation
def floyd_warshall(G, nV):
    # distance table, initialised as a copy of the weight matrix
    distance = list(map(lambda i: list(map(lambda j: j, i)), G))
    # Adding vertices individually
    for k in range(nV):
        for i in range(nV):
            for j in range(nV):
                distance[i][j] = min(distance[i][j],
                                     distance[i][k] + distance[k][j])


x_list = []
floyd_warshall_list = []
for i in range(25, 200 + 1, 25):
    G = make_graph(i)

    start = timeit.default_timer()
    floyd_warshall(G, i)
    end = timeit.default_timer()
    floyd_warshall_list.append(end - start)
    x_list.append(i)


plt.plot(x_list, floyd_warshall_list, linestyle="solid", marker="*",
         c="blue", label="Floyd-Warshall - Time Analysis")
plt.title("Floyd-Warshall - Time Analysis")
plt.legend(['Time Taken Per n Nodes'], loc='upper left')
plt.xlabel("Number of nodes in Graph")
plt.ylabel("Time taken by Floyd-Warshall Algorithm (in seconds)")
plt.show()
2.4.3 Complexity - Theoretical Analysis
The time complexity of the Floyd-Warshall algorithm is O(n³), or O(|V|³), where V is
the set of vertices (nodes) of the graph. The exponent of 3 arises because the algorithm
contains three nested loops. This makes it less efficient than the Bellman-Ford
algorithm, which has a running time of O(|V| · |E|).
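The three nested loops are easy to verify on a small worked instance. The sketch below is illustrative (the 4-vertex matrix is made up, and float('inf') stands in for the 999 sentinel used in the code above):

```python
INF = float('inf')

def floyd_warshall_inplace(dist):
    """All-pairs shortest paths, updating the n x n matrix in place."""
    n = len(dist)
    for k in range(n):                   # allow k as an intermediate vertex
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

D = [[0,   3,   INF, 7],
     [8,   0,   2,   INF],
     [5,   INF, 0,   1],
     [2,   INF, INF, 0]]
floyd_warshall_inplace(D)
print(D[0])  # [0, 3, 5, 6]
```

For instance, the entry D[0][3] drops from the direct weight 7 to 6 once vertices 1 and 2 are allowed as intermediates (path 0 → 1 → 2 → 3 with cost 3 + 2 + 1).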
2.5 Fibonacci Heap Algorithm
In computer science, a Fibonacci heap is a data structure for priority queue operations,
consisting of a collection of heap-ordered trees. It has a better amortized running time
than many other priority queue data structures including the binary heap and binomial
heap. Michael L. Fredman and Robert E. Tarjan developed Fibonacci heaps in 1984
and published them in a scientific journal in 1987. Fibonacci heaps are named after
the Fibonacci numbers, which are used in their running time analysis. Using Fibonacci
heaps for priority queues improves the asymptotic running time of important algorithms,
such as Dijkstra’s algorithm for computing the shortest path between two nodes in a
graph, compared to the same algorithm using other slower priority queue data structures.
[8]
Find Minimum: The root list keeps a pointer to the minimum node, so finding the
minimum is simple and can be done in constant time.
Union: The union of two Fibonacci heaps consists of the following steps: concatenate
the root lists of both heaps, then update min by selecting the minimum key from the
combined root list.
Extract Min: This is the most important operation on a Fibonacci heap. The node with
the minimum value is removed from the heap and the trees are re-adjusted. Deleting the
minimum element is done in three steps. First, the node is removed from the root list
and its children are added to the root list. Next, the minimum pointer is updated if
needed. Finally, the trees are consolidated so that no two roots have the same order;
if any consolidation occurred, the minimum pointer is updated as necessary. Delaying
consolidation saves time. [9]
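The three operations described above can be exercised with a pared-down, root-list-only heap. This is a toy sketch, not a full Fibonacci heap (it omits parent pointers, node marking, and decrease-key), but it shows the O(1) lazy insert and the consolidation that is deferred until extract-min:

```python
class SimpleFibHeap:
    """Toy root-list heap: lazy insert, consolidation deferred to extract-min.
    Nodes are [value, children] pairs; a node's degree is len(children)."""

    def __init__(self):
        self.roots = []
        self.least = None

    def insert(self, value):            # O(1): just join the root list
        node = [value, []]
        self.roots.append(node)
        if self.least is None or value < self.least[0]:
            self.least = node
        return node

    def find_min(self):                 # O(1): pointer is maintained
        return None if self.least is None else self.least[0]

    def extract_min(self):              # amortized O(log n): consolidate now
        z = self.least
        if z is None:
            return None
        self.roots.remove(z)
        self.roots.extend(z[1])         # children move to the root list
        self._consolidate()
        return z[0]

    def _consolidate(self):
        by_degree = {}
        for node in self.roots:
            d = len(node[1])
            while d in by_degree:       # merge trees of equal degree
                other = by_degree.pop(d)
                if other[0] < node[0]:
                    node, other = other, node
                node[1].append(other)   # larger root becomes a child
                d = len(node[1])
            by_degree[d] = node
        self.roots = list(by_degree.values())
        self.least = min(self.roots, key=lambda n: n[0], default=None)

h = SimpleFibHeap()
for v in [7, 3, 9, 1, 5]:
    h.insert(v)
print(h.find_min())      # 1
print(h.extract_min())   # 1
print(h.find_min())      # 3
```

The full implementation in Section 2.5.2 follows the same structure, with explicit degree and mark fields.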
2.5.2 Algorithm
The following pseudocode is adapted from [9].
Fibonacci-Heap-Link(H, y, x)
    remove y from the root list of H
    make y a child of x
    degree[x] := degree[x] + 1
    mark[y] := FALSE

CONSOLIDATE(H)
    for i := 0 to D(n[H])
        do A[i] := NIL
    for each node w in the root list of H
        do x := w
           d := degree[x]
           while A[d] <> NIL
               do y := A[d]
                  if key[x] > key[y]
                      then exchange x <-> y
                  Fibonacci-Heap-Link(H, y, x)
                  A[d] := NIL
                  d := d + 1
           A[d] := x
    min[H] := NIL
    for i := 0 to D(n[H])
        do if A[i] <> NIL
               then add A[i] to the root list of H
                    if min[H] = NIL or key[A[i]] < key[min[H]]
                        then min[H] := A[i]

Fibonacci-Heap-Union(H1, H2)
    H := Make-Fibonacci-Heap()
    min[H] := min[H1]
    concatenate the root list of H2 with the root list of H
    if (min[H1] = NIL) or (min[H2] <> NIL and min[H2] < min[H1])
        then min[H] := min[H2]
    n[H] := n[H1] + n[H2]
    free the objects H1 and H2
    return H

Fibonacci-Heap-Insert(H, x)
    degree[x] := 0
    p[x] := NIL
    child[x] := NIL
    left[x] := x
    right[x] := x
    mark[x] := FALSE
    concatenate the root list containing x with the root list of H
    if min[H] = NIL or key[x] < key[min[H]]
        then min[H] := x
    n[H] := n[H] + 1

Fibonacci-Heap-Extract-Min(H)
    z := min[H]
    if z <> NIL
        then for each child x of z
                 do add x to the root list of H
                    p[x] := NIL
             remove z from the root list of H
             if z = right[z]
                 then min[H] := NIL
                 else min[H] := right[z]
                      CONSOLIDATE(H)
             n[H] := n[H] - 1
    return z
72
import math

# Creating fibonacci tree
class FibonacciTree:
    def __init__(self, value):
        self.value = value
        self.child = []
        self.order = 0

    # Adding tree at the end of the tree
    def add_at_end(self, t):
        self.child.append(t)
        self.order = self.order + 1


# Creating Fibonacci heap
class FibonacciHeap:
    def __init__(self):
        self.trees = []
        self.least = None
        self.count = 0

    # Insert a node
    def insert_node(self, value):
        new_tree = FibonacciTree(value)
        self.trees.append(new_tree)
        if (self.least is None or value < self.least.value):
            self.least = new_tree
        self.count = self.count + 1

    # Get minimum value
    def get_min(self):
        if self.least is None:
            return None
        return self.least.value

    # Extract the minimum value
    def extract_min(self):
        smallest = self.least
        if smallest is not None:
            for child in smallest.child:
                self.trees.append(child)
            self.trees.remove(smallest)
            if self.trees == []:
                self.least = None
            else:
                self.least = self.trees[0]
                self.consolidate()
            self.count = self.count - 1
            return smallest.value

    # Consolidate the tree
    # (the body of this method was lost in extraction; it has been
    # restored following the reference implementation in [9]/[11])
    def consolidate(self):
        aux = (floor_log(self.count) + 1) * [None]

        while self.trees != []:
            x = self.trees[0]
            order = x.order
            self.trees.remove(x)
            while aux[order] is not None:
                y = aux[order]
                if x.value > y.value:
                    x, y = y, x
                x.add_at_end(y)
                aux[order] = None
                order = order + 1
            aux[order] = x

        self.least = None
        for k in aux:
            if k is not None:
                self.trees.append(k)
                if (self.least is None
                        or k.value < self.least.value):
                    self.least = k


def floor_log(x):
    return math.frexp(x)[1] - 1


fibonacci_heap = FibonacciHeap()
import fibonnaciheap
import time
import numpy as np
import matplotlib.pyplot as plt

def addNodes(G, nodes):
    for i in nodes:
        G[i] = []
    return G

def addEdges(G, edges, directed=False):
    for u, v, w in edges:
        if directed == True:
            G[u].append((v, w))
        else:
            G[u].append((v, w))
            G[v].append((u, w))
    return G

def adj_matrix_to_Graph(M):
    nodes = []
    for i in range(len(M)):
        nodes.append(i)
    edges = []
    for i in range(len(M)):
        for j in range(len(M[i])):
            if M[i][j] != 0:
                edges.append((i, j, M[i][j]))
    G = {}
    G = addNodes(G, nodes)
    G = addEdges(G, edges)
    return G

class Graph():
    def __init__(self, vertices):
        self.V = vertices
        self.graph = {}

    def dijkstra(self, src, to):
        inf = float('inf')
        cost = {}
        known = {}
        for node in self.graph:
            cost[node] = inf
            known[node] = False
        cost[src] = 0
        # The heap holds (distance, node) pairs; since this heap has no
        # decrease-key, a node is re-inserted whenever its cost improves
        # and stale entries are skipped on extraction.
        Q = fibonnaciheap.FibonacciHeap()
        Q.insert_node((0, src))
        while Q.least is not None:
            d, v = Q.extract_min()
            if known[v]:
                continue
            known[v] = True
            for u, w in self.graph[v]:
                distance = cost[v] + w
                if distance < cost[u]:
                    cost[u] = distance
                    Q.insert_node((distance, u))
        return cost[to]

x = []
y = []

for i in range(25, 200 + 1, 25):
    g = Graph(i)
    res = np.random.randint(5, size=(i, i))
    g.graph = adj_matrix_to_Graph(res)
    start_time = time.time()
    g.dijkstra(0, i - 1)
    end_time = time.time()
    x.append(i)
    y.append(end_time - start_time)


# plotting points
plt.plot(x, y, label="Time Taken Per n Vertices", color="red",
         marker="*")

# y-axis label
plt.ylabel('Time Taken By Dijkstra (in seconds)')
# x-axis label
plt.xlabel('Number of Vertices In The Graph')
# plot title
plt.title('Dijkstra Algorithm using Fibonnaci Heap - Time Analysis')
# showing legend
plt.legend()

# function to show the plot
plt.show()
Chapter 3
Empirical Analysis
To implement BFS we used an adjacency-matrix approach and monitored its performance
on graphs with varying numbers of nodes to determine its time complexity.
With regard to the implementations of Dijkstra and Bellman-Ford, we created graphs
in Python using the NumPy library and generated random n by n matrices using the
NumPy random built-in function, numpy.random.randint(range of values, size of
matrix). We kept all of these randomly generated values non-negative in order
to ensure compatibility with the Dijkstra and Bellman-Ford algorithms and the absence
of negative cycles. We used Python’s native time library to record the time it took the
Dijkstra and Bellman-Ford algorithms to find the shortest paths from the selected node
to all other nodes in the graphs. We plotted these data and analyzed the time
complexities by comparing the empirical values to theoretical values for a range
of node counts, n, from 25 to 5000.
For Floyd-Warshall we created an adjacency matrix whose size gradually increased with
the number of nodes. The matrix was initialized with non-negative weights attached to
each edge. Using the aforementioned libraries we ran the algorithm and plotted the run
time against the number of nodes to analyze the time complexity and compare theoretical
and empirical values.
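The measurement procedure described above can be sketched as a small generic harness. This version is illustrative and library-free: it draws weights in [0, 5) with the stdlib random module in place of numpy.random.randint, and times with time.time as in the report; time_algorithm and the stand-in workload are names introduced here.

```python
import random
import time

def time_algorithm(run, sizes, seed=0):
    """Time run(matrix) on one random n x n weight matrix per size.

    Returns (sizes, seconds), ready for matplotlib's plt.plot.
    """
    rnd = random.Random(seed)
    seconds = []
    for n in sizes:
        # non-negative weights, as in the report's setup
        matrix = [[rnd.randrange(5) for _ in range(n)] for _ in range(n)]
        start = time.time()
        run(matrix)                      # the algorithm under test
        seconds.append(time.time() - start)
    return sizes, seconds

# Illustrative use with a trivial stand-in "algorithm":
xs, ys = time_algorithm(lambda m: sum(map(sum, m)), [25, 50, 100])
print(len(ys))  # 3
```

Fixing the seed makes reruns comparable; one caveat of this one-run-per-size design (shared with the report's measurements) is that a single timing per n is sensitive to machine noise.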
Figure 3.1: Graph of BFS Algorithm time analysis
Figure 3.3: Graph of Dijkstra Algorithm time analysis
Figure 3.4: Graph of Dijkstra Algorithm using Fibonacci Heap time analysis
Figure 3.5: Graph of Bellman-Ford Algorithm time analysis
Figure 3.7: Comparison of all Algorithm time analysis
3.1 Results
BFS calculates the shortest paths in unweighted graphs. Dijkstra’s algorithm, on the
other hand, calculates the same thing in weighted graphs. BFS runs in O(E + V), while
Dijkstra’s runs in O((V + E) · log(V)). As is clear from the performance graphs above,
Dijkstra performs significantly better than BFS.
We discovered that Dijkstra performed flawlessly in our empirical time analysis, taking
roughly linear time to identify the shortest paths between n nodes, as depicted in the
graph.
We obtained a nearly quadratic graph for the Bellman-Ford algorithm due to the
constraints we imposed on our input matrices, which ensured that each node was linked
to at least one edge, implying that the number of edges E is at least V + 1. Thus, we
can say that this graph empirically satisfies our theoretical complexity, because our
theoretical complexity reduces from O(V · E) to O(V²).
The Floyd-Warshall algorithm comes close to O(V³) in the practical model. The matrices
have no negative-weight edges and the weights are randomly assigned.
Hence we conclude that the Dijkstra algorithm is significantly better than the
Bellman-Ford algorithm with regard to time complexity. However, the Dijkstra algorithm
does not work for graphs with negative weights, so we use the Bellman-Ford algorithm to
perform the computation for such graphs. The Bellman-Ford algorithm is much better in
terms of run time compared with the Floyd-Warshall algorithm. Floyd-Warshall works
with undirected graphs, unlike Bellman-Ford, so it is used for undirected graphs over
Bellman-Ford despite Bellman-Ford having a better runtime.
Finally, we plotted the run-time graphs of all the algorithms chosen for our project in
order to analyse and compare their overall trends. As we can see in Figure 3.7, Dijkstra
and BFS have completely overlapping lines, indicating how close in performance they
are, though Dijkstra is significantly better than BFS according to our time complexity
analysis.
Next, we see that Dijkstra’s implementation with a Fibonacci heap falls just above
Dijkstra’s performance, indicating that despite having the best asymptotic time
complexity it does not yield a significant improvement on the shortest path problem.
Floyd-Warshall, on the other hand, takes much longer than Dijkstra, BFS, and the
Fibonacci-heap variant combined. Bellman-Ford takes even longer than Floyd-Warshall in
solving the shortest path problem.
3.2 Limitations
For our empirical analysis we tried to grow the number of vertices in the graph beyond
200, but due to machine limitations we were unable to execute runs with more than 200
vertices.
Chapter 4
Theoretical Comparison
BFS finds the shortest path in undirected and directed graphs, but it will not work on
weighted graphs, since the path with the fewest edges may not be the shortest if the
edges it contains are expensive. The time complexity of BFS is O(V + E) when an
adjacency list is used and O(V²) when an adjacency matrix is used.
Dijkstra’s algorithm deals with this limitation. Its main advantage is its considerably
low, almost linear, complexity. It works correctly for directed and undirected graphs;
however, Dijkstra’s algorithm cannot be used when working with negative weights. Also,
for dense graphs, where E is close to V², if we need to calculate the shortest path
between every pair of nodes, Dijkstra’s algorithm is not a good option, as the
complexity then becomes O(V² · log(V)).
The main advantage of the Bellman-Ford algorithm is its capability to handle negative
weights, in addition to directed and undirected graphs. However, the Bellman-Ford
algorithm has a considerably larger complexity than Dijkstra’s algorithm. Therefore,
Dijkstra’s algorithm has more applications, because graphs with negative weights are
usually considered a rare case. As mentioned earlier, the Bellman-Ford algorithm can
handle directed and undirected graphs with negative weights; however, with negative
weights it can only handle directed graphs, and only as long as there are no negative
cycles. To work around this limitation we come to the Floyd-Warshall algorithm.
The Floyd-Warshall algorithm finds the shortest paths between all pairs of vertices in
a weighted graph. It works for both directed and undirected weighted graphs, but it
does not work for graphs with negative cycles.
Chapter 5
Task Distribution
• Lama Imam and Fatima Nasir Khan worked on Floyd Warshall and Fibonacci heap
for Dijkstra
• Lama Imam and Fatima Nasir Khan worked on the presentation and poster for Demo
Bibliography
[1] https://techvidvan.com/tutorials/breadth-first-search/#:~:text=Algorithm%20for%20BFS%3A,visited%20node%20into%20the%20queue.
[2] https://en.wikipedia.org/wiki/Breadth-first_search
[3] http://www.gitta.info/Accessibiliti/en/html/Dijkstra_learningObject1.html
[4] https://www.programiz.com/dsa/bellman-ford-algorithm
[5] https://www.javatpoint.com/breadth-first-search-algorithm
[6] https://en.wikipedia.org/wiki/Breadth-first_search
[7] https://www.programiz.com/dsa/floyd-warshall-algorithm
[8] https://en.wikipedia.org/wiki/Fibonacci_heap
[9] https://brilliant.org/wiki/fibonacci-heap/
[10] https://kbaile03.github.io/projects/fibo_dijk/fibo_dijk.html
[11] https://www.programiz.com/dsa/fibonacci-heap