Algorithms

The document provides an overview of various types of graphs, including undirected and directed graphs, their degrees, and specific types such as regular, complete, and bipartite graphs. It also discusses graph algorithms like Prim's, Kruskal's, Dijkstra's, and Bellman's Ford, along with concepts like spanning trees and graph isomorphism. Additionally, it covers graph representations through adjacency and incidence matrices, as well as traversal methods like depth-first and breadth-first search.

Undirected graph

edges are undirected


edges are unordered pairs
order of vertices is immaterial for describing the edge

v1
/ \
e1/ \e3
/ \
v2 ------ v3
e2

This triangle is an undirected graph

Directed graph
edges are directed or "one way"
edges are ordered pairs
order of vertices is important for describing the edge
also called a digraph

Degree of graph
For an undirected graph:
a self-loop contributes 2 to the degree;
otherwise degree = number of edges incident with a vertex.
The degree is calculated for each vertex.

For a directed graph:
there are in-degrees and out-degrees.
in-degree = number of edges coming into the vertex
out-degree = number of edges going out of the vertex
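The degree rules above can be sketched in code; the edge-list format and function names here are illustrative, not from the notes.

```python
# Sketch: computing degrees from an edge list.
from collections import defaultdict

def undirected_degrees(edges):
    """Degree of each vertex; a self-loop contributes 2."""
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1          # for a self-loop (u == v) this adds 2 in total
    return dict(deg)

def directed_degrees(edges):
    """(in-degree, out-degree) for each vertex of a digraph."""
    indeg, outdeg = defaultdict(int), defaultdict(int)
    for u, v in edges:
        outdeg[u] += 1       # edge leaves u
        indeg[v] += 1        # edge enters v
    verts = set(indeg) | set(outdeg)
    return {x: (indeg[x], outdeg[x]) for x in verts}
```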

Types of graph
regular graph
the degree of all vertices is the same;
that common degree is called the regularity of the graph,
and the graph is referred to as a regular graph.
complete graph
All vertices are connected with each other (all are adjacent to each other).
null graph
if the edge set is empty, it's a null graph.
bipartite graph
A graph is bipartite iff the vertices can be partitioned into two sets such that there is no
edge between any pair of vertices in the same set.
complete bipartite graph
a bipartite graph in which every vertex of one set is adjacent to every vertex of the other set.

Graph isomorphism
Two graphs G1 and G2 are said to be isomorphic if one can be redrawn as the other, i.e. there
is a one-to-one correspondence between their vertices that preserves adjacency.
Properties of isomorphic graphs:

G1 and G2 have the same number of vertices.
G1 and G2 have the same number of edges.
G1 and G2 have the same degree sequence.

Special graphs
weighted graphs
If a graph has a number (called a weight) associated with each edge, it's called a weighted
graph.
walk, trail, path and circuits
walk: an alternating list of vertices and edges.
trail: a walk with no repeated edges.
circuit: a closed trail.
cycle: a circuit with no repeated vertex.
path: a walk with no repeated vertex.
connected graphs, disconnected graphs and components
A graph is connected if every vertex is joined to every other vertex by a path. A
disconnected graph is a graph that is not connected. Think of the components of a
graph as "connected pieces" of the graph that are disconnected from each other. A
connected graph has only one component, which is the entire graph.
Euler path:
a path that visits every edge exactly once.
Hamiltonian path / Hamiltonian circuit:
a path / circuit that visits every vertex exactly once.

Representation of graph
Adjacency Matrix for undirected graph

v1 v2 v3 v4 v5
v1 0 1 1 1 0
v2 1 0 1 0 1
v3 1 1 0 1 1
v4 1 0 1 0 1
v5 0 1 1 1 0

Figure: Adjacency matrix for the above undirected graph.


Adjacency Matrix for directed graph like the below one.
v1 v2 v3 v4
v1 0 1 0 0
v2 0 0 1 0
v3 1 0 0 1
v4 0 1 0 0

The value 1 in the matrix means that an edge exists between the two vertices.
Diagonal elements are generally zero (unless there are self-loops).
The adjacency matrix of an undirected graph is symmetric.
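A minimal sketch of building an adjacency matrix and checking the symmetry property; the vertex numbering (0..n-1) and helper names are assumptions for illustration.

```python
# Sketch: adjacency matrix for an undirected graph on vertices 0..n-1.
def adjacency_matrix(n, edges):
    """n-by-n 0/1 matrix; an undirected edge is stored in both directions."""
    A = [[0] * n for _ in range(n)]
    for u, v in edges:
        A[u][v] = 1
        A[v][u] = 1          # undirected: symmetric entries
    return A

def is_symmetric(A):
    n = len(A)
    return all(A[i][j] == A[j][i] for i in range(n) for j in range(n))
```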

Incidence Matrix
Indicates which edges occur at each vertex. The matrix below is for a graph with vertices
A, B, C, D and edges AB, AC, BC, CD.

AB AC BC CD
A 1 1 0 0
B 1 0 1 0
C 0 1 1 1
D 0 0 0 1
Cost adjacency matrix
Weighted graphs are represented by a cost adjacency matrix. If there is an edge between two
vertices, the weight of the edge is placed instead of "1".
If there's no edge, infinity is placed instead of "0".

Linked (adjacency-list) representation
Each vertex keeps a linked list of its adjacent vertices.
https://old.reddit.com/r/ObsidianMD/comments/1iqhmco/how_to_draw_that_linked_representationim_using/

Depth first search


Flow of depth first search

visit the starting node
proceed through the graph until a dead end is reached.
Once a dead end is reached:
back up along the path, checking at each node for an unvisited adjacent node, and
continue in that new direction.
The process is complete when we have backed up to the starting node and all nodes
adjacent to it have been visited.
Example for the above graph:
v1-->v2-->v3-->v4-->v7-->v5-->v6
dead end reached.
Back up to v5 (no unvisited adjacent nodes there).
Back up to v7; visit v8.
Back up to v7 again; v7 has no unvisited nodes left.
Back up to v4; v4 has v9. Visit v9.
Dead end reached (all nodes have been visited).
When a vertex has several unvisited neighbors, we choose the one with the lower index
first, e.g. v4 before v8.
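The flow above can be sketched as a recursive function; the adjacency list `g` below is an assumed reconstruction consistent with the visiting order in the example.

```python
# Sketch of depth-first search; neighbors are tried in ascending order
# (lower-numbered vertex first), and returning from the recursion is the
# "back up" step.
def dfs(graph, start):
    visited, order = set(), []

    def explore(v):
        visited.add(v)
        order.append(v)
        for w in sorted(graph.get(v, [])):   # lower index first
            if w not in visited:
                explore(w)                   # go deeper until a dead end
    explore(start)
    return order

# Assumed graph matching the example trace v1..v9:
g = {1: [2], 2: [3], 3: [4], 4: [7, 9], 7: [5, 8], 5: [6]}
```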

Breadth first search


Take the same graph for example.
Flow of breadth first search

visit the starting node
first pass: visit all the nodes directly connected to it.
second pass: visit the nodes that are 2 edges away from the starting node.
with each new pass, we visit nodes that are one more edge away.
We keep a visited-nodes array, since a graph can contain cycles, which cause multiple paths
to the same node to exist.
The nodes discovered in each pass are placed in a queue; the process is complete when the
queue is empty, i.e. every reachable node has been visited.
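The pass-by-pass flow is usually implemented with a queue; this is a minimal sketch, with an illustrative example graph.

```python
# Sketch of breadth-first search using a queue and a visited set.
from collections import deque

def bfs(graph, start):
    visited = {start}
    order = []
    queue = deque([start])
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in sorted(graph.get(v, [])):   # nodes one more edge away
            if w not in visited:             # cycles: skip already-seen nodes
                visited.add(w)
                queue.append(w)
    return order
```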

Graphs further discussion


The graph shown above is:

connected
(there is a path between every pair of vertices;
recall that a path is a walk with no repeated vertices)
weighted
undirected
It is not acyclic.

Tree
connected
undirected
acyclic
graph

Graph-->Subgraph
Suppose we remove edges from a connected, weighted, undirected graph G to form a subgraph
such that connectivity is still satisfied & the sum of the weights on the remaining edges is as
small as possible.
Such a subgraph must be a tree: if it contained a cycle, e.g. [v3,v4,v5,v3], we could remove
one edge of that cycle and the subgraph would still be connected, with even smaller total weight.

Spanning tree
is a connected subgraph
contains all the vertices in G
and is a tree.

Minimum Spanning Tree


spanning tree
with minimum weight

Prim's algorithm
(figure not available)
This graph can be represented as

v1 v2 v3 v4 v5
v1 0 1 3 ∞ ∞

v2 1 0 3 6 ∞

v3 3 3 0 4 2
v4 ∞ 6 4 0 5
v5 ∞ ∞ 2 5 0

          ⎧ weight of the edge, if there's an edge between v_i and v_j
W[i][j] = ⎨ ∞,                  if there's no edge between v_i and v_j
          ⎩ 0,                  if i = j

Initialize the weight or distance as per this above equation.


Y is the set of vertices initialized with arbitrary vertex v1 .

Y={v1}
while(the instance is not solved){
select a vertex in (V-Y) that is nearest to Y;
add the vertex to Y;
if(Y==V)
the instance is solved;
}

In above example, start with v1.


Add the nearest vertex v2(distance=1). Y={v1,v2}
Now, select the nearest vertex to Y i.e both v1 and v2. It's v3 from v1. Y={v1,v2,v3}
Now v5 is that vertex which is nearest to all of v1,v2,v3. Y={v1,v2,v3,v5}.
Now add v4 from v2. Y={v1,v2,v3,v5,v4}.
All vertices are traversed. The minimum spanning tree is indicated by blue lines.

(figure not available)
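The selection loop above can be sketched directly on the cost matrix from the notes; the function shape is an illustration (it always starts from v1, index 0).

```python
# Sketch of Prim's algorithm on a cost adjacency matrix (∞ = float('inf')).
INF = float('inf')

def prim(W):
    """Returns the MST edges (as index pairs) and the total weight."""
    n = len(W)
    in_tree = [False] * n
    in_tree[0] = True                      # Y = {v1}
    edges, total = [], 0
    for _ in range(n - 1):
        best = (INF, None, None)
        for u in range(n):                 # u in Y
            if in_tree[u]:
                for v in range(n):         # v in V - Y
                    if not in_tree[v] and W[u][v] < best[0]:
                        best = (W[u][v], u, v)
        w, u, v = best
        in_tree[v] = True                  # add the nearest vertex to Y
        edges.append((u, v))
        total += w
    return edges, total

# The matrix from the notes (v1..v5 as indices 0..4):
W = [[0, 1, 3, INF, INF],
     [1, 0, 3, 6, INF],
     [3, 3, 0, 4, 2],
     [INF, 6, 4, 0, 5],
     [INF, INF, 2, 5, 0]]
```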

Kruskal's algorithm
Take the same graph as above.
Sort the edges in increasing order of weight.
Keep combining them one after another without forming a loop.
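The two steps above (sort the edges, combine without forming a loop) can be sketched with a simple union-find structure; the edge-list format is illustrative.

```python
# Sketch of Kruskal's algorithm; union-find detects when adding an edge
# would form a loop.
def kruskal(n, edges):
    """edges: list of (weight, u, v) on vertices 0..n-1."""
    parent = list(range(n))

    def find(x):                     # representative of x's component
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    mst, total = [], 0
    for w, u, v in sorted(edges):    # increasing order of weight
        ru, rv = find(u), find(v)
        if ru != rv:                 # different components: no loop formed
            parent[ru] = rv
            mst.append((u, v))
            total += w
    return mst, total

# Same graph as the Prim example (weight, u, v):
edges = [(1, 0, 1), (3, 0, 2), (3, 1, 2), (6, 1, 3), (4, 2, 3), (2, 2, 4), (5, 3, 4)]
```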
Dijkstra's shortest path
(figure not available)

(2) (3) (4) (5) (6)


(1) 3/(1) 6/(1) nil nil nil
(2) 3/(1) (3+2)<(6)->5/(2) 3+5=8/(2) nil nil
(3) 3/(1) 5/(2) 5+3=8/(3)(Same as above) 7/(3) nil
(5) 3/(1) 5/(2) 8/(3)(Because 7+6>8) 7/(3) 12/(5)
(6)

Iteration 1: the smallest tentative distance is to vertex (2), so in the next iteration we
relax everything directly reachable from (2).
3/(1) is read as "distance 3, reached via vertex (1)".
Iteration 3: note that (3) is a vertex, which is why it's enclosed in parentheses; the
smallest tentative distance after iteration 2 (besides the already-visited vertex (2)) is at (3).
Thus, the shortest path is (1)(2)(3)(5)(6) and it has length 12.
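A heap-based sketch of the procedure traced above; since the original figure is unavailable, the example graph `g` below is an assumed reconstruction consistent with the table (it reproduces the final distances and the length-12 path to vertex 6).

```python
# Sketch of Dijkstra's shortest path with a priority queue.
import heapq

def dijkstra(graph, source):
    """graph: {u: [(v, weight), ...]}; returns {vertex: shortest distance}."""
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)        # vertex with smallest tentative distance
        if d > dist.get(u, float('inf')):
            continue                    # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd            # relax the edge u -> v
                heapq.heappush(pq, (nd, v))
    return dist

# Assumed graph matching the trace:
g = {1: [(2, 3), (3, 6)], 2: [(3, 2), (4, 5)], 3: [(5, 2)], 4: [(6, 5)], 5: [(6, 5)]}
```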

Bellman-Ford algorithm


Negative cycles
a->b weight -10
b->c weight 1
c->a weight 2
This is a negative cycle because the edge weights sum to a negative number (-7). Each trip
around the cycle decreases the total by 7: -7, -14, -21, ... toward −∞.
Thus, shortest paths aren't defined when there are negative cycles.

Bellman-Ford can handle negative edges but not negative cycles.

Initialization of Bellman-Ford
d[s] = 0

For all other vertices v, v ≠ s:

d[v] = ∞

The shortest-path estimates to all nodes except the source are set to infinity.

Relaxation
After initialization, every edge is considered for relaxation. Relaxation means to check whether
the path that is pointed by the edge can be shortened.
if d(u)+cost(u, v) < d(v)
d(v) = d(u) + cost(u, v)

After first iteration, shortest path among the paths from s to all its immediate neighbors that are
one hop away is updated.
After second iteration, all the vertices that are connected to s by two hops are updated.
This process continues n − 1 times.

Check for negative edge cycle


After the n-1 relaxation passes, for every edge v1v2 do this:
check whether d(v2) > d(v1) + cost(v1, v2).
If this inequality holds for some edge, there are negative cycles in the graph; otherwise not.
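The initialization, relaxation passes, and negative-cycle check described above fit in one short function; the edge-list format is illustrative.

```python
# Sketch of Bellman-Ford with the final negative-cycle check.
def bellman_ford(n, edges, source):
    """edges: list of (u, v, cost) on vertices 0..n-1.
    Returns (dist, has_negative_cycle)."""
    INF = float('inf')
    dist = [INF] * n
    dist[source] = 0                        # d[s] = 0, all others ∞
    for _ in range(n - 1):                  # n-1 relaxation passes
        for u, v, cost in edges:
            if dist[u] + cost < dist[v]:    # relaxation step
                dist[v] = dist[u] + cost
    # extra pass: if any edge can still be relaxed, a negative cycle exists
    for u, v, cost in edges:
        if dist[u] + cost < dist[v]:
            return dist, True
    return dist, False
```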
(figure not available)

Find the shortest path from v1 to v5.


Iteration 01:
Distance to the source vertex is set to zero. For every other vertex, it's set to ∞.

v1 v2 v3 v4 v5
d 0 ∞ ∞ ∞ ∞
pi / / / / /

Iteration 02:
Relax every edge one hop from v1.
i.e.
Relax v1v2, v1v3, v1v4.

v1 v2 v3 v4 v5
d 0 3 4 1 ∞
pi / v1 v1 v1 /

How do we get here?

Find d[v2]: it's 3, via edge v1v2 (note we're looking only one hop away).
Find d[v3]: it's 4, from v1.
Likewise d[v4]: it's 1, from v1.

Iteration 03:
Relax every edge two hop away from v1.
i.e. Relax v2v3,v2v5,v3v4,v4v5.

v1 v2 v3 v4 v5
d 0 3 -4 -3 3
pi / v1 v2 v3 v4

To reach v3, we can take either
v1-v2 + v2-v3 OR v1-v3.
The shorter of the two goes through v2,
so we reach v3 from v2 instead of v1.
To reach v4, we can take either
v1-v4 OR v1-v3 + v3-v4.
Go via v3 (both options are equal here).
To reach v5: v2-v5 vs v2-v3 + v3-v4 + v4-v5 vs v4-v5.
Choose the second path (through v3 and v4).

Transitive closure
It finds all paths in a graph, i.e. the reachability between every pair of vertices.
The meaningful question about connectivity is: what vertices can you reach if you start at a
particular vertex?

Warshall's algorithm
If you can get from vertex L to vertex M, and you can get from M to N, then you can get from L
to N.
Step 01: Draw the adjacency matrix

A B C D E
A 0 0 1 0 0
B 1 0 0 0 1
C 0 0 0 0 0
D 0 0 0 0 1
E 0 0 1 0 0

Row A
Go to each column till you find 1.
Found 1 at C.
A->C there's a path.
That means if X->A exists, then X->C exists by transitivity.
To find X->A exists, see column A and find a value which is 1.
B->A is 1.
It means B->A exists, A->C exists thus B->C exists as well.

A B C D E
A 0 0 1 0 0
B 1 0 1 0 1
C 0 0 0 0 0
D 0 0 0 0 1
E 0 0 1 0 0
Row B
Go to each column till you find 1.
B->A there's a path.
That means if X->B exists, then X->A exists by transitivity.
To find X->B exists, see column B and find a value which is 1. There's no 1 in B.
Move ahead.
B->C there's a path.
But since there's no path from X->B, we don't need to check any further.

Row C
No path

Row D
D->E exists.
That means if X->D exists, then X->E exists by transitivity.
To find whether X->D exists, see column D and find a value which is 1. None. Thus skip.

Row E
E->C exists.
That means if X->E exists, then X->C exists by transitivity.
To find whether X->E exists, see column E and find a value which is 1. It's B and D
D->E exists
E->C exists
Thus D->C exists.
B->E exists
E->C exists
B->C exists (it's already been generated).
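The row-by-row reasoning above is exactly Warshall's triple loop; here is a minimal sketch run on the same matrix, with vertices A..E mapped to indices 0..4.

```python
# Sketch of Warshall's transitive closure: if i->k and k->j, then i->j.
def warshall(A):
    """Returns the reachability matrix: R[i][j] = 1 iff a path i -> j exists."""
    n = len(A)
    R = [row[:] for row in A]
    for k in range(n):              # intermediate vertex
        for i in range(n):
            for j in range(n):
                if R[i][k] and R[k][j]:
                    R[i][j] = 1
    return R

A = [[0, 0, 1, 0, 0],   # A->C
     [1, 0, 0, 0, 1],   # B->A, B->E
     [0, 0, 0, 0, 0],
     [0, 0, 0, 0, 1],   # D->E
     [0, 0, 1, 0, 0]]   # E->C
```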

Tree
hierarchical relationship is best explained by "tree".
non-linear data structure.

A tree having m nodes has exactly m-1 edges or branches.
A tree with m nodes can be thought of as a tree with m vertices.
A tree with 1 vertex has 0 edges = (1-1).
A tree with 2 vertices has 1 edge = (2-1).
Assume this statement is true:
a tree with k vertices has (k-1) edges.
Prove that a tree with k+1 vertices has (k+1)-1 edges.
Adding 1 vertex to a tree adds exactly 1 edge (the edge joining the new vertex to the tree).
So the edge count is (k-1)+1 = (k+1)-1.
Hence proved.

The maximum number of nodes on level l of a binary tree is 2^l, l >= 0.
The maximum number of nodes in a binary tree of height h is 2^{h+1}-1, h >= 0.
Binary tree representation
sequential
linked list

Sequential representation of tree

This binary tree can be represented sequentially as

1 L
2 M
3 N
4 O
5 P
6 Q
7 R
8 S
9 T
10 U
11 V
12 W
13 X
14 Y
15 Z

To find information of node with index i , in the sequential representation, use the formulas
given below.
The parent of i is floor(i/2). For example, the parent of O(4) is M(2).
The left child of i is 2i. For example, the left child of 7 (R) is 14 (Y). If 2i > n, there's no left
child.
The right child of i is 2i+1. For example, the right child of 7 (R) is 15 (Z). If 2i+1 > n, there's no
right child.
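The index formulas can be written as tiny helper functions; the array is 1-indexed as in the notes, and the function names are illustrative.

```python
# Sketch: index arithmetic for the sequential (array) representation.
def parent(i):
    return i // 2            # floor(i/2)

def left(i, n):
    return 2 * i if 2 * i <= n else None       # no left child if 2i > n

def right(i, n):
    return 2 * i + 1 if 2 * i + 1 <= n else None

n = 15                       # nodes L..Z numbered 1..15
```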

Linked List Representation


This looks like the original tree itself.
The left pointer points to the left child's node. If there's no left child, it points to
null.
The right pointer points to the right child's node. If there's no right child, it points to null.

Tree traversal
pre-order
1st root visit, then left subtree then right subtree
L,M,N
in-order
1st left hand subtree, then root, then right hand subtree
M,L,N
post-order
1st left subtree, then right subtree, then root
M,N,L

Searching in binary tree


Searching in ordered binary tree is explained below.

if(tree is empty)
    return not found
if(root node == search target)
    return found
if(search target < root)
    return btsearch(left-child, search target)
if(search target > root)
    return btsearch(right-child, search target)
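A runnable version of the pseudocode above; representing each node as a `(left, value, right)` tuple is an assumed encoding for illustration.

```python
# Sketch: search in an ordered binary tree; a node is (left, value, right),
# and an empty tree is None.
def btsearch(tree, target):
    if tree is None:                 # tree is empty
        return False                 # not found
    left, value, right = tree
    if value == target:
        return True                  # found
    if target < value:
        return btsearch(left, target)
    return btsearch(right, target)
```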

Insertions in binary search Tree


A search tree is ordered in a certain way, and this is what will come in the exam.
(figure not available)

Insert 20 in this graph.

20<66
Move left of root node
20<40
Move left of node 40
20<30
Move left of node 30
Place it there.
(figure not available)

Deletions in binary search tree


Case 1:
Say you want to delete node 75 (a node without a child).
Make the left pointer of 90 NULL.
Case 2:
Delete 110, i.e. a node with only 1 child.
Make 90's right pointer point to 120 instead of 110.
Case 3:
A node with two children; this needs knowledge of in-order traversal.
(figure not available)

The in-order traversal of this tree will be as follows:

C D E G J K L N P R T U W Y
If we want to delete N, we can replace N with either L or P.
L = the rightmost node of the left subtree (the in-order predecessor)
P = the leftmost node of the right subtree (the in-order successor)

AVL balanced tree


An AVL tree is always well-balanced because the heights of the left and right subtrees of every
node do not differ by more than one.
Balance factor:

the parameter that is used to decide when to rebalance the tree.

A node's balance factor = (height of the node's left subtree) − (height of the node's
right subtree).
For a balanced tree, balance factors are either 1, 0 or -1.
If the balance factor is +1, it's a left-high node.
If the balance factor is 0, it's an even-high node.
If the balance factor is -1, it's a right-high node.

Insertion in AVL tree example


Insert 33 using Binary Search Tree Insertion procedure.

Calculate the balance factor starting from the leaf node.

33: 0
35: 1
47: 0
40:0
50: 3-1=2
Node 50 violates the balance condition. To restore balance, perform a right rotation on node 50:
50 goes toward 63,
40 moves up into 50's position,
35 goes under 40,
33 goes under 35,
47 is orphaned.

Now, put 47 in 50's left subtree as per the binary search tree insertion algorithm.

Huffman algorithm

Character Frequency
a 16
b 5
c 12
d 17
e 10
f 25

Sort them in ascending order.


b,5
e,10
c,12
a,16
d,17
f,25

Take b,e and create a new node n1.


(figure not available)

Now remove b & e from the priority queue & add n1 to it.

c,12
n1,15
a,16
d,17
f,25

Take c & n1; repeat the same thing as above.


(figure not available)

Remove n1,c from the priority queue & add n2 to the priority queue(And sort it as well).

a,16
d,17
f,25
n2,27

Take a and d. Sum is 33.

Now add n3 to the Priority Queue while removing a & d.


f,25
n2,27
n3,33

Remove f,n2 from priority queue like this. Add n4,52 instead.

Now, priority queue looks like

n3,33
n4,52

Add them up: 33+52=85. The root node n5,85 is created above them.

For each left branch, put its label as 0; for each right branch, put its label as 1.
Reading the labels from the root down, c will be coded as 000,
b will be coded as 0010, and so on.

Insertion sort
30,25,15,20,28
Assume 30 is sorted and rest is unsorted
Insert 25 into sorted array. And then sort it.
25,30|15,20,28
Insert 15 into sorted array. And then sort it.
15,25,30|20,28
Insert 20 into sorted array. And then sort it.
15,20,25,30|28
Insert 28 into sorted array. And then sort it.
15,20,25,28,30

This is final sorted array.
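The walkthrough above can be sketched as:

```python
# Sketch of insertion sort: a[:i] is the sorted part; each pass inserts
# the next element into its correct position.
def insertion_sort(a):
    a = list(a)
    for i in range(1, len(a)):
        key = a[i]                       # element to insert
        j = i - 1
        while j >= 0 and a[j] > key:     # shift larger elements right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a
```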

Selection sort
7|4|3|6|5
Iteration 01:
7 vs 4 --> 4
4 vs 3 --> 3
3 vs 6 --> 3
3 vs 5 --> 3
Smallest=3
Swap 3 with 7
3,4,7,6,5
Iteration 02:
4 vs 7 --> 4
4 vs 6 --> 4
4 vs 5 --> 4
Smallest=4
Keep as it is
3,4,7,6,5
Iteration 03:
7 vs 6 --> 6
6 vs 5 --> 5
Smallest=5
Swap 5 with 7
3,4,5,6,7
Iteration 04:
6 vs 7 --> 6
Sorted finally.
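The iterations above can be sketched as:

```python
# Sketch of selection sort: each pass finds the smallest remaining element
# and swaps it into place.
def selection_sort(a):
    a = list(a)
    for i in range(len(a) - 1):
        smallest = i
        for j in range(i + 1, len(a)):   # compare with the rest
            if a[j] < a[smallest]:
                smallest = j
        a[i], a[smallest] = a[smallest], a[i]
    return a
```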

Merge sort
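The notes leave this section blank; a minimal merge sort sketch (split in half, sort each half recursively, merge the two sorted halves):

```python
# Sketch of merge sort.
def merge_sort(a):
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left = merge_sort(a[:mid])       # sort each half recursively
    right = merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:      # merge the two sorted halves
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```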
Shell sort
Sort this:
34,12,22,09,04,60,56,14
Soln:
Take gap=4,
34|xx|xx|xx|04|xx|xx|xx
xx|12|xx|xx|xx|60|xx|xx
xx|xx|22|xx|xx|xx|56|xx
xx|xx|xx|09|xx|xx|xx|14
Sort it line by line
04|xx|xx|xx|34|xx|xx|xx
xx|12|xx|xx|xx|60|xx|xx
xx|xx|22|xx|xx|xx|56|xx
xx|xx|xx|09|xx|xx|xx|14

Take gap=4/2=2
04|xx|22|xx|34|xx|56|xx
xx|12|xx|09|xx|60|xx|14
xx|xx|xx|xx|xx|xx|xx|xx
xx|xx|xx|xx|xx|xx|xx|xx
Sort it line by line.
04|xx|22|xx|34|xx|56|xx
xx|09|xx|12|xx|14|xx|60
xx|xx|xx|xx|xx|xx|xx|xx
xx|xx|xx|xx|xx|xx|xx|xx

Take gap=2/2=1
And apply insertion sort
Write the sorted array as the answer.
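The gap-by-gap procedure above amounts to a gapped insertion sort; this sketch uses the same gap sequence 4, 2, 1.

```python
# Sketch of shell sort: insertion sort over elements gap apart, with the
# gap halved each round.
def shell_sort(a, gaps=(4, 2, 1)):
    a = list(a)
    for gap in gaps:
        for i in range(gap, len(a)):         # gapped insertion sort
            key = a[i]
            j = i
            while j >= gap and a[j - gap] > key:
                a[j] = a[j - gap]
                j -= gap
            a[j] = key
    return a
```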

Heap sort
Heap is essentially complete binary tree, such that:

the values stored at the nodes come from an ordered set.


the value stored at each node is greater than or equal to the values stored at its children.
This is called the heap property.
Essentially complete binary tree:
a binary tree that is complete down to depth d-1.
(A complete binary tree is one where every leaf has depth d & every non-leaf node
has two children.)
The nodes at depth d are as far to the left as possible.
Max Heap
parent is more than or equals to children everywhere.
Min heap
parent is lesser than or equals to children everywhere.
implementation of a heap as an array (1-indexed)
parent(n)=floor(n/2)
left child of n=2n
right child of n=2n+1

Sift up operation on heap


Source of deleted image: Algorithms: Design Techniques and Analysis
Algorithm pseudocode

while(child > parent)
    interchange child and parent
    go to the next parent (i/2)
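The pseudocode above, sketched on a 1-indexed array max-heap (index 0 is unused, so that parent(i) = i//2 works directly):

```python
# Sketch of sift-up: bubble the value at index i up while it is larger
# than its parent, restoring the max-heap property.
def sift_up(heap, i):
    while i > 1 and heap[i] > heap[i // 2]:   # child > parent
        heap[i], heap[i // 2] = heap[i // 2], heap[i]
        i //= 2                               # go to the next parent
    return heap
```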

Sift down operation on heap
