Trees and Graphs
Complete tree
Inorder Traversal:
In the case of inorder traversal, the root of each
subtree is visited after its left subtree has been
traversed but before the traversal of its right
subtree begins.
The steps for traversing a binary tree in inorder
traversal are:
Traverse the left subtree, using inorder.
Visit the root.
Traverse the right subtree, using inorder.
Preorder Traversal:
In a preorder traversal, each node is visited before
its left and right subtrees are traversed.
Preorder traversal is also called depth-first
order. The steps for traversing a binary tree in
preorder traversal are:
Visit the root.
Traverse the left subtree, using preorder.
Traverse the right subtree, using preorder.
Postorder Traversal:
In a postorder traversal, each root is visited
after its left and right subtrees have been
traversed.
The steps for traversing a binary tree in
postorder traversal are:
Traverse the left subtree, using postorder.
Traverse the right subtree, using postorder.
Visit the root.
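The three traversal orders can be sketched in Python; the Node class and the three-node example tree are illustrative helpers, not from the text:

```python
class Node:
    """A binary tree node (illustrative helper)."""
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def inorder(node, out):
    if node is not None:
        inorder(node.left, out)    # 1. traverse the left subtree
        out.append(node.value)     # 2. visit the root
        inorder(node.right, out)   # 3. traverse the right subtree

def preorder(node, out):
    if node is not None:
        out.append(node.value)     # root first
        preorder(node.left, out)
        preorder(node.right, out)

def postorder(node, out):
    if node is not None:
        postorder(node.left, out)
        postorder(node.right, out)
        out.append(node.value)     # root last

# Example tree:   A
#                / \
#               B   C
root = Node("A", Node("B"), Node("C"))
# inorder gives B, A, C; preorder gives A, B, C; postorder gives B, C, A
```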
Example 1:
Traverse the following binary tree in pre, post and in-
order.
Example 2:
Traverse the following binary tree in pre, post, inorder and
level order.
Example 3
Heap tree
Type of heaps
[Figure: an example min heap and an example max heap]
Priority queue
A priority queue can be implemented using a circular
array, a linked list, etc.
Another, simpler implementation is possible
o using a heap tree;
o the heap, in turn, can be represented using
an array. This implementation is therefore
free from the complexities of circular arrays
and linked lists
o while keeping the simplicity of an
array.
Heap trees also allow duplicate values.
Elements, together with their priority values, are
stored in the form of a heap tree, which is
built based on those priority values.
The top priority element that has to be processed
first is at the root; so it can be deleted and heap
can be rebuilt to get the next element to be
processed, and so on.
As an illustration, consider the following processes with
their priorities:
Process P1 P2 P3 P4 P5 P6 P7 P8 P9 P10
Priority 5 4 3 4 5 5 3 2 1 5
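As a sketch of the heap-based implementation, the processes above can be stored in an array-backed heap using Python's heapq module; heapq is a min-heap, so the priorities are negated to bring the highest-priority element to the root (the tie-breaking by process name is an artifact of this sketch, not something the text specifies):

```python
import heapq

# The processes and priorities from the table above.
processes = ["P1", "P2", "P3", "P4", "P5", "P6", "P7", "P8", "P9", "P10"]
priorities = [5, 4, 3, 4, 5, 5, 3, 2, 1, 5]

# heapq stores the heap in a plain list, so this implementation is free
# from the complexities of circular arrays and linked lists. Negating
# the priority puts the highest-priority element at the root.
heap = [(-pr, p) for pr, p in zip(priorities, processes)]
heapq.heapify(heap)

order = []
while heap:
    _, p = heapq.heappop(heap)   # delete the root; the heap is rebuilt
    order.append(p)

# The priority-5 processes (P1, P5, P6, P10) come out first and
# P9 (priority 1) last; ties are broken arbitrarily (here by name).
```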
Figure 2.6.
Searching a binary tree
Why use binary search trees?
Binary search trees provide an efficient way to
search through an ordered collection of items.
Consider the alternative of searching an ordered
list.
The search must proceed sequentially from one
end of the list to the other.
o On average, n/2 nodes must be compared for
an ordered list that contains n nodes.
In the worst case, all n nodes might need
to be compared.
For a large collection of items, this
can get very expensive.
The inefficiency is due to the one-dimensionality
of a linked list.
We would like to have a way to jump into the
middle of the list, in order to speed up the search
process.
In essence, that’s what a binary search tree does.
The longest path we will ever have to search is
equal to the height of the tree.
The efficiency of a binary search tree thus
depends on the height of the tree.
o For a tree holding n nodes, the smallest
possible height is about log2(n).
o For 2 nodes the height is 1; for 4 nodes, 2;
for 8 nodes, 3; and so on.
To obtain the smallest height, a tree must be balanced.
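A sketch of why the height bounds the search cost; the BSTNode class and the example keys are mine, not from the text:

```python
class BSTNode:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def bst_search(node, key):
    """Return (found, comparisons); each comparison descends one level,
    so comparisons never exceed the number of levels in the tree."""
    steps = 0
    while node is not None:
        steps += 1
        if key == node.key:
            return True, steps
        node = node.left if key < node.key else node.right
    return False, steps

# A balanced tree with 7 keys and 3 levels:
#         4
#       /   \
#      2     6
#     / \   / \
#    1   3 5   7
root = BSTNode(4,
               BSTNode(2, BSTNode(1), BSTNode(3)),
               BSTNode(6, BSTNode(5), BSTNode(7)))
found, steps = bst_search(root, 7)   # found in 3 comparisons, not 7
```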
Balanced Trees:
For maximum efficiency, a binary search tree
should be balanced.
Severely unbalanced trees are referred to as
degenerate trees, so called because their
searching performance degenerates to that of
linked lists.
Figure 2.14 shows an example.
Balancing Acts:
There are two basic ways used to keep trees balanced.
Use tree rotations.
Allow nodes to have more than two children.
Tree Rotations:
Certain types of tree-restructurings, known as rotations,
can aid in balancing trees.
Figure 2.16 shows two types of single rotations.
Figure 2.16. Single Rotations
Another type of rotation is known as a double
rotation. Figure 2.17 shows the two symmetrical
cases.
[Figure 2.17 diagram: double rotations involving nodes a, b and c]
AVL Tree
An AVL tree is a height balanced Binary Search Tree.
In an ordinary BST, if the elements arrive almost in order,
there are many null branches; this leads to more levels and,
in turn, needs more space.
This problem is solved by balancing the height whenever a
node is inserted into an AVL tree.
Re-balancing is performed based on the balance
factor.
Balance factor
The balance factor of a node is the height of its left subtree
minus the height of its right subtree; in an AVL tree it must
be -1, 0 or +1 for every node.
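As a sketch, taking the balance factor of a node to be the height of its left subtree minus the height of its right subtree (a common convention; the Node class and the height conventions here are my assumptions):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def height(node):
    # Convention: the empty tree has height -1, a single leaf height 0.
    if node is None:
        return -1
    return 1 + max(height(node.left), height(node.right))

def balance_factor(node):
    # In an AVL tree this must stay in {-1, 0, +1} for every node.
    return height(node.left) - height(node.right)

# A degenerate chain 1 -> 2 -> 3 is unbalanced at its root:
chain = Node(1, right=Node(2, right=Node(3)))
# balance_factor(chain) is -2, so a rebalancing rotation is required
```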
B-trees
Graphs
Graph G is a pair (V, E), where V is a finite set
(set of vertices) and E is a finite set of pairs from
V (set of edges). We will often denote n := |V|,
m := |E|.
Graph G can be directed, if E consists of ordered
pairs, or undirected, if E consists of unordered
pairs. If (u, v) ∈ E, then vertices u and v are
adjacent.
We can assign a weight function to the edges:
w_G(e) is the weight of edge e ∈ E. A graph that
has such a function assigned is called weighted.
The degree of a vertex v is the number of vertices u
for which (u, v) ∈ E (denoted deg(v)). The number
of incoming edges to a vertex v is called the in-
degree of the vertex (denoted indeg(v)). The
number of outgoing edges from a vertex is called
the out-degree (denoted outdeg(v)).
Definition: The degree of a vertex in an
undirected graph is the number of edges
incident with it, except that a loop at a vertex
contributes twice to the degree of that vertex.
In other words, you can determine the degree of
a vertex in a displayed graph by counting the
lines that touch it.
The degree of the vertex v is denoted by deg(v).
Example: What are the in-degrees and out-degrees of the vertices a, b,
c, d in this graph?
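Since the example's figure is not reproduced here, the counting itself can be sketched on a hypothetical edge list (the edges below are my invention, not the figure's):

```python
# Hypothetical directed edges (u, v) meaning an edge from u to v.
edges = [("a", "b"), ("a", "c"), ("b", "c"), ("d", "b")]

indeg = {}
outdeg = {}
for u, v in edges:
    outdeg[u] = outdeg.get(u, 0) + 1   # one more edge leaving u
    indeg[v] = indeg.get(v, 0) + 1     # one more edge entering v

# Here vertex b has in-degree 2 and vertex a has out-degree 2.
```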
Representation of Digraphs
Just like graphs, digraphs can be represented
using:
o Linear list
o Adjacency matrix
Adjacency matrix vs adjacency list
An adjacency matrix stores the edges in a table,
with rows and columns indexed by vertices.
Comparing the matrix and the list:
o An adjacency matrix requires Θ(n²)
storage while an adjacency list requires Θ
(n + e) storage.
However, adjacency matrices allow faster access for
edge queries (for example, is (u, v) ∈ E?),
while adjacency lists allow faster iteration over the
neighbours of a vertex.
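A small sketch of this trade-off; the four-vertex digraph is illustrative:

```python
n = 4
edges = [(0, 1), (0, 2), (1, 2), (3, 1)]

# Adjacency matrix: Theta(n^2) storage, constant-time edge queries.
matrix = [[0] * n for _ in range(n)]
for u, v in edges:
    matrix[u][v] = 1

# Adjacency list: Theta(n + e) storage; iterating the neighbours of u
# costs O(deg(u)) instead of O(n).
adj = [[] for _ in range(n)]
for u, v in edges:
    adj[u].append(v)

edge_query = matrix[0][2] == 1   # is (0, 2) an edge? answered in O(1)
neighbours = adj[0]              # the out-neighbours of vertex 0
```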
Connectivity
Disadvantages:
It works very well when the search graphs are trees
or lattices, but can get stuck in an infinite loop
on general graphs.
o This is because depth first search can travel
around a cycle in the graph forever.
o To eliminate this, keep a list of states
previously visited, and never permit the
search to return to any of them.
Another problem is that the state space tree
may be of infinite depth;
to prevent consideration of paths that are too
long, a maximum is often placed on the depth
of nodes to be expanded, and any node at that
depth is treated as if it had no successors.
Depth-first search also cannot guarantee the
shortest solution to the problem.
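The two fixes described above, a visited list and a depth bound, can be sketched together; the function name and the example graph are mine:

```python
def depth_limited_dfs(graph, start, goal, max_depth):
    """DFS with a visited set (to survive cycles) and a depth bound
    (to survive infinite state spaces)."""
    visited = set()

    def dfs(node, depth):
        if node == goal:
            return True
        if depth >= max_depth or node in visited:
            return False          # treat the node as having no successors
        visited.add(node)
        return any(dfs(nxt, depth + 1) for nxt in graph.get(node, []))

    return dfs(start, 0)

# A graph with the cycle a -> b -> a; without the visited set,
# plain DFS could travel around this cycle forever.
graph = {"a": ["b"], "b": ["a", "c"], "c": []}
found = depth_limited_dfs(graph, "a", "c", max_depth=5)   # True
```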
real-world examples:
One practical application of a MST would be in the
design of a network.
For instance, a group of individuals, who are
separated by varying distances, wish to be
connected together in a telephone network.
Although an MST cannot do anything about the
distance from one connection to another, it can be used
to determine the least-cost set of connections with no
cycles in this network, thereby connecting everyone at a
minimum cost.
Another useful application of MST would be finding
airline routes.
The vertices of the graph would represent
cities, and the edges would represent routes
between the cities.
Obviously, the further one has to travel, the more it will
cost, so an MST can be applied to optimize airline routes by
finding the least costly set of routes with no cycles.
Kruskal’s Algorithm
This is a greedy algorithm. A greedy algorithm chooses some
local optimum (i.e. picking the edge with the least weight at
each step).
Kruskal's algorithm works as follows:
o Take a graph with 'n' vertices, keep on adding the
shortest (least cost) edge, while avoiding the creation of
cycles, until (n - 1) edges have been added.
o Sometimes two or more edges may have the same cost.
The order in which the edges are chosen, in this
case, does not matter.
o Different MSTs may result, but they will all have the
same total cost, which will always be the minimum cost.
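The steps above can be sketched with a simple union-find structure to detect cycles; the edge weights below are illustrative, not those of the text's figure:

```python
def kruskal(n, edges):
    """n vertices labelled 0..n-1; edges is a list of (weight, u, v).
    Returns the chosen edges and their total cost."""
    parent = list(range(n))

    def find(x):                    # root of x's component, with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst, total = [], 0
    for w, u, v in sorted(edges):   # consider edges in increasing cost
        ru, rv = find(u), find(v)
        if ru != rv:                # skip edges that would create a cycle
            parent[ru] = rv
            mst.append((u, v))
            total += w
            if len(mst) == n - 1:   # stop once n - 1 edges are added
                break
    return mst, total

# Illustrative 4-vertex graph: the MST uses the edges of cost 1, 2 and 3.
edges = [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]
mst, total = kruskal(4, edges)     # total cost 6 with 3 edges
```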
Example
In case of parallel edges, keep the one which has the least cost
associated and remove all others.
The least cost is 2 and edges involved are B,D and D,T.
We add them. Adding them does not violate spanning tree
properties, so we continue to our next edge selection.
Next cost is 3, and the associated edges are A,C and C,D. We add
them as well.
The next cost in the table is 4, but we observe that adding it would
create a cycle in the graph, so we reject it.
Now we are left with only one node to be added. Between the two
least cost edges available 7 and 8, we shall add the edge with cost
7.
By adding edge S,A we have included all the nodes of the graph,
and we now have a minimum cost spanning tree.
Example
The vertices included in MST are shown in green color.
Pick the vertex with the minimum key value that is not already
included in the MST (not in mstSet).
The vertex 1 is picked and added to mstSet.
o So mstSet now becomes {0, 1}. Update the key values
of adjacent vertices of 1. The key value of vertex 2
becomes 8.
Pick the vertex with the minimum key value that is not already included
in the MST (not in mstSet).
o We can either pick vertex 7 or vertex 2; say vertex 7 is picked. So
mstSet now becomes {0, 1, 7}.
o Update the key values of adjacent vertices of 7. The key value of
vertex 6 and 8 becomes finite (1 and 7 respectively).
Pick the vertex with the minimum key value that is not already included
in the MST (not in mstSet). Vertex 6 is picked.
o So mstSet now becomes {0, 1, 7, 6}. Update the key values of
adjacent vertices of 6. The key value of vertex 5 and 8 are
updated.
We repeat the above steps until mstSet includes all vertices of the
given graph, which yields the complete MST.
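The vertex-selection procedure in this example is Prim's algorithm; it can be sketched with a heap holding the key values (the small graph below is mine, not the figure's nine-vertex graph):

```python
import heapq

def prim(adj, start=0):
    """adj[u] is a list of (v, weight) pairs. Returns the MST's total
    weight, repeatedly picking the minimum-key vertex not yet in the MST."""
    n = len(adj)
    in_mst = [False] * n              # the mstSet of the text
    heap = [(0, start)]               # (key value, vertex)
    total = 0
    while heap:
        key, u = heapq.heappop(heap)  # minimum key value
        if in_mst[u]:
            continue                  # already included; skip stale entry
        in_mst[u] = True
        total += key
        for v, w in adj[u]:           # update key values of u's neighbours
            if not in_mst[v]:
                heapq.heappush(heap, (w, v))
    return total

adj = [
    [(1, 2), (3, 6)],                 # vertex 0: edges to 1 (w=2), 3 (w=6)
    [(0, 2), (2, 3), (3, 8)],
    [(1, 3)],
    [(0, 6), (1, 8)],
]
total = prim(adj)                     # MST edges 0-1, 1-2, 0-3: total 11
```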
Example