
Unit IV- Trees

Introduction, Binary trees, Representing binary trees in memory, Traversing binary trees, Traversal algorithm using stack, Threads (Threaded binary trees), Binary search trees, Searching, inserting and deleting in binary search trees, Heap, Heapsort.

Balanced Trees: AVL trees, AVL balance factor, Balancing trees, AVL node structure, AVL insert, AVL delete walkthrough with examples, Other balanced tree variations (Splay trees and Red-Black trees).
Basic tree concepts
Definition: A tree consists of a finite set of
elements called nodes and a finite set of
directed lines called branches that connect
the nodes.
It is a nonlinear data structure where the
nodes are represented hierarchically.
Example:
Degree of a node: the number of branches associated with the node.
Indegree: the number of branches directed towards the node.
Outdegree: the number of branches directed away from the node.
Degree = Indegree + Outdegree
Root: if the tree is non-empty, the first node is called the root of the tree. Its indegree is zero.
• Except the root, all the nodes in a tree must have an indegree of exactly one.
• The outdegree of a node can be zero, one or more.
• Leaf: any node with an outdegree of zero. It is also called a terminal node.
• Internal node: a node which is neither the root nor a leaf. It is found in the middle portion of the tree.
Parent: a node is a parent if it has successor nodes (that is, its outdegree is > 0).
Child: a node is a child if it has a predecessor (i.e. its indegree is 1).
Siblings: two or more nodes with the same parent are called siblings.
Path: a sequence of nodes in which each node is adjacent to the next one.
Branch: a path ending with a leaf is called a branch.
• Level: the level of a node is its distance from the root. The root has a distance of zero from itself, so the level of the root is 0. Siblings are always at the same level.
• Height: the height of a tree is the level of the leaf in the longest path from the root, plus 1.
Subtrees: trees can be divided into subtrees. A subtree is any connected structure below the root. The first node of a subtree is called the root of the subtree. This definition of subtree leads to the recursive definition of a tree.

Definition:
• A tree is a set of nodes that is either empty or
has a designated node called the root from which
hierarchically descend zero or more subtrees, which are
also trees.

Example:
Binary Trees
A binary tree is a tree in which no node can have more than two subtrees. That is, a node can have zero, one or two subtrees, called the left subtree and the right subtree.
Each subtree is a binary tree by itself.
Complete tree: a complete tree has the maximum number of entries for its height; this is reached when the last level is full. A tree is considered nearly complete if all the nodes in the last level are found on the left.
Memory Representation

Two ways:
1. Array representation
2. Linked list representation
Array representation

The root is stored at Tree[1].

If a node occupies Tree[k], then its left child is stored at Tree[2*k] and its right child at Tree[2*k + 1].

Example:
Disadvantage: more memory may be needed to represent the tree, since index positions for missing nodes are left unused.
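As an illustration of this indexing scheme, a small C sketch (the array contents and helper names are illustrative only):

#include <stdio.h>

/* 1-based array representation of a binary tree.
   tree[1] holds the root; for the node at index k,
   the children (if present) sit at 2k and 2k+1.    */
int left_child(int k)  { return 2 * k; }
int right_child(int k) { return 2 * k + 1; }
int parent(int k)      { return k / 2; }   /* integer division */

int main(void) {
    /* index:      0    1    2    3    4   (index 0 unused) */
    int tree[] = { 0,  50,  38,  77,  22 };
    int n = 4;                       /* number of nodes stored */
    int k = 2;                       /* node holding 38        */
    printf("parent of tree[%d]=%d is %d\n", k, tree[k], tree[parent(k)]);
    if (left_child(k) <= n)
        printf("left child is %d\n", tree[left_child(k)]);
    return 0;
}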
Linked list representation
• Uses three parallel arrays (INFO, LEFT and
RIGHT)
• Pointer variable ROOT which will contain the
location of the root R of tree, T
• INFO[K] contains the data at the node N
• LEFT[K] contains the location of the left child
of N
• RIGHT[K] contains the location of the right
child of N
Advantages:
1. Direct access to the root as well as to a node's children.
2. Memory is not wasted on missing nodes.

INFO[K], LEFT[K] and RIGHT[K] together describe the node stored at location K.
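The slides describe the linked representation with three parallel arrays (INFO, LEFT, RIGHT); in C the same idea is usually coded with a self-referential structure. A minimal sketch, with illustrative field names:

#include <stdlib.h>

/* One tree node: data plus pointers to the left and right children. */
struct node {
    int data;
    struct node *left;
    struct node *right;
};

/* Allocate and initialise a node with no children (a leaf). */
struct node *make_node(int data) {
    struct node *n = malloc(sizeof *n);
    if (n != NULL) {
        n->data = data;
        n->left = n->right = NULL;
    }
    return n;
}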


Binary tree Traversals
Each node is visited exactly once.
There are two approaches:
1. BFS (Breadth First Search)
2. DFS (Depth First Search)
BFS traversal: processing proceeds horizontally from the root to all its children, then to its children's children. Each level is completely processed before the next level.
DFS traversal: all the descendants of a child are processed before going on to the next child; processing proceeds along a path.
There are three standard DFS traversals: Preorder, Inorder and Postorder.
Preorder Traversal (NLR)
Algorithm Preorder (val root <node pointer>)
This algorithm traverses a binary tree in node-left-
right sequence
Pre: Root is the entry node of the tree or subtree.
Post: Each node has been processed in preorder.
1. If (root is not null)
1. Process (Root)
2. Preorder (Root->left subtree)
3. Preorder (Root->right subtree)
2. Endif
3. Return
End Preorder

Example:
Inorder Traversal (LNR)
Algorithm Inorder(val root <node pointer>)
This algorithm traverses a binary tree in left-node
right sequence
Pre: Root is the entry node of the tree or subtree
Post: Each node has been processed in inorder
1. If (root is not null)
1. Inorder (Root->left subtree)
2. Process (Root)
3. Inorder (Root->right subtree)
2. Endif
3. Return
End Inorder
Example:
Postorder Traversal (LRN)
Algorithm Postorder (val root <node pointer>)
This algorithm traverses a binary tree in left- right
node sequence
Pre: Root is the entry node of the tree or subtree.
Post: Each node has been processed in postorder.
1. If (root is not null)
1. Postorder (Root->left subtree)
2. Postorder (Root->right subtree)
3. Process (Root)
2. Endif
3. Return
End Postorder
Example:
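A compact C version of the three recursive traversals above, assuming a pointer-based node like the one sketched earlier (here "process" simply prints the value):

#include <stdio.h>

struct node { int data; struct node *left, *right; };

static void visit(struct node *n) { printf("%d ", n->data); }

void preorder(struct node *root) {             /* N L R */
    if (root == NULL) return;
    visit(root);
    preorder(root->left);
    preorder(root->right);
}

void inorder(struct node *root) {              /* L N R */
    if (root == NULL) return;
    inorder(root->left);
    visit(root);
    inorder(root->right);
}

void postorder(struct node *root) {            /* L R N */
    if (root == NULL) return;
    postorder(root->left);
    postorder(root->right);
    visit(root);
}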
Expression trees
One of the applications of a binary tree is expression
tree. An expression tree is a binary tree with the
following properties.
– Each leaf is an operand
– The root and the internal nodes are operators.
– Subtrees are subexpressions with the root being
an operator.
A given arithmetic expression can be represented in the
form of a tree.
Ex: (a * (b + c)) + d
Expression trees
[Expression tree for (a * (b + c)) + d: the root + has the subtree for a * (b + c) on the left and the operand d on the right; the * node has children a and +, whose children are b and c.]
Expression trees

Advantage: three different expressions (infix, prefix and postfix) can easily be obtained just by traversing the tree in inorder, preorder and postorder respectively.

Ex: (a-b)/((c*d)+e). Draw the tree and derive its prefix and postfix expressions.
Expression trees
[Expression tree for (a-b)/((c*d)+e): root /, left child - with operands a and b, right child + with children * (over c and d) and e.]
Ex: p + (q*r) / ((s - t*u)^a) / b. Draw the expression tree. [Diagram omitted.]
Problems
1.The following prefix expression is given. Draw
the expression tree and find the infix and
postfix expressions.
* - A B + * C D / E F
2. The following postfix expression is given.
Draw the expression tree and find the infix
and prefix expressions.
A B * C D / + E F - *
Problems
Given
Preorder : A B D E C F
Inorder: D B E A C F
Construct the binary tree.
Problems
Given
Preorder: A B D E F C G H J L K
Inorder: D B F E A G C L J H K
Construct the binary tree. What is the balance
factor?
Problems
Given
Preorder: A B D E F G H I C J K L
Inorder: D B F H G I E A C K J L
Construct the binary tree and also derive its post
order.
Problems
Given

Pre: G B Q A C K F P D E R H
In: Q B K C F A G P E D H R
Problems
• A binary tree has 10 nodes. The inorder and
preorder traversal of the tree are shown
below. Draw the tree.
• Preorder: J C B A D E F I G H
• Inorder: A B C E D F J G I H
Breadth First Traversal

Algorithm breadthFirst (val root <node pointer>)
This algorithm processes the tree using breadth-first traversal.
Pre: root is a pointer to a tree node
Post: tree has been processed
1. pointer = root
2. Loop (pointer not null)
   1. process (pointer)
   2. if (pointer->left not null)
      1. enqueue (pointer->left)
   3. if (pointer->right not null)
      1. enqueue (pointer->right)
   4. if (not emptyQueue)
      1. dequeue (pointer)
   5. else
      1. pointer = null
3. Return
end breadthFirst
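A C sketch of the same breadth-first traversal; for simplicity it uses a fixed-size array as the queue (the capacity QSIZE is an assumed bound for this sketch, not part of the algorithm):

#include <stdio.h>

struct node { int data; struct node *left, *right; };

#define QSIZE 128   /* assumed upper bound on node count for this sketch */

void breadth_first(struct node *root) {
    struct node *queue[QSIZE];
    int head = 0, tail = 0;
    if (root == NULL) return;
    queue[tail++] = root;                  /* enqueue the root */
    while (head < tail) {
        struct node *p = queue[head++];    /* dequeue          */
        printf("%d ", p->data);            /* process          */
        if (p->left  != NULL) queue[tail++] = p->left;
        if (p->right != NULL) queue[tail++] = p->right;
    }
}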
Binary Search tree
Definition
A binary search tree is a binary tree with the following properties:
• All items in the left subtree (LST) are less than the root.
• All items in the right subtree (RST) are greater than or equal to the root.
• Each subtree is a binary search tree itself.
Why a BST ?
Array: efficient for binary search, but expensive for insertion/deletion.
Linked list: efficient for insertion/deletion but inefficient for search.

Solution: Binary Search Tree.


Example
Construct a binary search tree for the following
numbers.

50 38 44 22 77 35 60 90
Operations on BST
BST traversals: the three traversals of a binary tree are possible on a BST as well; here we are interested in the result they produce (in particular, that of the inorder method).
Example:
Consider the tree constructed above and traverse it in inorder.
What do we get? A sorted list.

Advantage: an inorder traversal of a BST visits the keys in ascending order, so it produces an ordered list in O(n) time.
BST Search Algorithms
There are three cases to handle:
1. Find the smallest node (the leftmost node, i.e. the node whose left pointer is null).
2. Find the largest node (the rightmost node, i.e. the node whose right pointer is null).
3. Search the tree for a given value.
Case 1
Algorithm FindsmallestBST
This algorithm finds the smallest node in BST.
Pre: Root is a pointer to the non empty BST or
subtree.
Post: Smallest node is obtained
Return: Address of smallest node
1.If (root->Left=null)
1. Return (Root)
2.endif
3. Return FindsmallestBST(root->Left)
end FindsmallestBST
Case 2
Algorithm FindLargestBST
This algorithm finds the largest node in BST.
Pre: Root is a pointer to the non empty BST or
subtree.
Return: Address of largest node
1. If (root->Right=null)
1. Return (Root)
2. endif
3. Return FindLargestBST(Root->Right)
End FindlargestBST
Case 3
Algorithm SearchBST
This algorithm searches a tree for a given value.
Pre: Root is a pointer to the BST or subtree.
Item is the key value requested.
Return: node address if the value is found.
null if the node is not in the tree.
1. If (root=null)
1. Return null
2. Endif
3. If (item < Root->key)
1.Return SearchBST(root->Left,item)
4. Else if (item > Root->key)
1.Return SearchBST(root->Right,item)
5 Else
1. Return Root
6. End if
End SearchBST
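All three search cases translate directly into C; this sketch follows the pseudocode above (recursive, returning a node pointer or NULL):

#include <stddef.h>

struct node { int key; struct node *left, *right; };

/* Case 1: keep following left links; the smallest key is in the
   leftmost node.  Pre-condition (as above): the tree is non-empty. */
struct node *find_smallest(struct node *root) {
    if (root->left == NULL) return root;
    return find_smallest(root->left);
}

/* Case 2: mirror image, follow right links. */
struct node *find_largest(struct node *root) {
    if (root->right == NULL) return root;
    return find_largest(root->right);
}

/* Case 3: binary search driven by key comparisons. */
struct node *search_bst(struct node *root, int item) {
    if (root == NULL) return NULL;           /* not found */
    if (item < root->key) return search_bst(root->left, item);
    if (item > root->key) return search_bst(root->right, item);
    return root;                              /* found */
}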
Insertion into a BST
1. We follow the branches to an empty Subtree
and then insert the new node.
2. All inserts take place at a leaf node
3. It is a recursive insert procedure.
Algorithm AddBST
This algorithm inserts a node containing the new
data into a BST using recursion.
Pre: Root is the address of current node in a BST.
New is the address of the node containing
data to be inserted.
Post: New node inserted into a tree.
1. If (root=null)
1. root=new
2. Else //Locate null subtree
1. If (new->key < root->key)
1.AddBST(root->left, new)
2. Else
1. AddBST(root->right, new)
3. Endif
3. End if
4. Return
end AddBST
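A C sketch of the recursive insert; instead of a reference parameter it returns the (possibly new) root of the subtree:

#include <stdlib.h>

struct node { int key; struct node *left, *right; };

/* Insert key into the subtree rooted at root and return the subtree's
   root.  Duplicates go to the right, matching the BST definition above
   (the right subtree holds keys >= the root).                          */
struct node *add_bst(struct node *root, int key) {
    if (root == NULL) {                        /* empty subtree found */
        struct node *n = malloc(sizeof *n);
        n->key = key;
        n->left = n->right = NULL;
        return n;
    }
    if (key < root->key)
        root->left = add_bst(root->left, key);
    else
        root->right = add_bst(root->right, key);
    return root;
}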
Deleting a node
To delete a node, we must first locate it. There
are four possible cases.
Case 1: The node has no children. Set the parent's link to the node to null, recycle the node's memory and return.
Case 2: The node to be deleted has only a right subtree. Simply attach the right subtree to the deleted node's parent and recycle the node's memory.
Case 3: The node to be deleted has only a left subtree. Attach the left subtree to the deleted node's parent and recycle the node's memory.

Case 4: The node has two subtrees. It can be handled in two ways:
1. Find the largest node in the deleted node's left subtree and move its data to replace the deleted node's data, OR
2. Find the smallest node in the deleted node's right subtree and move its data to replace the deleted node's data.
Algorithm Delete
Algorithm DeleteBST(ref root<pointer>,val
delkey <key>)
This algorithm deletes a node from a BST
Pre Root is a pointer to the tree containing the
data to be deleted, delkey is a key of the node
to be deleted.
Post Node deleted and memory recycled.
Return True if node deleted, false if not found.
1. If (root = null)
   1. Return false
2. Endif
3. If (delkey < root->key)
   1. Return DeleteBST(root->left, delkey)
4. Elseif (delkey > root->key)
   1. Return DeleteBST(root->right, delkey)
5. Else // node to delete found; test whether a subtree is missing
   1. If (root->left = null)
      1. delptr = root
      2. root = root->right
      3. recycle(delptr)
      4. Return true
   2. Elseif (root->right = null)
      1. delptr = root
      2. root = root->left
      3. recycle(delptr)
      4. Return true
   3. Else // node has two subtrees; find the largest node in the left subtree
      1. delptr = root->left
      2. Loop (delptr->right not = null)
         1. delptr = delptr->right
      3. root->data = delptr->data
      4. Return DeleteBST(root->left, delptr->data)
   4. Endif
6. Endif
End DeleteBST
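The same four-case delete as a C sketch; it uses a pointer-to-pointer so the parent's link can be updated in place, and follows option 1 above (copy the largest key of the left subtree):

#include <stdlib.h>
#include <stdbool.h>

struct node { int key; struct node *left, *right; };

/* Delete delkey from the subtree at *root.  Returns true if a node
   was found and removed; *root is updated in place.               */
bool delete_bst(struct node **root, int delkey) {
    struct node *r = *root;
    if (r == NULL) return false;                     /* not found   */
    if (delkey < r->key) return delete_bst(&r->left, delkey);
    if (delkey > r->key) return delete_bst(&r->right, delkey);

    if (r->left == NULL) {                           /* cases 1 & 2 */
        *root = r->right;
        free(r);
    } else if (r->right == NULL) {                   /* case 3      */
        *root = r->left;
        free(r);
    } else {                                         /* case 4      */
        struct node *big = r->left;                  /* largest in  */
        while (big->right != NULL) big = big->right; /* left subtree*/
        r->key = big->key;                           /* copy data   */
        return delete_bst(&r->left, big->key);       /* remove copy */
    }
    return true;
}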
Heap Sort
Suppose H is a complete binary tree with n elements. H is called a heap (or max-heap) if each node N of H has the following property:
the value at N is >= the value at each of the children of N.
Min-heap:
H is called a min-heap if the value at N is <= the value at each of the children of N.
Example: 50 45 48 44 42
Memory representation
Heap is maintained in memory in sequential
locations.
Ex: Tree[1] stores the root; for the node at Tree[k], Tree[2*k] holds the left child and Tree[2*k + 1] the right child.
Building the heap
Build a heap from the following list of numbers.

44 30 50 22 60 55 77 55
Inserting into a heap
Algorithm Insheap
This algorithm adds a new element into a heap.
Pre: Item is the new item to be added.
Post: Heap is being constructed
1. Set N = N+1 and PTR = N // add a new node to H and initialize PTR
2. Loop (PTR > 1) // find the location to insert the item
   1. PAR = floor(PTR/2)
   2. If (Item <= Tree[PAR])
      1. Tree[PTR] = Item and Return
   3. Endif
   4. Tree[PTR] = Tree[PAR] // move the parent down
   5. PTR = PAR // update PTR
3. Endloop
4. Tree[1] = Item // Item becomes the root of H
5. Return
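A C sketch of the same insert on a max-heap stored in a 1-based array tree[1..*n] (capacity checking omitted):

/* Insert item into a max-heap stored in tree[1..*n] (1-based).
   The new item starts at the new last slot; ancestors smaller than
   it are shifted down until the parent is >= item.                 */
void ins_heap(int tree[], int *n, int item) {
    int ptr = ++(*n);                  /* position of the new node  */
    while (ptr > 1) {
        int par = ptr / 2;             /* parent index              */
        if (item <= tree[par]) {       /* correct spot found        */
            tree[ptr] = item;
            return;
        }
        tree[ptr] = tree[par];         /* move the parent down      */
        ptr = par;
    }
    tree[1] = item;                    /* item becomes the new root */
}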
Deleting the root of the heap
Let H be a heap with N elements, and let R be the root, which is to be deleted.
1. Assign R to the variable Item.
2. Replace R by the last node L of H, so that H is still a complete binary tree (but possibly no longer a heap).
3. Reheap: sink L down into its appropriate place so that H is a heap again.
Example

If root is to be deleted, then the tree has to be


reheaped.

95 85 70 55 33 30 65 15 20 15 22
Algorithm
Algorithm Delheap
This algorithm assigns the root to a variable called Item
and then reheaps the remaining elements. LAST
saves the value of the original last node of H.
1. Item=Tree[1]
2. LAST=Tree[N] and N=N-1 //Remove the last node
3. PTR=1, Left=2, Right=3 // Initialize
4. Loop (Right <= N)
   1. If (Last >= Tree[Left] and Last >= Tree[Right])
      1. Tree[PTR] = Last and Return
   2. Endif
   3. If (Tree[Right] <= Tree[Left])
      1. Tree[PTR] = Tree[Left]
      2. PTR = Left
   4. Else
      1. Tree[PTR] = Tree[Right]
      2. PTR = Right
   5. Endif
   6. Left = PTR*2 and Right = Left+1
5. Endloop
6. If (Left = N and Last < Tree[Left])
   1. PTR = Left
7. Tree[PTR] = Last
8. Return
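A corresponding C sketch for deleting the root; it follows the reheap loop above, and in the final single-child case it also moves that child up into the hole before placing LAST, so no value is lost:

/* Remove and return the root of the max-heap tree[1..*n].
   The old last element is sifted down to its proper place. */
int del_heap(int tree[], int *n) {
    int item = tree[1];                /* root to be returned        */
    int last = tree[(*n)--];           /* save and remove last node  */
    int ptr = 1, left = 2, right = 3;
    while (right <= *n) {
        if (last >= tree[left] && last >= tree[right]) {
            tree[ptr] = last;
            return item;
        }
        if (tree[right] <= tree[left]) {    /* promote larger child  */
            tree[ptr] = tree[left];
            ptr = left;
        } else {
            tree[ptr] = tree[right];
            ptr = right;
        }
        left = ptr * 2;
        right = left + 1;
    }
    if (left == *n && last < tree[left]) {  /* one child at the end  */
        tree[ptr] = tree[left];             /* move it up first      */
        ptr = left;
    }
    tree[ptr] = last;
    return item;
}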
Applications of a heap
1. Application to sorting

Suppose A is an array with N elements. The heapsort algorithm sorts A in two phases:
Phase 1: Build a heap H out of the elements of A.
Phase 2: Repeatedly delete the root element of H, placing each deleted root just past the end of the shrinking heap.
Algorithm HeapSort
Algorithm HeapSort
An array A with N elements is given. This
algorithm sorts the elements of A.
1. Loop (J = 1 to N-1) // Phase 1: build the heap
   1. Call InsHeap(A, J, A[J+1])
2. End loop
3. Loop (N > 1) // Phase 2: repeatedly delete the root
   1. Call DelHeap(A, N, Item)
   2. Set A[N+1] = Item
4. End loop
5. Exit
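Putting the two routines together, a sketch of the two-phase heapsort; it reuses ins_heap and del_heap from the sketches above and assumes a 1-based array a[1..n] with n >= 1:

/* Defined in the earlier sketches. */
void ins_heap(int tree[], int *n, int item);
int  del_heap(int tree[], int *n);

/* Sort a[1..n] in ascending order. */
void heap_sort(int a[], int n) {
    int size = 1;                        /* a[1..1] is trivially a heap   */
    for (int j = 1; j <= n - 1; j++)     /* Phase 1: build the heap       */
        ins_heap(a, &size, a[j + 1]);

    while (size > 1) {                   /* Phase 2: repeated root delete */
        int item = del_heap(a, &size);   /* item is the current maximum   */
        a[size + 1] = item;              /* place it just past the heap   */
    }
}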
2. Heap as a Priority Queue
A heap allows a very efficient implementation of a priority queue: in an ascending (min) heap the root of the tree contains the smallest element, so deleting the root always processes the job with the highest priority (smallest priority number).
Unit IV part II
• Balanced Trees: AVL trees, AVL Balance factor,
Balancing trees, AVL node structure ,AVL
Insert, AVL Delete walkthrough with examples,
Other balanced tree variations (Splay trees
and Red-Black trees).
AVL Trees
(Height balanced trees)
• Invented by G.M. Adelson-Velskii and E.M. Landis.
• It is a height-balanced binary search tree.
• Comparison between an unbalanced BST and an AVL tree (time complexity): a skewed BST degenerates to O(n) search, while an AVL tree guarantees O(log n).
• AVL balance factor: the balance factor of a node is the height of its left subtree minus the height of its right subtree; for every node in an AVL tree it must be +1, 0 or -1.
• The descriptors are LH for Left High (+1), EH for Even High (0) and RH for Right High (-1).
AVL node structure
Node
Key <keytype>
data <datatype>
LeftSubtree <Pointer to Node>
RightSubtree <Pointer to Node>
bal <LH,EH,RH>
End Node
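In C the same node might be declared like this (a sketch; the data field is omitted and the balance descriptors are stored as an enum):

/* Balance descriptors: Left High (+1), Even High (0), Right High (-1). */
enum balance { LH = +1, EH = 0, RH = -1 };

struct avl_node {
    int              key;
    /* data field omitted in this sketch */
    struct avl_node *left;
    struct avl_node *right;
    enum balance     bal;
};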
Balancing trees
• Insertion and deletion may result in an unbalanced tree.
• Rebalancing is then required.
• Rebalancing is done by rotating nodes either to the left or to the right.
• Four cases of rebalancing exist: Left of Left, Right of Right, Right of Left and Left of Right.
Case 1: Left of Left
• When the imbalance is created by a left-high subtree of a left-high tree, we balance the tree by rotating the out-of-balance node to the right.
• Example (Simple right rotation)
• Example (complex right rotation)
Case 2: right of right
• This is a mirror of case 1.
• Example (Simple Left rotation)
• Example (complex Left rotation)
Case 3: Right of Left
• In this case (and its mirror, Left of Right) the imbalance requires rotating two nodes, one to the left and one to the right, to balance the tree.
• Example (Simple double rotation)
• Example (complex double rotation)
Case 4: Left of Right
Sample exercise (Simple case with double
rotations)
Sample Exercise (Complex case with double
rotations)
Operations on AVL
1. Traversal
2. Searching
3. Insertion
4. Deletion

Note: the searching and traversal algorithms are the same as those of a binary search tree.
AVL Insert
Insertion: the new node is inserted as a leaf, comparing with the node keys on the way down exactly as in a BST. But here, as we back out of the recursion, we check the balance of each node; when a node is found to be out of balance, we balance it and then continue up the tree.
Note: not every insert creates an out-of-balance condition. To track the tree's balance, a flag (taller) is set when a subtree has grown; this tells us whether a rotation may be needed.
Algorithm Insert
• Algorithm Insert
• This algorithm uses recursion to insert the node into
an AVL tree.
• Pre: Root is a pointer to the first node in AVL.
Newptr is a pointer to the new node to be inserted.
• Post: Taller is a boolean, true indicating the subtree
height has increased, false indicating the same
height.
• Return: Root returned recursively up the tree.
1.If (root=null) //insert as the first node (at root)
1. taller=true
2.root=newptr
3. return root
2 Endif
3. if (newptr->data < root->data)
1. root->left = AVLinsert(root->left, newptr, taller)
2. if (taller) // leftsubtree is taller
1. if (root is LH)
1 root = Leftbalance(root,taller)
2. else if (root is EH)
1. root->bal = left high
3. else //was right high
1. root->bal = even-high
2. taller = false
4. endif
3. Endif
4. Else
   1. root->right = AVLinsert(root->right, newptr, taller)
2. if (taller) //rightsubtree is taller
1. if (root is LH)
1. taller=false
2. root->bal = even-high
2. else if (root is EH)
1. root->bal= right-high
3. else
1. root= rightbalance(root,taller)
4. endif
3. endif
5. endif
6.Return root
End AVL insert
Rotating Algorithm
Rotate Right
Algorithm RotateRight (Root)
This algorithm exchanges pointers to rotate the tree
right.
1. TempPtr=Root->Left
2. Root->Left=TempPtr->Right
3. TempPtr->Right=Root
4. Root=TempPtr
5. return
Example tree
Rotating Algorithm
Rotate Left
Algorithm RotateLeft (Root)
This algorithm exchanges pointers to rotate the tree
left.
1. TempPtr=Root->Right
2. Root->Right=TempPtr->Left
3. TempPtr->Left=Root
4. Root=TempPtr
5. return
Example tree
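Both rotations in C, written here to return the new subtree root rather than updating Root by reference (a common alternative):

struct avl_node { int key; struct avl_node *left, *right; int bal; };

/* Right rotation: the left child becomes the new root of the subtree. */
struct avl_node *rotate_right(struct avl_node *root) {
    struct avl_node *temp = root->left;
    root->left  = temp->right;
    temp->right = root;
    return temp;               /* new subtree root */
}

/* Left rotation: mirror image, the right child comes up. */
struct avl_node *rotate_left(struct avl_node *root) {
    struct avl_node *temp = root->right;
    root->right = temp->left;
    temp->left  = root;
    return temp;               /* new subtree root */
}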
Left Balance Algorithm
Algorithm Left Balance
This algorithm is entered when the root is left heavy (that is the
left subtree is higher than the right subtree)
Pre: root is a pointer to the root of the tree
taller is true
Post: root has been updated (if necessary)
taller has been updated
1. Lefttree=root->Left
2. If (Lefttree->bal = Left High) // Case 1: Left of Left, single right rotation
   1. rotateRight(root)
   2. root->bal = Even High
   3. Lefttree->bal = Even High
   4. taller = false
3. Else // Case 3: Right of Left, double rotation
   1. Righttree = Lefttree->Right
   2. Adjust the balance factors
   3. rotateLeft(Lefttree)
   4. rotateRight(root)
   5. taller = false
4. Endif
5. Return
End LeftBalance
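For comparison, a complete C sketch of AVL insertion that stores node heights and recomputes a numeric balance factor instead of the LH/EH/RH flags used in the pseudocode above. This is a swapped-in, height-based variant, not a line-by-line translation of the algorithm on the slides:

#include <stdlib.h>

struct avl { int key; int height; struct avl *left, *right; };

static int height_of(struct avl *n) { return n ? n->height : 0; }
static int max_int(int a, int b)    { return a > b ? a : b; }
static void fix_height(struct avl *n) {
    n->height = 1 + max_int(height_of(n->left), height_of(n->right));
}
static int balance_factor(struct avl *n) {   /* >0 left high, <0 right high */
    return height_of(n->left) - height_of(n->right);
}

static struct avl *rot_right(struct avl *y) {   /* left child becomes root  */
    struct avl *x = y->left;
    y->left = x->right;
    x->right = y;
    fix_height(y);
    fix_height(x);
    return x;
}
static struct avl *rot_left(struct avl *x) {    /* right child becomes root */
    struct avl *y = x->right;
    x->right = y->left;
    y->left = x;
    fix_height(x);
    fix_height(y);
    return y;
}

/* Insert key as in a BST, then rebalance on the way back up. */
struct avl *avl_insert(struct avl *root, int key) {
    if (root == NULL) {
        struct avl *n = malloc(sizeof *n);
        n->key = key; n->height = 1; n->left = n->right = NULL;
        return n;
    }
    if (key < root->key)
        root->left = avl_insert(root->left, key);
    else
        root->right = avl_insert(root->right, key);

    fix_height(root);
    int bf = balance_factor(root);

    if (bf > 1 && key < root->left->key)          /* Left of Left   */
        return rot_right(root);
    if (bf < -1 && key >= root->right->key)       /* Right of Right */
        return rot_left(root);
    if (bf > 1) {                                 /* Right of Left  */
        root->left = rot_left(root->left);
        return rot_right(root);
    }
    if (bf < -1) {                                /* Left of Right  */
        root->right = rot_right(root->right);
        return rot_left(root);
    }
    return root;
}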
AVL Delete
Algorithm AVLDelete
This algorithm deletes a node from an AVL tree
Pre: Root is a pointer to a tree/subtree and delkey is the key of the node to be deleted.
Post: Node deleted if found; the tree is unchanged if not found. Shorter is true if the subtree has become shorter.
Return: pointer to the root of the new subtree
1. If (root = null)
   1. shorter = false
   2. Return null
2. Endif
3. If (delkey < root->key)
   1. root->left = AVLDelete(root->left, delkey, shorter)
   2. If (shorter)
      1. root = DeleteRightBalance(root, shorter)
   3. Endif
4. Elseif (delkey > root->key)
   1. root->right = AVLDelete(root->right, delkey, shorter)
   2. If (shorter)
      1. root = DeleteLeftBalance(root, shorter)
   3. Endif
5. Else // delete node found; test for a missing subtree
   1. deleteNode = root
   2. If (no left subtree)
      1. root = root->right
      2. shorter = true
      3. recycle(deleteNode)
      4. Return root
   3. Elseif (no right subtree)
      1. root = root->left
      2. shorter = true
      3. recycle(deleteNode)
      4. Return root
   4. Else // deleted node has two subtrees
      1. exchPtr = root->left
      2. Loop (exchPtr->right not null)
         1. exchPtr = exchPtr->right
      3. root->data = exchPtr->data
      4. root->left = AVLDelete(root->left, exchPtr->data, shorter)
      5. If (shorter)
         1. root = DeleteRightBalance(root, shorter)
      6. Endif
   5. Endif
6. Endif
7. Return root
End AVLDelete
Variations of Balanced trees
Red Black tree
Motivation:
• Binary Search Trees should be balanced.
• AVL Trees need 2 passes: top-down insertion/deletion and bottom-up rebalancing
  – this usually needs a recursive implementation.
• Red-Black Trees need 1 pass: top-down rebalancing and insertion/deletion
  – this can be implemented iteratively, so it is faster.
• Red-Black Trees have slightly weaker balance restrictions
  – less effort to maintain
  – in practice, the worst case is similar to AVL Trees.
Red black tree
• A red-black tree is a type of self-balancing binary search tree.

• The self-balancing is provided by painting each node with one of two colors, 'red' and 'black', in such a way that the resulting painted tree satisfies certain properties that don't allow it to become significantly unbalanced.

• When the tree is modified, the new tree is subsequently rearranged and repainted to restore the coloring properties.
Performance

• It guarantees the search operation in O(log n) time, where n is the total number of elements in the tree.
• Insertion and deletion operations, along with the tree rearrangement and recoloring, are also performed in O(log n) time.
Red-Black Trees: Definition
Rules of Red-Black Trees:
1. Every node is colored either red or black
2. The root is black
3. If a node is red, its children must be black
– consecutive red nodes are disallowed
4. Every path from a node to a null reference must
contain the same number of black nodes

Convention: Null nodes are black
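A minimal C layout matching these rules (only the data shape, not the insertion/fix-up logic; the parent pointer is a common convenience, not required by the rules):

enum color { RED, BLACK };

struct rb_node {
    int             key;
    enum color      color;     /* rule 1: every node is red or black  */
    struct rb_node *left;      /* NULL children are treated as black  */
    struct rb_node *right;
    struct rb_node *parent;    /* parent link simplifies the fix-up   */
};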



Red-Black Trees
The insertion sequence is 10, 85, 15, 70, 20, 60,
30, 50, 65, 80, 90, 40, 5, 55
[Resulting red-black tree: 30 at the root; 15 and 70 on level 1; 10, 20, 60, 85 on level 2; 5, 50, 65, 80, 90 on level 3; 40 and 55 as the children of 50.]
Splay Trees
• In balanced tree schemes, explicit rules are
followed to ensure balance.
• In splay trees, there are no such rules.
• Search, insert, and delete operations are like those in binary search trees, except that at the end of each operation a special step called splaying is done.
• Splaying ensures that all operations take O(log n) amortized time.

Example: Splay(78). Starting from a BST rooted at 44 with 78 deep in the right subtree, node 78 is brought up to the root by a sequence of zig-zag (double) rotations followed by a final zig (single) rotation; at each step the node x, its parent y and grandparent z are rotated so that x rises toward the root. [Step-by-step tree diagrams omitted.]


Result of splaying

• The result is a binary tree, with the left subtree having all keys less than the root, and the right subtree having keys greater than the root.
• Also, the final tree is "more balanced" than the original.
• However, if an operation near the root is done, the tree can become less balanced.
• Search: When to Splay
– Successful: Splay node where key was found.
– Unsuccessful: Splay last-visited internal node (i.e., last
node with a key).
• Insert:
– Splay newly added node.
• Delete:
– Splay parent of removed node (which is either the node
with the deleted key or its successor).
• Note: All operations run in O(h) time, for a tree of
height h.



Questions
1. For the given
   Inorder: E A C K F H D B G
   Preorder: F A E K C D H G B
   of a binary tree, derive its postorder and draw the tree.
2. Write a recursive algorithm to find the largest node in a given binary search tree.
3. Given the infix expression: a + b - c * d - e / f + g - h
   Draw the expression tree for the above expression, derive its preorder and postorder, and also give the BFS (level-order) result.
