Unit 4
TREE STRUCTURES
Trees lie at the other end of the spectrum when compared with linear data structures.
In a stack, a queue or a linked list the structure is linear and the elements are added in
sequence.
Operations on a linear data structure take increasingly more time as the size of the data
grows (for example, searching an unsorted linked list of n elements takes O(n) time), which
makes linear structures less suitable for many real-world problems.
Tree Terminology
Vertex/Node — It contains the key or value and pointers to its child vertices. The bottom-most
node is called a leaf node or external node; it has no link going down from it. A node having at
least one child is called an internal node.
Height of a Tree — It is the height of the root node, i.e. the depth of the deepest node.
Height of a Node — It is the number of edges on the longest path from the node down to a leaf
in its subtree.
Root — It is the topmost node of the tree.
Depth of a Node — It is the number of edges from the root to the node.
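As a small illustration of these terms, the following sketch (assuming a simple binary Node class with left and right attributes, like the Node class defined later in this unit) computes the height of a node and the depth of a target node:

def height(node):
    # Height: number of edges on the longest path from the node down to a leaf
    if node is None:
        return -1          # convention: an empty subtree has height -1, so a leaf has height 0
    return 1 + max(height(node.left), height(node.right))

def depth(root, target, d=0):
    # Depth: number of edges from the root to the target node (None if absent)
    if root is None:
        return None
    if root is target:
        return d
    left = depth(root.left, target, d + 1)
    if left is not None:
        return left
    return depth(root.right, target, d + 1)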
A tree represents nodes connected by edges. It is a non-linear data structure with the
following property −
We designate one node as the root node and then add more nodes as child nodes.
Create Root
We simply create a Node class and assign a value to the node. This becomes a tree with only a
root node.
Example
class Node:
    def __init__(self, data):
        self.left = None
        self.right = None
        self.data = data

    def PrintTree(self):
        print(self.data)

root = Node(10)
root.PrintTree()
Output
When the above code is executed, it produces the following result −
10
Inserting into a Tree
To insert into a tree we use the same Node class created above and add an insert method to it.
The insert method compares the value of the new node with the parent node and decides whether
to add it as a left child or a right child. Finally, the PrintTree method is used to print the tree.
Example
class Node:
    def __init__(self, data):
        self.left = None
        self.right = None
        self.data = data

    def insert(self, data):
        # Compare the new value with the parent node
        if self.data:
            if data < self.data:
                if self.left is None:
                    self.left = Node(data)
                else:
                    self.left.insert(data)
            elif data > self.data:
                if self.right is None:
                    self.right = Node(data)
                else:
                    self.right.insert(data)
        else:
            self.data = data

    # Print the tree
    def PrintTree(self):
        if self.left:
            self.left.PrintTree()
        print(self.data)
        if self.right:
            self.right.PrintTree()

# Use the insert method to add nodes
root = Node(12)
root.insert(6)
root.insert(14)
root.insert(3)
root.PrintTree()
Output
When the above code is executed, it produces the following result −
3 6 12 14
Traversal is the process of visiting all the nodes of a tree, possibly printing their values as well.
Because all nodes are connected via edges (links), we always start from the root (head) node;
that is, we cannot randomly access a node in a tree. There are three ways in which we
traverse a tree −
In-order Traversal
Pre-order Traversal
Post-order Traversal
We traverse a tree to search for a given item or key in the tree, or to print all the values it
contains.
In-order Traversal
In this traversal method, the left subtree is visited first, then the root, and later the right
subtree. We should always remember that every node may represent a subtree itself.
If a binary tree is traversed in-order, the output will produce the key values sorted in
ascending order.
We start from A and, following in-order traversal, we move to its left subtree B. B is also
traversed in-order.
The process goes on until all the nodes are visited. The output of in-order traversal of this
tree will be
D → B → E → A → F → C → G
Algorithm
Until all nodes are traversed
Step 1 − Recursively traverse left subtree.
Step 2 − Visit root node.
Step 3 − Recursively traverse right subtree.
Example
class Node:
    def __init__(self, key):
        self.leftChild = None
        self.rightChild = None
        self.data = key
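The listing above stops after the Node class. A minimal sketch of the missing traversal function and driver, assuming the same sample tree that the pre-order and post-order examples below use, could look like this:

# Create a function to perform inorder tree traversal
def InorderTraversal(root):
    if root:
        InorderTraversal(root.leftChild)
        print(root.data)
        InorderTraversal(root.rightChild)

# Main program: build the same sample tree used in the later examples
if __name__ == "__main__":
    root = Node(3)
    root.leftChild = Node(26)
    root.rightChild = Node(42)
    root.leftChild.leftChild = Node(54)
    root.leftChild.rightChild = Node(65)
    root.rightChild.leftChild = Node(12)
    print("Inorder traversal of binary tree is")
    InorderTraversal(root)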
Output
Inorder traversal of binary tree is
54
26
65
3
12
42
Pre-order Traversal
In this traversal method, the root node is visited first, then the left sub tree and finally the
right sub tree.
We start from A, and following pre-order traversal, we first visit A itself and then move
to its left subtree B. B is also traversed pre-order.
The process goes on until all the nodes are visited. The output of pre-order traversal of
this tree will be
A→B→D→E→C→F→G
Algorithm
Until all nodes are traversed
Step 1 − Visit root node.
Step 2 − Recursively traverse left subtree.
Step 3 − Recursively traverse right subtree.
Example
class Node:
    def __init__(self, key):
        self.leftChild = None
        self.rightChild = None
        self.data = key

# Create a function to perform preorder tree traversal
def PreorderTraversal(root):
    if root:
        print(root.data)
        PreorderTraversal(root.leftChild)
        PreorderTraversal(root.rightChild)

# Main program
if __name__ == "__main__":
    root = Node(3)
    root.leftChild = Node(26)
    root.rightChild = Node(42)
    root.leftChild.leftChild = Node(54)
    root.leftChild.rightChild = Node(65)
    root.rightChild.leftChild = Node(12)
    print("Preorder traversal of binary tree is")
    PreorderTraversal(root)
Output
Preorder traversal of binary tree is
3
26
54
65
42
12
Post-order Traversal
In this traversal method, the root node is visited last, hence the name. First we traverse
the left subtree, then the right subtree and finally the root node.
We start from A and, following post-order traversal, we first visit the left subtree B. B is
also traversed post-order.
The process goes on until all the nodes are visited. The output of post-order traversal of
this tree will be
D → E → B → F → G → C → A
Algorithm
Until all nodes are traversed
Step 1 − Recursively traverse left subtree.
Step 2 − Recursively traverse right subtree.
Step 3 − Visit root node.
Example
Following is the implementation of this operation in Python −
class Node:
    def __init__(self, key):
        self.leftChild = None
        self.rightChild = None
        self.data = key

# Create a function to perform postorder tree traversal
def PostorderTraversal(root):
    if root:
        PostorderTraversal(root.leftChild)
        PostorderTraversal(root.rightChild)
        print(root.data)

# Main program
if __name__ == "__main__":
    root = Node(3)
    root.leftChild = Node(26)
    root.rightChild = Node(42)
    root.leftChild.leftChild = Node(54)
    root.leftChild.rightChild = Node(65)
    root.rightChild.leftChild = Node(12)
    print("Postorder traversal of binary tree is")
    PostorderTraversal(root)
Output
Postorder traversal of binary tree is
54
65
26
12
42
3
4.4 Binary Search Tree
A Binary Search Tree (BST) is a tree in which all the nodes follow the below-mentioned
properties.
The left sub-tree of a node has a key less than or equal to its parent node's key. The right
sub-tree of a node has a key greater than its parent node's key.
Thus, a BST divides all its sub-trees into two segments: the left sub-tree and the right sub-tree.
Searching for a value in a tree involves comparing the incoming value with the values of the
existing nodes.
Starting from the root, we move to the left or right child depending on the comparison, until
the value is found or there are no more nodes to visit.
If the searched-for value does not match any of the existing values, a not-found message is
returned; otherwise, a found message is returned.
Example
class Node:
    def __init__(self, data):
        self.left = None
        self.right = None
        self.data = data

    # Insert method to create nodes
    def insert(self, data):
        if self.data:
            if data < self.data:
                if self.left is None:
                    self.left = Node(data)
                else:
                    self.left.insert(data)
            elif data > self.data:
                if self.right is None:
                    self.right = Node(data)
                else:
                    self.right.insert(data)
        else:
            self.data = data

    # findval method to compare the value with nodes
    def findval(self, lkpval):
        if lkpval < self.data:
            if self.left is None:
                return str(lkpval) + " Not Found"
            return self.left.findval(lkpval)
        elif lkpval > self.data:
            if self.right is None:
                return str(lkpval) + " Not Found"
            return self.right.findval(lkpval)
        else:
            return str(self.data) + " is found"

    # Print the tree
    def PrintTree(self):
        if self.left:
            self.left.PrintTree()
        print(self.data)
        if self.right:
            self.right.PrintTree()

root = Node(12)
root.insert(6)
root.insert(14)
root.insert(3)
print(root.findval(7))
print(root.findval(14))
Output
When the above code is executed, it produces the following result
7 Not Found
14 is found
4.5 AVL Trees
The first type of self-balancing binary search tree to be invented is the AVL tree.
The name AVL tree is coined after its inventor's names − Adelson-Velsky and Landis.
In AVL trees, the difference between the heights of left and right subtrees, known as
the Balance Factor, must be at most one.
Once the difference exceeds one, the tree automatically executes the balancing algorithm
until the difference becomes one again.
BALANCE FACTOR = HEIGHT(LEFT SUBTREE) − HEIGHT(RIGHT SUBTREE)
There are usually four cases of rotation in the balancing algorithm of AVL trees: LL, RR,
LR, RL.
LL Rotations
LL rotation is performed when a node is inserted into the right subtree, leading to
an unbalanced tree. This is a single left rotation to make the tree balanced again.
Fig : LL Rotation
The node where the unbalance occurs becomes the left child and the newly added node
becomes the right child with the middle node as the parent node.
RR Rotations
RR rotation is performed when a node is inserted into the left subtree, leading to an
unbalanced tree.
This is a single right rotation to make the tree balanced again.
Fig : RR Rotation
The node where the unbalance occurs becomes the right child and the newly added node
becomes the left child with the middle node as the parent node.
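The two single rotations can be sketched as plain pointer manipulations. This is only an illustrative sketch assuming nodes with left and right attributes; the AVL listings later in this section additionally update the stored heights:

def left_rotate(z):
    # Single left rotation (the LL case above): z's right child y is pulled up
    y = z.right
    z.right = y.left      # y's left subtree becomes z's right subtree
    y.left = z            # z becomes the left child of y
    return y              # y is the new root of this subtree

def right_rotate(z):
    # Single right rotation (the RR case above): z's left child y is pulled up
    y = z.left
    z.left = y.right      # y's right subtree becomes z's left subtree
    y.right = z           # z becomes the right child of y
    return y              # y is the new root of this subtree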
LR Rotations
LR rotation is the extended version of the previous single rotations, also called a double
rotation.
It is performed when a node is inserted into the right subtree of the left subtree.
The LR rotation is a combination of the left rotation followed by the right rotation.
There are multiple steps to be followed to carry this out.
Consider an example with “A” as the root node, “B” as the left child of “A” and “C” as
the right child of “B”.
Since the unbalance occurs at A, a left rotation is applied on the child nodes of A, i.e. B
and C.
After the rotation, the C node becomes the left child of A and B becomes the left child of
C.
The unbalance still persists, therefore a right rotation is applied at the root node A and the
left child C.
After the final right rotation, C becomes the root node, A becomes the right child and B is
the left child.
Fig : LR Rotation
RL Rotations
RL rotation is also the extended version of the previous single rotations, hence it is called
a double rotation and it is performed if a node is inserted into the left subtree of the right
subtree.
The RL rotation is a combination of the right rotation followed by the left rotation.
There are multiple steps to be followed to carry this out.
Consider an example with “A” as the root node, “B” as the right child of “A” and “C” as
the left child of “B”.
Since the unbalance occurs at A, a right rotation is applied on the child nodes of A, i.e. B
and C.
After the rotation, the C node becomes the right child of A and B becomes the right child
of C.
The unbalance still persists; therefore a left rotation is applied at the root node A and the
right child C.
After the final left rotation, C becomes the root node, A becomes the left child and B is
the right child.
Fig : RL Rotation
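Using the single-rotation sketches above, the two double rotations can be written as compositions. Again this is only a sketch; the left_rotate and right_rotate helpers are the hypothetical ones sketched earlier, not part of the original text:

def left_right_rotate(a):
    # LR case: left-rotate a's left child, then right-rotate a
    a.left = left_rotate(a.left)
    return right_rotate(a)

def right_left_rotate(a):
    # RL case: right-rotate a's right child, then left-rotate a
    a.right = right_rotate(a.right)
    return left_rotate(a)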
Basic Operations of AVL Trees
The basic operations performed on an AVL tree include all the operations performed on a
binary search tree, since an AVL tree at its core is just a binary search tree holding all of
its properties.
Therefore, the basic operations performed on an AVL tree are
1. Insertion
2. Deletion.
Insertion Operation
The data is inserted into the AVL tree by following the binary search tree property of
insertion: the left subtree must contain elements less than the root value and the right
subtree must contain all the greater elements.
However, in AVL trees, after the insertion of each element, the balance factor of the tree
is checked; if its absolute value does not exceed 1, the tree is left as it is.
But if the balance factor exceeds 1 in absolute value, a balancing algorithm is applied to
readjust the tree so that the balance factor is at most 1 in absolute value again.
Algorithm
The following steps are involved in performing the insertion operation on an AVL tree −
Step 1 − Insert the element into the tree using the normal binary search tree insertion logic.
Step 2 − After the insertion, update the heights and check the balance factor of each node on
the path from the new node back to the root.
Step 3 − If the balance factor of any node exceeds 1 in absolute value, apply the appropriate
rotation (LL, RR, LR or RL) to rebalance the tree.
Let us understand the insertion operation by constructing an example AVL tree with the
integers 1 to 7.
Starting with the first element 1, we create a node and measure the balance, i.e., 0.
Since both the binary search property and the balance factor are satisfied, we insert
another element into the tree.
The balance factor of the two nodes is calculated and found to be -1 (the height of the left
subtree is 0 and the height of the right subtree is 1). Since its absolute value does not
exceed 1, we add another element to the tree.
Now, after adding the third element, the balance factor exceeds 1 in absolute value (it
becomes -2, since the right subtree is now deeper). Therefore, a rotation is applied; in this
case a single left rotation (the LL rotation described above), since the imbalance occurs
along two right children.
Similarly, the next elements are inserted and rearranged using these rotations.
After the rearrangement, we obtain the final balanced tree.
Example
class Node(object):
    def __init__(self, data):
        self.data = data
        self.left = None
        self.right = None
        self.height = 1

class AVLTree(object):
    def insert(self, root, key):
        # Perform the normal BST insertion
        if not root:
            return Node(key)
        elif key < root.data:
            root.left = self.insert(root.left, key)
        else:
            root.right = self.insert(root.right, key)
        # Update the height of this ancestor node
        root.height = 1 + max(self.getHeight(root.left),
                              self.getHeight(root.right))
        # Rebalance the node if its balance factor is outside [-1, 1]
        b = self.getBalance(root)
        if b > 1 and key < root.left.data:
            return self.rightRotate(root)
        if b < -1 and key > root.right.data:
            return self.leftRotate(root)
        if b > 1 and key > root.left.data:
            root.left = self.leftRotate(root.left)
            return self.rightRotate(root)
        if b < -1 and key < root.right.data:
            root.right = self.rightRotate(root.right)
            return self.leftRotate(root)
        return root

    def leftRotate(self, z):
        y = z.right
        T2 = y.left
        y.left = z
        z.right = T2
        z.height = 1 + max(self.getHeight(z.left),
                           self.getHeight(z.right))
        y.height = 1 + max(self.getHeight(y.left),
                           self.getHeight(y.right))
        return y

    def rightRotate(self, z):
        y = z.left
        T3 = y.right
        y.right = z
        z.left = T3
        z.height = 1 + max(self.getHeight(z.left),
                           self.getHeight(z.right))
        y.height = 1 + max(self.getHeight(y.left),
                           self.getHeight(y.right))
        return y

    def getHeight(self, root):
        if not root:
            return 0
        return root.height

    def getBalance(self, root):
        if not root:
            return 0
        return self.getHeight(root.left) - self.getHeight(root.right)

    def Inorder(self, root):
        if root.left:
            self.Inorder(root.left)
        print(root.data)
        if root.right:
            self.Inorder(root.right)

Tree = AVLTree()
root = None
root = Tree.insert(root, 10)
root = Tree.insert(root, 13)
root = Tree.insert(root, 11)
root = Tree.insert(root, 14)
root = Tree.insert(root, 12)
root = Tree.insert(root, 15)
# Inorder Traversal
print("Inorder traversal of the AVL tree is")
Tree.Inorder(root)
Output
When the above code is executed, it produces the following result −
Inorder traversal of the AVL tree is
10
11
12
13
14
15
Deletion operation
Scenario 1 (Deletion of a leaf node) − If the node to be deleted is a leaf node, then it is deleted
without any replacement as it does not disturb the binary search tree property. However, the
balance factor may get disturbed, so rotations are applied to restore it.
Scenario 2 (Deletion of a node with one child) − If the node to be deleted has one child, replace
the value in that node with the value in its child node. Then delete the child node. If the balance
factor is disturbed, rotations are applied.
Scenario 3 (Deletion of a node with two child nodes) − If the node to be deleted has two child
nodes, find the inorder successor of that node and replace its value with the inorder successor
value. Then try to delete the inorder successor node. If the balance factor exceeds 1 after
deletion, apply balance algorithms.
Using the same tree given above, let us perform deletion in three scenarios −
Consider deleting element 6. It is not a leaf node and has one child node attached to it.
In this case, we replace node 6 with its child node, node 5.
The balance of the tree becomes 1, and since it does not exceed 1, the tree is left as it is. If
we then delete element 5 as well, we would have to apply a left rotation, either LL or
LR, since the imbalance occurs at both 1-2-4 and 3-2-4.
The balance factor is disturbed after deleting the element 5, therefore we apply LL
rotation (we can also apply the LR rotation here).
Once the LL rotation is applied on the path 1-2-4, node 3 remains detached, as it was
supposed to be the right child of node 2 (a position now occupied by node 4).
Hence, node 3 is added to the right subtree of node 2, as the left child of node 4.
As mentioned in scenario 3, this node has two children. Therefore, we find its inorder
successor, which is a leaf node (say, 3), and replace the node's value with that of its
inorder successor.
The balance of the tree still remains 1, therefore we leave the tree as it is without
performing any rotations.
Example
class Node(object):
    def __init__(self, data):
        self.data = data
        self.left = None
        self.right = None
        self.height = 1

class AVLTree(object):
    def insert(self, root, key):
        if not root:
            return Node(key)
        elif key < root.data:
            root.left = self.insert(root.left, key)
        else:
            root.right = self.insert(root.right, key)
        root.height = 1 + max(self.getHeight(root.left),
                              self.getHeight(root.right))
        b = self.getBalance(root)
        if b > 1 and key < root.left.data:
            return self.rightRotate(root)
        if b < -1 and key > root.right.data:
            return self.leftRotate(root)
        if b > 1 and key > root.left.data:
            root.left = self.leftRotate(root.left)
            return self.rightRotate(root)
        if b < -1 and key < root.right.data:
            root.right = self.rightRotate(root.right)
            return self.leftRotate(root)
        return root

    def delete(self, root, key):
        if not root:
            return root
        elif key < root.data:
            root.left = self.delete(root.left, key)
        elif key > root.data:
            root.right = self.delete(root.right, key)
        else:
            if root.left is None:
                temp = root.right
                root = None
                return temp
            elif root.right is None:
                temp = root.left
                root = None
                return temp
            # Node with two children: replace with the inorder successor
            temp = self.getMinValueNode(root.right)
            root.data = temp.data
            root.right = self.delete(root.right, temp.data)
        if root is None:
            return root
        root.height = 1 + max(self.getHeight(root.left),
                              self.getHeight(root.right))
        balance = self.getBalance(root)
        if balance > 1 and self.getBalance(root.left) >= 0:
            return self.rightRotate(root)
        if balance < -1 and self.getBalance(root.right) <= 0:
            return self.leftRotate(root)
        if balance > 1 and self.getBalance(root.left) < 0:
            root.left = self.leftRotate(root.left)
            return self.rightRotate(root)
        if balance < -1 and self.getBalance(root.right) > 0:
            root.right = self.rightRotate(root.right)
            return self.leftRotate(root)
        return root

    # Find the node with the minimum value (used to locate the inorder successor)
    def getMinValueNode(self, root):
        if root is None or root.left is None:
            return root
        return self.getMinValueNode(root.left)

    def leftRotate(self, z):
        y = z.right
        T2 = y.left
        y.left = z
        z.right = T2
        z.height = 1 + max(self.getHeight(z.left),
                           self.getHeight(z.right))
        y.height = 1 + max(self.getHeight(y.left),
                           self.getHeight(y.right))
        return y

    def rightRotate(self, z):
        y = z.left
        T3 = y.right
        y.right = z
        z.left = T3
        z.height = 1 + max(self.getHeight(z.left),
                           self.getHeight(z.right))
        y.height = 1 + max(self.getHeight(y.left),
                           self.getHeight(y.right))
        return y

    def getHeight(self, root):
        if not root:
            return 0
        return root.height

    def getBalance(self, root):
        if not root:
            return 0
        return self.getHeight(root.left) - self.getHeight(root.right)

    def Inorder(self, root):
        if root.left:
            self.Inorder(root.left)
        print(root.data, end=" ")
        if root.right:
            self.Inorder(root.right)

Tree = AVLTree()
root = None
root = Tree.insert(root, 10)
root = Tree.insert(root, 13)
root = Tree.insert(root, 11)
root = Tree.insert(root, 14)
root = Tree.insert(root, 12)
root = Tree.insert(root, 15)
# Inorder Traversal
print("AVL Tree: ")
Tree.Inorder(root)
root = Tree.delete(root, 14)
print("\nAfter deletion: ")
Tree.Inorder(root)
Output
AVL Tree:
10 11 12 13 14 15
After deletion:
10 11 12 13 15
4.6 Heaps
A heap is a special tree structure in which each parent node is compared with its child nodes.
If each parent node is less than or equal to its child nodes, it is called a min heap; if each
parent node is greater than or equal to its child nodes, it is called a max heap.
Heaps are very useful in implementing priority queues, where the queue item with a higher
weightage is given more priority in processing.
In this section we will see the implementation of the heap data structure using Python.
Create a Heap
A heap is created in Python using the built-in heapq module, which provides the following
functions −
heapify − This function converts a regular list into a heap. In the resulting heap the
smallest element is moved to index position 0, but the rest of the elements are not
necessarily sorted.
heappush − This function adds an element to the heap while keeping the heap property intact.
heappop − This function removes and returns the smallest data element from the heap.
heapreplace − This function replaces the smallest data element with the new value
supplied to the function.
Creating a Heap
A heap is created by simply using a list of elements with the heapify function.
In the below example we supply a list of elements and the heapify function rearranges the
elements bringing the smallest element to the first position.
Example
import heapq
H = [21,1,45,78,3,5]
# Use heapify to rearrange the elements
heapq.heapify(H)
print(H)
Output
When the above code is executed, it produces the following result
[1, 3, 5, 78, 21, 45]
Inserting into a Heap
The heappush function adds the new element at the end of the list and then sifts it up until
the heap property is restored.
In the example below we insert the number 8; since 8 is larger than its parent element, it
simply stays at the last index.
Example
import heapq
H = [21,1,45,78,3,5]
# Convert to a heap
heapq.heapify(H)
print(H)
# Add element
heapq.heappush(H,8)
print(H)
Output
When the above code is executed, it produces the following result −
[1, 3, 5, 78, 21, 45]
[1, 3, 5, 78, 21, 45, 8]
Removing from a Heap
You can remove the smallest element, the one at index position 0, by using the heappop
function.
In the example below, heappop removes the smallest element and the remaining elements are
rearranged to keep the heap property.
Example
import heapq
H = [21,1,45,78,3,5]
# Create the heap
heapq.heapify(H)
print(H)
# Remove element from the heap
heapq.heappop(H)
print(H)
Output
When the above code is executed, it produces the following result −
[1, 3, 5, 78, 21, 45]
[3, 21, 5, 78, 45]
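The heapreplace function described earlier is not demonstrated above. The short sketch below shows how it pops the current smallest element and pushes a new value in a single step; the value 6 is chosen purely for illustration:

import heapq

H = [21, 1, 45, 78, 3, 5]
heapq.heapify(H)
print(H)                      # [1, 3, 5, 78, 21, 45]

# Replace the smallest element (1) with the new value 6
heapq.heapreplace(H, 6)
print(H)

After the call, the smallest element 1 has been removed, 6 has been inserted, and the list is again a valid min heap with 3 at index position 0.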
4.7 Multi-way Search Trees
Introduction
Multi-way search trees are a generalized version of binary trees that allow for
efficient data searching and sorting.
In contrast to binary search trees, which can only have two children per node, multi-way
search trees can have multiple children per node.
Types of Multi-way Search Trees
There are several types of multi-way search trees, each with unique features and use
cases.
The two common types are
1. m-way search tree
2. B-tree.
m-WAY Search Trees
The m-way search trees are multi-way trees which are generalised versions of binary
search trees, where each node contains multiple elements.
In an m-way tree of order m, each node contains a maximum of m − 1 elements and m
children.
The goal of an m-way search tree of height h is to require O(h) accesses for an
insert, delete or retrieval operation; hence, it should keep the height h close to log_m(n + 1).
The number of elements in an m-way search tree of height h ranges from a minimum
of h to a maximum of m^h − 1.
Correspondingly, the height of an m-way search tree with n elements ranges from a minimum
of log_m(n + 1) to a maximum of n. For example, a 5-way search tree of height 2 can hold
anywhere between 2 and 5^2 − 1 = 24 elements.
An example of a 5-Way search tree is shown in the figure below. Observe how each
node has at most 5 child nodes & therefore has at most 4 keys contained in it.
# MAX is the maximum number of values a node can hold (m - 1 = 4 for a 5-way tree)
MAX = 4

class node:
    def __init__(self):
        self.count = -1
        self.value = [-1] * (MAX + 1)
        self.child = [None] * (MAX + 1)
Here, count represents the number of children that a particular node has.
The values of a node are stored in the value array.
The addresses of the child nodes are stored in the child array.
The MAX constant signifies the maximum number of values that a particular node can
contain.
Searching for a key in an m-way search tree is similar to searching in a binary search tree.
To search for 77 in the 5-way search tree shown in the figure, we begin at the root and, as
77 > 76 > 44 > 18, move to the fourth sub-tree.
In the root node of the fourth sub-tree, 77 < 80 and therefore we move to the first sub-tree of
that node. Since 77 is present in the only node of this sub-tree, we claim that 77 was
successfully found.
The search() routine itself is not included in the original listing; a sketch is given below.
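This is only an illustrative sketch. It assumes, for the purpose of the example, that count holds the number of keys currently stored in the node, that the keys sit in value[1..count] in ascending order, and that child[i] points to the subtree holding keys between value[i] and value[i + 1]; these conventions are assumptions made here, not part of the original listing.

def search(root, key):
    while root is not None:
        n = root.count                    # assumed: number of keys in this node
        i = 1
        # find the first stored key that is >= the search key
        while i <= n and key > root.value[i]:
            i += 1
        if i <= n and root.value[i] == key:
            return True                   # key found in this node
        root = root.child[i - 1]          # descend between value[i-1] and value[i]
    return False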
B Trees
A B tree of order m has the following properties −
Every node in a B tree holds a maximum of m children and (m − 1) keys, since the
order of the tree is m.
Every node in a B tree, except the root and the leaves, holds at least ⌈m/2⌉ children.
The root node must have no less than two children.
All the paths in a B tree must end at the same level, i.e. the leaf nodes must be at the same
level.
A B tree always maintains sorted data.
B trees are also widely used for disk access, minimizing the disk access time since the
height of a B tree is low.
Note − A disk access is the memory access to the computer disk where the information is
stored and disk access time is the time taken by the system to access the disk memory.
The operations supported in B trees are insertion, deletion and searching, each with a time
complexity of O(log n).
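Searching is not demonstrated in the listing below, so here is a hypothetical helper (not part of the original code) that searches a B tree whose nodes follow the BTreeNode layout used in the deletion example, where keys are stored as (key, value) tuples and child[i] sits between keys[i - 1] and keys[i]:

def btree_search(x, key):
    i = 0
    # skip over the keys that are smaller than the search key
    while i < len(x.keys) and key > x.keys[i][0]:
        i += 1
    if i < len(x.keys) and x.keys[i][0] == key:
        return x.keys[i]          # found: return the stored (key, value) pair
    if x.leaf:
        return None               # reached a leaf without finding the key
    return btree_search(x.child[i], key)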
Insertion operation
The insertion operation for a B Tree is done similar to the Binary Search Tree but the
elements are inserted into the same node until the maximum keys are reached.
The insertion is done using the following procedure −
Step 1 − Calculate the maximum, (m − 1), and minimum, (⌈m/2⌉ − 1), number
of keys a node can hold, where m denotes the order of the B tree.
Step 2 − The data is inserted into the tree using the binary search insertion and once the keys
reach the maximum number, the node is split into half and the median key becomes the internal
node while the left and right keys become its children.
The keys 5, 3, 21, 9, 13 are all added into the node according to the binary search
property, but if we add the key 22, it will violate the maximum key property.
Hence, the node is split in half, the median key is shifted to the parent node and the
insertion is then continued.
Another overflow occurs during the insertion of 11, so the node is split and the median is
shifted to the parent.
While inserting 16, even though the node is split into two parts, the parent node also overflows
as it has reached the maximum number of keys.
Hence, the parent node is split first and its median key becomes the root.
Then, the leaf node is split in half and the median of the leaf node is shifted to its parent.
The final B tree after inserting all the elements is achieved.
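The example program for this walkthrough is not included in the original text. As a hedged sketch, the BTree class from the deletion listing below can be reused to insert the same keys; t = 3 is chosen so that a node holds at most five keys, matching the walkthrough, and each key is wrapped in a one-element tuple because the listing's insert() compares k[0]:

B = BTree(3)
for key in [5, 3, 21, 9, 13, 22, 11, 16]:
    B.insert((key,))
print("Insertion Done")
print("B tree after inserting all the keys:")
B.display(B.root)

The exact node splits depend on the order chosen, so the resulting shape may differ slightly from the figures described in the walkthrough.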
Deletion operation
The deletion operation in a B tree is slightly different from the deletion operation of a
Binary Search Tree.
The procedure to delete a node from a B tree is as follows –
Case 1 − If the key to be deleted is in a leaf node and the deletion does not violate the
minimum key property, just delete the node.
Case 2 − If the key to be deleted is in a leaf node but the deletion violates the minimum
key property, borrow a key from either its left sibling or its right sibling. If both
siblings have exactly the minimum number of keys, merge the node with either of them.
Case 3 − If the key to be deleted is in an internal node, it is replaced by a key from either
the left child or the right child, based on which child has more keys. But if both child nodes
have the minimum number of keys, they are merged together.
Case 4 − If the key to be deleted is in an internal node violating the minimum keys
property, and both its children and sibling have minimum number of keys, merge the
children. Then merge its sibling with its parent.
# Deletion operation in a B tree
class BTreeNode:
    def __init__(self, leaf=False):
        self.leaf = leaf
        self.keys = []      # keys are stored as (key, value) tuples
        self.child = []

class BTree:
    def __init__(self, t):
        self.root = BTreeNode(True)
        self.t = t          # minimum degree: each node holds at most 2*t - 1 keys

    # Insert a key into the B tree
    def insert(self, k):
        root = self.root
        if len(root.keys) == (2 * self.t) - 1:
            temp = BTreeNode()
            self.root = temp
            temp.child.insert(0, root)
            self.split_child(temp, 0)
            self.insert_non_full(temp, k)
        else:
            self.insert_non_full(root, k)

    # Insert into a node that is not full
    def insert_non_full(self, x, k):
        i = len(x.keys) - 1
        if x.leaf:
            x.keys.append((None, None))
            while i >= 0 and k[0] < x.keys[i][0]:
                x.keys[i + 1] = x.keys[i]
                i -= 1
            x.keys[i + 1] = k
        else:
            while i >= 0 and k[0] < x.keys[i][0]:
                i -= 1
            i += 1
            if len(x.child[i].keys) == (2 * self.t) - 1:
                self.split_child(x, i)
                if k[0] > x.keys[i][0]:
                    i += 1
            self.insert_non_full(x.child[i], k)

    # Split the i-th child of node x
    def split_child(self, x, i):
        t = self.t
        y = x.child[i]
        z = BTreeNode(y.leaf)
        x.child.insert(i + 1, z)
        x.keys.insert(i, y.keys[t - 1])
        z.keys = y.keys[t: (2 * t) - 1]
        y.keys = y.keys[0: t - 1]
        if not y.leaf:
            z.child = y.child[t: 2 * t]
            y.child = y.child[0: t]   # y keeps the first t children (one more child than keys)

    # Delete a key from the B tree
    # (the helper routines delete_internal_node, delete_sibling and
    # delete_merge are referenced below but not included in this listing)
    def delete(self, x, k):
        t = self.t
        i = 0
        while i < len(x.keys) and k[0] > x.keys[i][0]:
            i += 1
        if x.leaf:
            if i < len(x.keys) and x.keys[i][0] == k[0]:
                x.keys.pop(i)
                return
            return
        if i < len(x.keys) and x.keys[i][0] == k[0]:
            return self.delete_internal_node(x, k, i)
        elif len(x.child[i].keys) >= t:
            self.delete(x.child[i], k)
        else:
            if i != 0 and i + 2 < len(x.child):
                if len(x.child[i - 1].keys) >= t:
                    self.delete_sibling(x, i, i - 1)
                elif len(x.child[i + 1].keys) >= t:
                    self.delete_sibling(x, i, i + 1)
                else:
                    self.delete_merge(x, i, i + 1)
            elif i == 0:
                if len(x.child[i + 1].keys) >= t:
                    self.delete_sibling(x, i, i + 1)
                else:
                    self.delete_merge(x, i, i + 1)
            elif i + 1 == len(x.child):
                if len(x.child[i - 1].keys) >= t:
                    self.delete_sibling(x, i, i - 1)
                else:
                    self.delete_merge(x, i, i - 1)
            self.delete(x.child[i], k)

    # Display the B tree, one node per line
    def display(self, x, l=0):
        for i in x.keys:
            print(i, end=" ")
        print()
        l += 1
        if len(x.child) > 0:
            for i in x.child:
                self.display(i, l)

B = BTree(3)
for i in range(10):
    B.insert((i, 2 * i))
print("Insertion Done")
print("BTree elements before deletion: ")
B.display(B.root)
B.delete(B.root, (8,))
print("BTree elements after deletion: ")
B.display(B.root)
Output
Insertion Done
BTree elements before deletion:
(2, 4) (5, 10)
(0, 0) (1, 2)
(3, 6) (4, 8)
(6, 12) (7, 14) (8, 16) (9, 18)
BTree elements after deletion:
(2, 4) (5, 10)
(0, 0) (1, 2)
(3, 6) (4, 8)
(6, 12) (7, 14) (9, 18)