
AVL tree

In computer science, an AVL tree (named after inventors Adelson-Velsky and Landis) is a self-balancing binary search tree. It was the first such data structure to be invented.[2] In an AVL tree, the heights of the two child subtrees of any node differ by at most one; if at any time they differ by more than one, rebalancing is done to restore this property. Lookup, insertion, and deletion all take O(log n) time in both the average and worst cases, where n is the number of nodes in the tree prior to the operation. Insertions and deletions may require the tree to be rebalanced by one or more tree rotations.
AVL tree
Type: tree
Invented: 1962
Invented by: Georgy Adelson-Velsky and Evgenii Landis

Time complexity in big O notation

Algorithm   Average         Worst case
Space       O(n)            O(n)
Search      O(log n)[1]     O(log n)[1]
Insert      O(log n)[1]     O(log n)[1]
Delete      O(log n)[1]     O(log n)[1]

Animation showing the insertion of several elements into an AVL tree. It includes left, right, left-right and right-left rotations.

Fig. 1: AVL tree with balance factors (green)

The AVL tree is named after its two Soviet inventors, Georgy Adelson-Velsky and Evgenii Landis, who published it in their 1962 paper "An algorithm for the organization of information".[3]

AVL trees are often compared with red–black trees because both support the same set of operations and take O(log n) time for the basic operations. For lookup-intensive applications, AVL trees are faster than red–black trees because they are more strictly balanced.[4] Similar to red–black trees, AVL trees are height-balanced. Both are, in general, neither weight-balanced nor μ-balanced for any μ ≤ 1⁄2;[5] that is, sibling nodes can have hugely differing numbers of descendants.

Definition

Balance factor

In a binary tree the balance factor of a node X is defined to be the height difference

BF(X) := Height(RightSubtree(X)) − Height(LeftSubtree(X))[6]

of its two child sub-trees. A binary tree is defined to be an AVL tree if the invariant

BF(X) ∈ {−1, 0, +1}[7]

holds for every node X in the tree.

A node X with BF(X) < 0 is called "left-heavy", one with BF(X) > 0 is called "right-heavy", and one with BF(X) = 0 is sometimes simply called "balanced".
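These definitions translate directly into code. The following is a minimal C sketch, not part of the article; the node type and helper names are illustrative assumptions, and it stores heights explicitly, whereas the article's example code further below stores balance factors instead:

#include <stdlib.h>

typedef struct node {
    int key;
    int height;               /* height in edges; a single node has height 0 */
    struct node *parent;
    struct node *left, *right;
} node;

/* Height of a possibly empty subtree; the empty tree counts as -1. */
static int height(const node *n) {
    return n != NULL ? n->height : -1;
}

/* BF(X) := Height(RightSubtree(X)) - Height(LeftSubtree(X)) */
static int balance_factor(const node *n) {
    return height(n->right) - height(n->left);
}

/* AVL invariant: BF(X) is in {-1, 0, +1} for every node X. */
static int is_avl(const node *n) {
    if (n == NULL)
        return 1;
    int bf = balance_factor(n);
    return -1 <= bf && bf <= +1 && is_avl(n->left) && is_avl(n->right);
}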

Remark

In what follows, because there is a one-to-one correspondence between nodes and the sub-trees rooted by them, the name of an object is sometimes used to refer to the node and sometimes used to refer to the sub-tree.

Properties

Balance factors can be kept up-to-date by knowing the previous balance factors and the change in height – it is not necessary to know the absolute height. For holding the AVL balance information in the traditional way, two bits per node are sufficient. However, later research showed that if the AVL tree is implemented as a rank balanced tree with delta ranks allowed of 1 or 2 – with delta rank meaning "when going upward there is an additional increment in height of one or two" – this can be done with one bit.

The height h (counted as number of edges on the longest path) of an AVL tree with n nodes lies in the interval:[8]

log2(n+1) − 1 ≤ h < c log2(n+2) + b

with the golden ratio φ := (1+√5)⁄2 ≈ 1.6180, c := 1⁄log2 φ ≈ 1.4405, and b := (c⁄2) log2 5 − 3 ≈ −1.3277. This is because an AVL tree of height h contains at least Fh+2 − 1 nodes, where {Fh} is the Fibonacci sequence with the seed values F1 = 1, F2 = 2.
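As a numeric illustration (this example is not from the article): for n = 1,000,000 nodes the bound gives h < 1.4405 × log2(1,000,002) − 1.3277 ≈ 27.4, so the height is at most 27, compared with 19 for a perfectly balanced binary tree. A small C sketch evaluating the bound, with the hypothetical helper name avl_max_height:

#include <math.h>
#include <stdio.h>

/* Evaluate the bound h < c*log2(n+2) + b for the height of an
   AVL tree with n nodes, using the constants defined above. */
static int avl_max_height(unsigned long n) {
    const double phi = (1.0 + sqrt(5.0)) / 2.0;  /* golden ratio */
    const double c = 1.0 / log2(phi);            /* ~1.4405 */
    const double b = c / 2.0 * log2(5.0) - 3.0;  /* ~-1.3277 */
    /* largest integer strictly below the bound */
    return (int)ceil(c * log2((double)n + 2.0) + b) - 1;
}

int main(void) {
    printf("%d\n", avl_max_height(1000000)); /* prints 27 */
    return 0;
}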

Operations
Read-only operations of an AVL tree involve
carrying out the same actions as would be
carried out on an unbalanced binary search
tree, but modifications have to observe and
restore the height balance of the sub-trees.

Searching

Searching for a specific key in an AVL tree can be done the same way as that of any balanced or unbalanced binary search tree.[9]:ch. 8 In order for search to work effectively it has to employ a comparison function which establishes a total order (or at least a total preorder) on the set of keys.[10]:23 The number of comparisons required for successful search is limited by the height h and for unsuccessful search is very close to h, so both are in O(log n).[11]:216
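As a concrete illustration, a lookup routine in C (a sketch, not the article's code, reusing the node type sketched in the Definition section and assuming integer keys):

/* Ordinary BST descent; the AVL invariant guarantees the path
   has length at most h, i.e. O(log n) comparisons. */
node *search(node *root, int key) {
    while (root != NULL) {
        if (key < root->key)
            root = root->left;
        else if (key > root->key)
            root = root->right;
        else
            return root; /* successful search */
    }
    return NULL; /* unsuccessful search */
}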

Traversal

Once a node has been found in an AVL tree, the next or previous node can be accessed in amortized constant time. Some instances of exploring these "nearby" nodes require traversing up to h ∝ log(n) links (particularly when navigating from the rightmost leaf of the root's left subtree to the root or from the root to the leftmost leaf of the root's right subtree; in the AVL tree of figure 1, moving from node P to the next but one node Q takes 3 steps). However, exploring all n nodes of the tree in this manner would visit each link exactly twice: one downward visit to enter the subtree rooted by that node, another visit upward to leave that node's subtree after having explored it. And since there are n−1 links in any tree, the amortized cost is 2×(n−1)/n, or approximately 2.
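The step to the next node can be made concrete with parent pointers. A C sketch (not from the article; it reuses the node type sketched above, whose parent field matches the parent(X) accessor used by the article's example code):

/* In-order successor: either the leftmost node of the right subtree,
   or the nearest ancestor whose left subtree contains x. Over a full
   traversal every link is used at most twice, so the amortized cost
   per step is O(1) even though a single step may take O(log n). */
node *successor(node *x) {
    if (x->right != NULL) {
        x = x->right;
        while (x->left != NULL)
            x = x->left;
        return x;
    }
    while (x->parent != NULL && x == x->parent->right)
        x = x->parent;
    return x->parent; /* NULL if x was the rightmost node */
}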

Insert

When inserting an element into an AVL tree, you initially follow the same process as inserting into a binary search tree. More explicitly: in case a preceding search has not been successful, the search routine returns the tree itself with indication EMPTY and the new node is inserted as root. Or, if the tree has not been empty, the search routine returns a node and a direction (left or right) where the returned node does not have a child. Then the node to be inserted is made a child of the returned node in the returned direction.

After this insertion it is necessary to check each of the node's ancestors for consistency with the invariants of AVL trees: this is called "retracing". This is achieved by considering the balance factor of each node.[12][13]

Since with a single insertion the height of an AVL subtree cannot increase by more than one, the temporary balance factor of a node after an insertion will be in the range [−2,+2]. For each node checked, if the temporary balance factor remains in the range from −1 to +1 then only an update of the balance factor and no rotation is necessary. However, if the temporary balance factor becomes less than −1 or greater than +1, the subtree rooted at this node is AVL unbalanced, and a rotation is needed.[10]:52 With insertion, as the code below shows, the adequate rotation immediately rebalances the tree perfectly.

In figure 1, by inserting the new node Z as a child of node X, the height of the subtree rooted at Z increases from 0 to 1.

Invariant of the retracing loop for an insertion:

The height of the subtree rooted by Z has increased by 1. It is already in AVL shape.
Example code for an insert operation

for (X = parent(Z); X != null; X = parent(Z)) { // Loop (possibly up to the root)
    // BalanceFactor(X) has to be updated:
    if (Z == right_child(X)) { // The right subtree increases
        if (BalanceFactor(X) > 0) { // X is right-heavy
            // ===> the temporary BalanceFactor(X) == +2
            // ===> rebalancing is required.
            G = parent(X); // Save parent of X around rotations
            if (BalanceFactor(Z) < 0)       // Right Left Case  (see figure 5)
                N = rotate_RightLeft(X, Z); // Double rotation: Right(Z) then Left(X)
            else                            // Right Right Case (see figure 4)
                N = rotate_Left(X, Z);      // Single rotation Left(X)
            // After rotation adapt parent link
        } else {
            if (BalanceFactor(X) < 0) {
                BalanceFactor(X) = 0; // Z's height increase is absorbed at X.
                break; // Leave the loop
            }
            BalanceFactor(X) = +1;
            Z = X; // Height(Z) increases by 1
            continue;
        }
    } else { // Z == left_child(X): the left subtree increases
        if (BalanceFactor(X) < 0) { // X is left-heavy
            // ===> the temporary BalanceFactor(X) == -2
            // ===> rebalancing is required.
            G = parent(X); // Save parent of X around rotations
            if (BalanceFactor(Z) > 0)       // Left Right Case
                N = rotate_LeftRight(X, Z); // Double rotation: Left(Z) then Right(X)
            else                            // Left Left Case
                N = rotate_Right(X, Z);     // Single rotation Right(X)
            // After rotation adapt parent link
        } else {
            if (BalanceFactor(X) > 0) {
                BalanceFactor(X) = 0; // Z's height increase is absorbed at X.
                break; // Leave the loop
            }
            BalanceFactor(X) = -1;
            Z = X; // Height(Z) increases by 1
            continue;
        }
    }
    // After a rotation adapt parent link:
    // N is the new root of the rotated subtree
    // Height does not change: Height(N) == old Height(X)
    parent(N) = G;
    if (G != null) {
        if (X == left_child(G))
            left_child(G) = N;
        else
            right_child(G) = N;
    } else
        tree->root = N; // N is the new root of the total tree
    break;
    // There is no fall thru, only break; or continue;
}
// Unless the loop is left via break, the height of the total tree increases by 1.

In order to update the balance factors of all nodes, first observe that all nodes requiring correction lie from child to parent along the path of the inserted leaf. If the above procedure is applied to nodes along this path, starting from the leaf, then every node in the tree will again have a balance factor of −1, 0, or 1.

The retracing can stop if the balance factor becomes 0, implying that the height of that subtree remains unchanged.

If the balance factor becomes ±1 then the height of the subtree increases by one and the retracing needs to continue.

If the balance factor temporarily becomes ±2, this has to be repaired by an appropriate rotation after which the subtree has the same height as before (and its root the balance factor 0).

The time required is O(log n) for lookup, plus a maximum of O(log n) retracing levels (O(1) on average) on the way back to the root, so the operation can be completed in O(log n) time.[10]:53

Delete

The preliminary steps for deleting a node are described in section Binary search tree#Deletion. There, the effective deletion of the subject node or the replacement node decreases the height of the corresponding child tree either from 1 to 0 or from 2 to 1, if that node had a child.

Starting at this subtree, it is necessary to check each of the ancestors for consistency with the invariants of AVL trees. This is called "retracing".

Since with a single deletion the height of an AVL subtree cannot decrease by more than one, the temporary balance factor of a node will be in the range from −2 to +2. If the balance factor remains in the range from −1 to +1 it can be adjusted in accord with the AVL rules. If it becomes ±2 then the subtree is unbalanced and needs to be rotated. (Unlike insertion where a rotation always balances the tree, after delete there may be BF(Z) ≠ 0 (see figs. 4 and 5), so that after the appropriate single or double rotation the height of the rebalanced subtree decreases by one, meaning that the tree has to be rebalanced again on the next higher level.) The various cases of rotations are described in section Rebalancing.

Invariant of the retracing loop for a deletion:

The height of the subtree rooted by N has decreased by 1. It is already in AVL shape.
Example code for a delete operation

for (X = parent(N); X != null; X = G) { // Loop (possibly up to the root)
    G = parent(X); // Save parent of X around rotations
    // BalanceFactor(X) has not yet been updated!
    if (N == left_child(X)) { // the left subtree decreases
        if (BalanceFactor(X) > 0) { // X is right-heavy
            // ===> the temporary BalanceFactor(X) == +2
            // ===> rebalancing is required.
            Z = right_child(X); // Sibling of N (higher by 2)
            b = BalanceFactor(Z);
            if (b < 0)                      // Right Left Case  (see figure 5)
                N = rotate_RightLeft(X, Z); // Double rotation: Right(Z) then Left(X)
            else                            // Right Right Case (see figure 4)
                N = rotate_Left(X, Z);      // Single rotation Left(X)
            // After rotation adapt parent link
        } else {
            if (BalanceFactor(X) == 0) {
                BalanceFactor(X) = +1; // N's height decrease is absorbed at X.
                break; // Leave the loop
            }
            N = X;
            BalanceFactor(N) = 0; // Height(N) decreases by 1
            continue;
        }
    } else { // (N == right_child(X)): The right subtree decreases
        if (BalanceFactor(X) < 0) { // X is left-heavy
            // ===> the temporary BalanceFactor(X) == -2
            // ===> rebalancing is required.
            Z = left_child(X); // Sibling of N (higher by 2)
            b = BalanceFactor(Z);
            if (b > 0)                      // Left Right Case
                N = rotate_LeftRight(X, Z); // Double rotation: Left(Z) then Right(X)
            else                            // Left Left Case
                N = rotate_Right(X, Z);     // Single rotation Right(X)
            // After rotation adapt parent link
        } else {
            if (BalanceFactor(X) == 0) {
                BalanceFactor(X) = -1; // N's height decrease is absorbed at X.
                break; // Leave the loop
            }
            N = X;
            BalanceFactor(N) = 0; // Height(N) decreases by 1
            continue;
        }
    }
    // After a rotation adapt parent link:
    // N is the new root of the rotated subtree
    parent(N) = G;
    if (G != null) {
        if (X == left_child(G))
            left_child(G) = N;
        else
            right_child(G) = N;
        if (b == 0)
            break; // Height does not change: Leave the loop
    } else {
        tree->root = N; // N is the new root of the total tree
    }
    // Height(N) decreases by 1 (== old Height(X)-1)
}
// Unless the loop is left via break, the height of the total tree decreases by 1.

The retracing can stop if the balance factor becomes ±1 (it must have been 0), meaning that the height of that subtree remains unchanged.

If the balance factor becomes 0 (it must have been ±1) then the height of the subtree decreases by one and the retracing needs to continue.

If the balance factor temporarily becomes ±2, this has to be repaired by an appropriate rotation. It depends on the balance factor of the sibling Z (the higher child tree in fig. 4) whether the height of the subtree decreases by one (and the retracing needs to continue) or does not change (if Z has the balance factor 0), in which case the whole tree is in AVL shape.

The time required is O(log n) for lookup, plus a maximum of O(log n) retracing levels (O(1) on average) on the way back to the root, so the operation can be completed in O(log n) time.

Set operations and bulk operations

In addition to the single-element insert, delete and lookup operations, several set operations have been defined on AVL trees: union, intersection and set difference. Fast bulk operations on insertions or deletions can then be implemented based on these set functions. These set operations rely on two helper operations, Split and Join. With the new operations, the implementation of AVL trees can be more efficient and highly parallelizable.[14]

Join: The function Join takes two AVL trees t1 and t2 and a key k, and returns a tree containing all elements in t1 and t2 as well as k. It requires k to be greater than all keys in t1 and smaller than all keys in t2. If the two trees differ in height by at most one, Join simply creates a new node with left subtree t1, root k and right subtree t2. Otherwise, suppose that t1 is higher than t2 by more than one (the other case is symmetric). Join follows the right spine of t1 until a node c which is balanced with t2. At this point a new node with left child c, root k and right child t2 is created to replace c. The new node satisfies the AVL invariant, and its height is one greater than c. The increase in height can increase the height of its ancestors, possibly invalidating the AVL invariant of those nodes. This can be fixed either with a double rotation if invalid at the parent or a single left rotation if invalid higher in the tree, in both cases restoring the height for any further ancestor nodes. Join will therefore require at most two rotations. The cost of this function is the difference of the heights between the two input trees.
Pseudocode implementation for the join algorithm

function joinRightAVL(TL, k, TR)
    (l, k', c) = expose(TL)
    if (h(c) <= h(TR)+1)
        T' = Node(c, k, TR)
        if (h(T') <= h(l)+1) then return Node(l, k', T')
        else return rotateLeft(Node(l, k', rotateRight(T')))
    else
        T' = joinRightAVL(c, k, TR)
        T'' = Node(l, k', T')
        if (h(T'') <= h(l)+1) return T''
        else return rotateLeft(T'')
function joinLeftAVL(TL, k, TR)
    /* symmetric to joinRightAVL */
function join(TL, k, TR)
    if (h(TL) > h(TR)+1) return joinRightAVL(TL, k, TR)
    if (h(TR) > h(TL)+1) return joinLeftAVL(TL, k, TR)
    return Node(TL, k, TR)

Here h(x) means the height of a node or subtree x. expose(v) = (l, k, r) means to extract a tree node v's left child l, the key k of the node, and the right child r. Node(l, k, r) means to create a node with left child l, key k, and right child r.

Split: To split an AVL tree into two smaller trees, those smaller than key x and those larger than key x, first draw a path from the root by inserting x into the AVL tree. After this insertion, all values less than x will be found on the left of the path, and all values greater than x will be found on the right. By applying Join, all the subtrees on the left side are merged bottom-up using keys on the path as intermediate nodes from bottom to top to form the left tree, and the right part is symmetric. The cost of Split is O(log n), of the order of the height of the tree.
Pseudocode implementation for the split algorithm

function split(T, k)
    if (T = nil) return (nil, false, nil)
    (L, (m, c), R) = expose(T)
    if (k = m) return (L, true, R)
    if (k < m)
        (L', b, R') = split(L, k)
        return (L', b, join(R', m, R))
    if (k > m)
        (L', b, R') = split(R, k)
        return (join(L, m, L'), b, R')
The union of two AVLs t1 and t2 representing
sets A and B, is an AVL t that represents
A ∪ B.
Pseudocode implementation for the union algorithm

function union(t1, t2):
    if t1 = nil:
        return t2
    if t2 = nil:
        return t1
    (t<, t>) ← split t2 on t1.root
    return join(t1.root, union(left(t1), t<), union(right(t1), t>))

Here, Split is presumed to return two trees: one holding the keys less than its input key, one holding the greater keys. (The algorithm is non-destructive, but an in-place destructive version exists as well.)

The algorithm for intersection or difference is similar, but requires the Join2 helper routine that is the same as Join but without the middle key. Based on the new functions for union, intersection or difference, either one key or multiple keys can be inserted to or deleted from the AVL tree. Since Split calls Join but does not deal with the balancing criteria of AVL trees directly, such an implementation is usually called the "join-based" implementation.

The complexity of each of union, intersection and difference is O(m log(n⁄m + 1)) for AVL trees of sizes m and n (with m ≤ n). More importantly, since the recursive calls to union, intersection or difference are independent of each other, they can be executed in parallel with a parallel depth O(log m log n).[14] When m = 1, the join-based implementation has the same computational DAG as single-element insertion and deletion.

Rebalancing
If during a modifying operation (e.g. insert,
delete) a (temporary) height difference of
more than one arises between two child
subtrees, the parent subtree has to be
"rebalanced". The given repair tools are the so-
called tree rotations, because they move the
keys only "vertically", so that the ("horizontal")
in-order sequence of the keys is fully
preserved (which is essential for a binary-
search tree).[12][13]

Let X be the node that has a (temporary) balance factor of −2 or +2. Its left or right subtree was modified. Let Z be the higher child. Note that Z is in AVL shape by induction hypothesis.

In case of insertion this insertion has happened to one of Z's children in a way that Z's height has increased. In case of deletion this deletion has happened to the sibling t1 of Z in a way so that t1's height, being already lower, has decreased. (In that case Z's balance factor may be 0.)

There are four situations that might arise. We will describe them as Dir1 Dir2, where Dir1 comes from the set { left, right } and Dir2 as a balance factor comes from the set { left-heavy = −1, balanced = 0, right-heavy = +1 }.[15]

Situation Dir1 Dir2 denotes:
Z is a Dir1 child of its parent and
Z is Dir2-heavy if Dir2 != Dir1
Z is not (−Dir2)-heavy if Dir2 == Dir1

That is:
Right Right => Z is a right child of its parent X and Z is not left-heavy (i.e. BalanceFactor(Z) ≥ 0) (see figure 4)
Left Left   => Z is a left child of its parent X and Z is not right-heavy (i.e. BalanceFactor(Z) ≤ 0)
Right Left  => Z is a right child of its parent X and Z is left-heavy (i.e. BalanceFactor(Z) = −1) (see figure 5)
Left Right  => Z is a left child of its parent X and Z is right-heavy (i.e. BalanceFactor(Z) = +1)

The balance violation of case Dir1 == Dir2 is repaired by a simple rotation rotate_(−Dir1) (rotate_Left in figure 4 resp. its mirror rotate_Right).

The case Dir1 != Dir2 is repaired by a double rotation rotate_(−Dir2)(−Dir1) == rotate_Dir1Dir2 (rotate_RightLeft in figure 5 resp. its mirror rotate_LeftRight).

The cost of a rotation, both simple and double, is constant.
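The four cases can be dispatched in one routine. The following sketch uses the article's pseudo-C conventions (BalanceFactor, right_child, and the rotation routines shown below together with their mirrors rotate_Right and rotate_LeftRight); it is illustrative and not part of the article's example code:

// Rebalance a node X with temporary balance factor -2 or +2,
// whose higher child is Z; returns the new root of the subtree.
node *rebalance(node *X, node *Z) {
    if (Z == right_child(X)) {              // Dir1 == right
        if (BalanceFactor(Z) < 0)           // Right Left case
            return rotate_RightLeft(X, Z);
        else                                // Right Right case
            return rotate_Left(X, Z);
    } else {                                // Dir1 == left
        if (BalanceFactor(Z) > 0)           // Left Right case
            return rotate_LeftRight(X, Z);
        else                                // Left Left case
            return rotate_Right(X, Z);
    }
}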

Simple rotation
Figure 4 shows a Right Right situation. In its upper half, node X has two child trees differing in height by 2, i.e. a balance factor of +2. Moreover, the inner child t23 of Z (i.e., left child when Z is right child resp. right child when Z is left child) is not higher than its sibling t4. This can happen by a height increase of subtree t4 or by a height decrease of subtree t1. In the latter case, also the pale situation where t23 has the same height as t4 may occur.

The result of the left rotation is shown in the lower half of the figure. Three links (thick edges in figure 4) and two balance factors are to be updated.

As the figure shows, before an insertion, the leaf layer was at level h+1, temporarily at level h+2 and after the rotation again at level h+1. In case of a deletion, the leaf layer was at level h+2, where it is again when t23 and t4 were of same height. Otherwise the leaf layer reaches level h+1, so that the height of the rotated tree decreases.

Fig. 4: Simple rotation rotate_Left(X,Z)
Code snippet of a simple left rotation

Input:  X = root of subtree to be rotated left
        Z = right child of X, Z is right-heavy
            with height == Height(LeftSubtree(X))+2
Result: new root of rebalanced subtree

node *rotate_Left(node *X, node *Z) {
    // Z is by 2 higher than its sibling
    t23 = left_child(Z); // Inner child of Z
    right_child(X) = t23;
    if (t23 != null)
        parent(t23) = X;
    left_child(Z) = X;
    parent(X) = Z;
    // 1st case, BalanceFactor(Z) == 0,
    //   only happens with deletion, not insertion:
    if (BalanceFactor(Z) == 0) { // t23 has been of same height as t4
        BalanceFactor(X) = +1;   // t23 now higher
        BalanceFactor(Z) = -1;   // t4 now lower than X
    } else { // 2nd case happens with insertion or deletion:
        BalanceFactor(X) = 0;
        BalanceFactor(Z) = 0;
    }
    return Z; // return new root of rotated subtree
}

Double rotation

Figure 5 shows a Right Left situation. In its upper third, node X has two child trees differing in height by 2, i.e. a balance factor of +2. But unlike figure 4, the inner child Y of Z is higher than its sibling t4. This can happen by the insertion of Y itself or a height increase of one of its subtrees t2 or t3 (with the consequence that they are of different height) or by a height decrease of subtree t1. In the latter case, it may also occur that t2 and t3 are of same height.

The result of the first, the right, rotation is shown in the middle third of the figure. (With respect to the balance factors, this rotation is not of the same kind as the other AVL single rotations, because the height difference between Y and t4 is only 1.) The result of the final left rotation is shown in the lower third of the figure. Five links (thick edges in figure 5) and three balance factors are to be updated.

As the figure shows, before an insertion, the leaf layer was at level h+1, temporarily at level h+2 and after the double rotation again at level h+1. In case of a deletion, the leaf layer was at level h+2 and after the double rotation it is at level h+1, so that the height of the rotated tree decreases.

Fig. 5: Double rotation rotate_RightLeft(X,Z) = rotate_Right around Z followed by rotate_Left around X

Code snippet of a right-left double rotation

Input:  X = root of subtree to be rotated
        Z = its right child, left-heavy
            with height == Height(LeftSubtree(X))+2
Result: new root of rebalanced subtree

node *rotate_RightLeft(node *X, node *Z) {
    // Z is by 2 higher than its sibling
    Y = left_child(Z); // Inner child of Z
    // Y is by 1 higher than sibling
    t3 = right_child(Y);
    left_child(Z) = t3;
    if (t3 != null)
        parent(t3) = Z;
    right_child(Y) = Z;
    parent(Z) = Y;
    t2 = left_child(Y);
    right_child(X) = t2;
    if (t2 != null)
        parent(t2) = X;
    left_child(Y) = X;
    parent(X) = Y;
    if (BalanceFactor(Y) > 0) { // t3 was higher
        BalanceFactor(X) = -1;  // t1 now higher
        BalanceFactor(Z) = 0;
    } else if (BalanceFactor(Y) == 0) {
        BalanceFactor(X) = 0;
        BalanceFactor(Z) = 0;
    } else { // t2 was higher
        BalanceFactor(X) = 0;
        BalanceFactor(Z) = +1; // t4 now higher
    }
    BalanceFactor(Y) = 0;
    return Y; // return new root of rotated subtree
}

Comparison to other structures


Both AVL trees and red–black (RB) trees are
self-balancing binary search trees and they
are related mathematically. Indeed, every AVL
tree can be colored red–black,[16] but there are
RB trees which are not AVL balanced. For
maintaining the AVL resp. RB tree's invariants,
rotations play an important role. In the worst
case, even without rotations, AVL or RB
insertions or deletions require O(log n)
inspections and/or updates to AVL balance
factors resp. RB colors. RB insertions and
deletions and AVL insertions require from zero
to three tail-recursive rotations and run in
amortized O(1) time,[17][18] thus equally
constant on average. AVL deletions requiring
O(log n) rotations in the worst case are also
O(1) on average. RB trees require storing one
bit of information (the color) in each node,
while AVL trees mostly use two bits for the
balance factor, although, when stored at the
children, one bit with meaning «lower than
sibling» suffices. The bigger difference
between the two data structures is their height
limit.

For a tree of size n ≥ 1:

an AVL tree's height is at most c log2(n+2) + b, where the golden ratio φ := (1+√5)⁄2 ≈ 1.6180, c := 1⁄log2 φ ≈ 1.4405, and b := (c⁄2) log2 5 − 3 ≈ −1.3277;
an RB tree's height is at most 2 log2(n+1).[19]

AVL trees are more rigidly balanced than RB trees with an asymptotic relation AVL⁄RB ≈ 0.720 of the maximal heights. For insertions and deletions, Ben Pfaff shows in 79 measurements a relation of AVL⁄RB between 0.677 and 1.077 with median ≈ 0.947 and geometric mean ≈ 0.910.[20]

See also
Trees
Tree rotation
WAVL tree
Red–black tree
Splay tree
Scapegoat tree
B-tree
T-tree
List of data structures

References
1. Eric Alexander. "AVL Trees" .
2. Sedgewick, Robert (1983). "Balanced
Trees". Algorithms . Addison-Wesley.
p. 199 . ISBN 0-201-06672-6.
3. Adelson-Velsky, Georgy; Landis, Evgenii
(1962). "An algorithm for the organization
of information". Proceedings of the USSR
Academy of Sciences (in Russian). 146:
263–266. English translation by Myron J.
Ricci in Soviet Mathematics - Doklady,
3:1259–1263, 1962.
4. Pfaff, Ben (June 2004). "Performance
Analysis of BSTs in System Software"
(PDF). Stanford University.
5. AVL trees are not weight-balanced? (meaning: AVL trees are not μ-balanced?) Thereby: A binary tree is called μ-balanced, with 0 ≤ μ ≤ 1⁄2, if for every node N the inequality 1⁄2 − μ ≤ (|Nl| + 1)⁄(|N| + 1) ≤ 1⁄2 + μ holds and μ is minimal with this property, where |N| is the number of nodes below the tree with N as root (including the root) and Nl is the left child node of N.
6. Knuth, Donald E. (2000). Sorting and
searching (2. ed., 6. printing, newly
updated and rev. ed.). Boston [u.a.]:
Addison-Wesley. p. 459. ISBN 0-201-
89685-0.
7. Rajinikanth. "AVL Tree :: Data Structures" .
btechsmartclass.com. Retrieved
2018-03-09.
8. Knuth, Donald E. (2000). Sorting and
searching (2. ed., 6. printing, newly
updated and rev. ed.). Boston [u.a.]:
Addison-Wesley. p. 460. ISBN 0-201-
89685-0.
Knuth has internal nodes and external nodes; the former correspond to the article's key-carrying nodes, whereas Knuth's external nodes (which do not carry a key) have no correspondence in the article. Nevertheless, Knuth's external nodes increase the tree's height by 1 (see Fig. 20), an incrementation which the article does not follow. In the end, with the article's notion of height, the tree consisting of the root only has height 0, so that F0+2 − 1 = 1 is the number of its nodes.

9. Dixit, J. B. (2010). Mastering data structures through 'C' language. New Delhi, India: University Science Press, an imprint of Laxmi Publications Pvt. Ltd. ISBN 9789380386720. OCLC 939446542.
10. Brass, Peter (2008). Advanced data
structures . Cambridge: Cambridge
University Press. ISBN 9780511438202.
OCLC 312435417 .
11. Hubbard, John Rast (2000). Schaum's
outline of theory and problems of data
structures with Java . New York: McGraw-
Hill. ISBN 0071378707. OCLC 48139308 .
12. Knuth, Donald E. (2000). Sorting and
searching (2. ed., 6. printing, newly
updated and rev. ed.). Boston [u.a.]:
Addison-Wesley. pp. 458–481.
ISBN 0201896850.
13. Pfaff, Ben (2004). An Introduction to
Binary Search Trees and Balanced Trees.
Free Software Foundation, Inc. pp. 107–
138.
14. Blelloch, Guy E.; Ferizovic, Daniel; Sun,
Yihan (2016), "Just join for parallel
ordered sets", Symposium on Parallel
Algorithms and Architectures , ACM,
pp. 253–264, arXiv:1602.02120 ,
doi:10.1145/2935764.2935768 ,
ISBN 978-1-4503-4210-0.
15. Thereby, rotations with case Balanced do
not occur with insertions.
16. Paul E. Black (2015-04-13). "AVL tree" .
Dictionary of Algorithms and Data
Structures. National Institute of Standards
and Technology. Retrieved 2016-07-02.
17. Mehlhorn & Sanders 2008, pp. 165, 158
18. Dinesh P. Mehta, Sartaj Sahni (Ed.)
Handbook of Data Structures and
Applications 10.4.2
19. Red–black tree#Proof of asymptotic
bounds
20. Ben Pfaff: Performance Analysis of BSTs
in System Software. Stanford University
2004.

Further reading
Donald Knuth. The Art of Computer
Programming, Volume 3: Sorting and
Searching, Third Edition. Addison-Wesley,
1997. ISBN 0-201-89685-0. Pages 458–475
of section 6.2.3: Balanced Trees.

External links

The Wikibook Algorithm Implementation has a page on the topic of: AVL tree

Wikimedia Commons has media related to AVL trees.

This article incorporates public domain material from the NIST document: Black, Paul E. "AVL Tree". Dictionary of Algorithms and Data Structures.
Retrieved from "https://en.wikipedia.org/w/index.php?
title=AVL_tree&oldid=939147493"

Content is available under CC BY-SA 3.0 unless otherwise noted.