AVL tree

AVL tree
Type            Tree
Invented        1962
Invented by     Georgy Adelson-Velsky and Evgenii Landis

Time complexity in big O notation
          Average        Worst case
Space     O(n)           O(n)
Search    O(log n)[1]    O(log n)[1]
Insert    O(log n)[1]    O(log n)[1]
Delete    O(log n)[1]    O(log n)[1]

[Figure: Example AVL tree]

In computer science, an AVL tree (Georgy Adelson-Velsky and Evgenii Landis' tree, named after the inventors) is a self-balancing binary search tree. It was the first such data structure to be invented.[2] In an AVL tree, the heights of the two child subtrees of any node differ by at most one; if at any time they differ by more than one, rebalancing is done to restore this property. Lookup, insertion, and deletion all take O(log n) time in both the average and worst cases, where n is the number of nodes in the tree prior to the operation. Insertions and deletions may require the tree to be rebalanced by one or more tree rotations.

The AVL tree is named after its two Soviet inventors, Georgy Adelson-Velsky and Evgenii Landis, who published it in their 1962 paper "An algorithm for the organization of information".[3]

AVL trees are often compared with red-black trees because both support the same set of operations and take O(log n) time for the basic operations. For lookup-intensive applications, AVL trees are faster than red-black trees because they are more rigidly balanced.[4] Similar to red-black trees, AVL trees are height-balanced. Neither is, in general, weight-balanced nor \mu-balanced for any \mu\le\tfrac12;[5] that is, sibling nodes can have hugely differing numbers of descendants.

Operations

Tree rotations

Basic operations of an AVL tree involve carrying out the same actions as would be carried out on an unbalanced binary search tree, but modifications are followed by zero or more operations called tree rotations, which help to restore the height balance of the subtrees.
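
In code, each rotation is a constant-time pointer rearrangement. The following C sketch (hypothetical helper names throughout; it stores each node's subtree height instead of a balance factor, an equivalent and common bookkeeping) shows the two single rotations; the double rotations used later are compositions of these:

 #include <stddef.h>
 
 typedef struct Node {
     int key;
     int height;                // height of the subtree rooted here; 1 for a leaf
     struct Node *left, *right;
 } Node;
 
 static int height(const Node *n) { return n != NULL ? n->height : 0; }
 static int max(int a, int b) { return a > b ? a : b; }
 
 static void update_height(Node *n) {
     n->height = 1 + max(height(n->left), height(n->right));
 }
 
 // Right rotation: the left child N becomes the new root of the subtree.
 static Node *rotate_right(Node *p) {
     Node *n = p->left;
     p->left = n->right;        // N's right subtree moves under P
     n->right = p;
     update_height(p);          // P now hangs below N, so recompute P first
     update_height(n);
     return n;                  // the caller re-links N in place of P
 }
 
 // Left rotation: the mirror image.
 static Node *rotate_left(Node *p) {
     Node *n = p->right;
     p->right = n->left;
     n->left = p;
     update_height(p);
     update_height(n);
     return n;
 }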

Searching

Searching for a specific key in an AVL tree can be done in the same way as in an ordinary (unbalanced) binary search tree, since the balance information is not consulted during a lookup.
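
As a minimal sketch using the height-storing Node type from the rotation example above, lookup is the ordinary binary-search-tree loop:

 // Standard BST lookup; the balance information plays no role here.
 static Node *search(Node *root, int key) {
     while (root != NULL && key != root->key)
         root = key < root->key ? root->left : root->right;
     return root;               // NULL if the key is absent
 }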

Traversal

Once a node has been found in a balanced tree, the next or previous node can be reached in amortized constant time. Some instances of exploring these "nearby" nodes require traversing up to log(n) links (particularly when moving from the rightmost leaf of the root's left subtree to the root, or from the root to the leftmost leaf of the root's right subtree; in the example AVL tree, moving from node 14 to the next-but-one node 19 takes 4 steps). However, exploring all n nodes of the tree in this manner uses each link exactly twice: once to enter the subtree rooted at the node below the link, and once to leave that subtree after having explored it. And since any tree with n nodes has n−1 links, the amortized cost is 2×(n−1)/n, or approximately 2 link traversals per node.
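
A sketch of this successor step, assuming each node additionally carries a parent pointer (not shown in the struct above):

 // In-order successor; assumes a parent pointer in each Node.
 // Worst case O(log n) links, amortized O(1) over a full traversal.
 static Node *successor(Node *x) {
     if (x->right != NULL) {    // descend to the leftmost node of the right subtree
         x = x->right;
         while (x->left != NULL)
             x = x->left;
         return x;
     }
     Node *p = x->parent;       // otherwise climb while we are a right child
     while (p != NULL && x == p->right) {
         x = p;
         p = p->parent;
     }
     return p;                  // NULL if x held the largest key
 }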

Insertion

After inserting a node, it is necessary to check each of the node's ancestors for consistency with the invariants of AVL trees: this is called "retracing". This is achieved by considering the balance factor of each node, which is defined as the difference in heights of the left and right subtrees.

[Figure: Pictorial description of how rotations rebalance a node in an AVL tree. The numbered circles represent the nodes being rebalanced. The lettered triangles represent subtrees which are themselves balanced AVL trees. A blue number next to a node denotes possible balance factors (those in parentheses occurring only in case of deletion).]

Thus the balance factor of any node of an AVL tree is in the integer range [−1,+1]. This balance factor is stored in the node, but may have to be corrected after an insertion or a deletion, which is also done during retracing. Since with a single insertion the height of an AVL subtree cannot increase by more than one, the temporarily recomputed balance factor of a node after an insertion will be in the range [−2,+2]. For each node checked, if the recomputed balance factor remains in the range [−1,+1] no rotations are necessary. However, if the recomputed balance factor becomes less than −1 or greater than +1, the subtree rooted at this node is unbalanced, and a rotation is needed.
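
In the height-storing sketch above, the balance factor can be derived on the fly; the retracing loops below instead keep it stored in each node and update it incrementally:

 // Balance factor = height(left subtree) - height(right subtree);
 // an AVL tree keeps this value in {-1, 0, +1} at every node.
 static int balance_factor(const Node *n) {
     return height(n->left) - height(n->right);
 }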

Description of the Rotations

Let us first assume the balance factor of a node P is 2 (as opposed to the other possible unbalanced value −2). This case is depicted in the left column of the illustration with P:=5. We then look at the left subtree (the higher one) with root N. If this subtree does not lean to the right, i.e. N has balance factor 1 (or, after a deletion, possibly 0), we can rotate the whole tree to the right to get a balanced tree. This is labelled the "Left Left Case" in the illustration with N:=4. If the subtree does lean to the right, i.e. N:=3 has balance factor −1, we first rotate the subtree to the left and end up in the previous case. This second case is labelled the "Left Right Case" in the illustration.

If the balance factor of the node P is −2 (this case is depicted in the right column of the illustration with P:=3), we can mirror the above algorithm. That is, if the root N of the (higher) right subtree has balance factor −1 (or, after a deletion, possibly 0), we can rotate the whole tree to the left to get a balanced tree. This is labelled the "Right Right Case" in the illustration with N:=4. If the root N:=5 of the right subtree has balance factor 1 ("Right Left Case"), we can rotate the subtree to the right to end up in the "Right Right Case".
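
Putting the four cases together, a hypothetical rebalance helper in the same height-storing style (the retracing loops below achieve the same effect while maintaining stored balance factors):

 // Restore the AVL invariant at P after an insertion or deletion has
 // left it with balance factor +2 or -2; returns the new subtree root.
 static Node *rebalance(Node *p) {
     int bf = balance_factor(p);
     if (bf > 1) {                              // left-heavy
         if (balance_factor(p->left) < 0)       // Left Right Case
             p->left = rotate_left(p->left);    // reduce to Left Left Case
         return rotate_right(p);                // Left Left Case
     }
     if (bf < -1) {                             // right-heavy
         if (balance_factor(p->right) > 0)      // Right Left Case
             p->right = rotate_right(p->right); // reduce to Right Right Case
         return rotate_left(p);                 // Right Right Case
     }
     return p;                                  // already balanced
 }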

The whole retracing loop for an insertion looks like this (rotate_left and rotate_right are assumed to update the stored balance factors of the nodes they move):

 // N is the child of P whose height increases by 1.
 do {
   // balance_factor(P) has not yet been updated!
   if (N == left_child(P)) { // the left subtree increases
     if (balance_factor(P) == 1) { // The left column in the picture
       // ==> the temporary balance_factor(P) == 2 ==> rebalancing is required.
       if (balance_factor(N) == -1) { // Left Right Case
          rotate_left(N); // Reduce to Left Left Case
       }
       // Left Left Case
       rotate_right(P);
       break; // Leave the loop
     }
     if (balance_factor(P) == -1) {
       balance_factor(P) = 0; // N’s height increase is absorbed at P.
       break; // Leave the loop
     }
     balance_factor(P) = 1; // Height increases at P
   } else { // N == right_child(P), the child whose height increases by 1.
     if (balance_factor(P) == -1) { // The right column in the picture
       // ==> the temporary balance_factor(P) == -2 ==> rebalancing is required.
       if (balance_factor(N) == 1) { // Right Left Case
          rotate_right(N); // Reduce to Right Right Case
       }
       // Right Right Case
       rotate_left(P);
       break; // Leave the loop
     }
     if (balance_factor(P) == 1) {
       balance_factor(P) = 0; // N’s height increase is absorbed at P.
       break; // Leave the loop
     }
     balance_factor(P) = -1; // Height increases at P
   }
   N = P;
   P = parent(N);
 } while (P != null); // Possibly up to the root

After a rotation, the subtree has the same height as before the insertion, so retracing can stop. In order to restore the balance factors of all nodes, first observe that all nodes requiring correction lie along the path used during the initial insertion. If the above procedure is applied to the nodes along this path, starting from the bottom (i.e. the inserted node), then every node in the tree will again have a balance factor of −1, 0, or 1.

The time required is O(log n) for lookup, plus a maximum of O(log n) retracing levels on the way back to the root, so the operation can be completed in O(log n) time.
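
For comparison, a recursive insertion in the height-storing style sketched above; the recursion unwinds along exactly the retracing path, so the rebalancing matches the loop's effect (an illustrative variant, not the article's parent-pointer formulation):

 #include <stdlib.h> // malloc, free
 
 // Insert key into the subtree rooted at root; returns the new subtree root.
 static Node *insert(Node *root, int key) {
     if (root == NULL) {                        // empty position found: new leaf
         Node *n = malloc(sizeof *n);
         n->key = key;
         n->height = 1;
         n->left = n->right = NULL;
         return n;
     }
     if (key < root->key)
         root->left = insert(root->left, key);
     else if (key > root->key)
         root->right = insert(root->right, key);
     else
         return root;                           // duplicate key: nothing to do
     update_height(root);
     return rebalance(root);                    // a no-op unless the factor is ±2
 }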

Deletion

Let X be the node holding the value we need to delete, let Y be the node we find to take X's place, and let Z be the node actually removed from the tree.

[Figure: Deleting a node with two children from a binary search tree using the in-order predecessor (the rightmost node in the left subtree, labelled 6).]

Steps to consider when deleting a node in an AVL tree are the following (a code sketch follows the list):

  1. If node X is a leaf or has only one child, skip to step 5 with Z:=X.
  2. Otherwise, determine node Y as either the largest node in node X's left subtree (the in-order predecessor of X, which has no right child) or the smallest node in its right subtree (the in-order successor of X, which has no left child).
  3. Exchange all the child and parent links of node X with those of node Y. In this step, the in-order sequence between nodes X and Y is temporarily disturbed, but the tree structure doesn't change.
  4. Let node Z be the node now occupying old node Y's position, i.e. node X after the exchange.
  5. If node Z has a subtree (which, by the AVL balance condition, must then be a single leaf), attach it to Z's parent.
  6. If node Z was the root (its parent is null), update the root.
  7. Delete node Z.
  8. Retrace the path back up the tree (starting with node Z's parent) to the root, adjusting the balance factors as needed.
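
A recursive sketch of deletion in the same height-storing style; instead of exchanging links as in steps 3 and 4, it copies the in-order successor's key into node X, a simpler and common variation:

 // Delete key from the subtree rooted at root; returns the new subtree root.
 static Node *delete_node(Node *root, int key) {
     if (root == NULL)
         return NULL;                           // key not present
     if (key < root->key) {
         root->left = delete_node(root->left, key);
     } else if (key > root->key) {
         root->right = delete_node(root->right, key);
     } else if (root->left == NULL || root->right == NULL) {
         Node *child = root->left != NULL ? root->left : root->right;
         free(root);                            // steps 5-7: splice the node out
         return child;                          // may be NULL; a lone child is a leaf
     } else {
         Node *succ = root->right;              // step 2: in-order successor ...
         while (succ->left != NULL)
             succ = succ->left;                 // ... which has no left child
         root->key = succ->key;                 // stands in for steps 3-4
         root->right = delete_node(root->right, succ->key);
     }
     update_height(root);                       // step 8: retrace and rebalance
     return rebalance(root);
 }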

Since with a single deletion the height of an AVL subtree cannot decrease by more than one, the temporary balance factor of a node will be in the range from −2 to +2.

If the balance factor becomes ±2 then the subtree is unbalanced and needs to be rotated. The various cases of rotations are depicted in section "Insertion".

The whole retracing loop for a deletion looks like this (again, the rotations are assumed to update the stored balance factors of the nodes they move):

 // N is the child of P whose height decreases by 1.
 do {
   // balance_factor(P) has not yet been updated!
   if (N == right_child(P)) { // the right subtree decreases
     if (balance_factor(P) == 1) { // The left column in the picture
       // ==> the temporary balance_factor(P) == 2 ==> rebalancing is required.
       S = left_child(P); // Sibling of N
       B = balance_factor(S);
       if (B == -1) { // Left Right Case
          rotate_left(S); // Reduce to Left Left Case
          S = left_child(P); // The inner rotation has given P a new left child
       }
       // Left Left Case
       rotate_right(P); // S becomes the root of the rotated subtree
       if (B == 0) // (in the picture the small blue (0) at node 4)
         break; // Height does not change: Leave the loop
       // Otherwise the rotated subtree is one level shorter:
       // continue retracing from its new root S.
       N = S;
       P = parent(N);
       continue;
     }
     if (balance_factor(P) == 0) {
       balance_factor(P) = 1; // N’s height decrease is absorbed at P.
       break; // Leave the loop
     }
     balance_factor(P) = 0; // Height decreases at P
   } else { // N == left_child(P), the child whose height decreases by 1.
     if (balance_factor(P) == -1) { // The right column in the picture
       // ==> the temporary balance_factor(P) == -2 ==> rebalancing is required.
       S = right_child(P); // Sibling of N
       B = balance_factor(S);
       if (B == 1) { // Right Left Case
          rotate_right(S); // Reduce to Right Right Case
          S = right_child(P); // The inner rotation has given P a new right child
       }
       // Right Right Case
       rotate_left(P); // S becomes the root of the rotated subtree
       if (B == 0) // (in the picture the small blue (0) at node 4)
         break; // Height does not change: Leave the loop
       // Otherwise the rotated subtree is one level shorter:
       // continue retracing from its new root S.
       N = S;
       P = parent(N);
       continue;
     }
     if (balance_factor(P) == 0) {
       balance_factor(P) = -1; // N’s height decrease is absorbed at P.
       break; // Leave the loop
     }
     balance_factor(P) = 0; // Height decreases at P
   }
   N = P;
   P = parent(N);
 } while (P != null); // Possibly up to the root

The retracing can stop if the balance factor becomes ±1, indicating that the height of that subtree has remained unchanged. This can also result from a rotation in which the higher child tree had a balance factor of 0.

If the balance factor becomes 0, then the height of the subtree has decreased by one and the retracing needs to continue. This can likewise result from a rotation.

The time required is O(log n) for lookup, plus a maximum of O(log n) retracing levels on the way back to the root, so the operation can be completed in O(log n) time.

Comparison to other structures

Both AVL trees and red-black trees are self-balancing binary search trees, and they are very similar mathematically.[6] The operations that balance the trees differ, but in both structures the rebalancing work per update is O(1) on average and O(log n) in the worst case. The real difference between the two is the limiting height. For a tree of size n:

  • An AVL tree's height is strictly less than:[7][8]
    \log_\varphi(\sqrt 5 (n+2)) - 2 = \frac{\log_2(\sqrt 5 (n+2))}{\log_2(\varphi)} - 2 = \log_\varphi(2) \cdot \log_2(\sqrt 5 (n+2)) - 2 \approx 1.44\log_2(n+2) - 0.328
    where \varphi is the golden ratio.
  • A red-black tree's height is at most 2\log_2(n+1).[9]
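
For example, with n = 1,000,000 nodes the AVL bound gives a height strictly below 1.44 \log_2(1000002) - 0.328 \approx 28.4, i.e. at most 28, whereas the red-black bound permits heights up to 2 \log_2(1000001) \approx 39.9, i.e. up to 39.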

AVL trees are more rigidly balanced than red-black trees, leading to slower insertion and removal but faster retrieval.

See also

References

  1. [citation lost in source]
  2. Robert Sedgewick, Algorithms, Addison-Wesley, 1983, ISBN 0-201-06672-6, page 199, chapter 15: Balanced Trees.
  3. Georgy Adelson-Velsky, Evgenii Landis (1962). "An algorithm for the organization of information". Doklady Akademii Nauk SSSR 146: 263–266. English translation by Myron J. Ricci in Soviet Math. Doklady, 3:1259–1263, 1962.
  4. [citation lost in source]
  5. AVL trees are not weight-balanced? (meaning: AVL trees are not μ-balanced?)
    Here a binary tree is called \mu-balanced, with 0 \le \mu \le \tfrac12, if for every node N the inequality
    \tfrac12-\mu \le \tfrac{|N_l|}{|N|+1} \le \tfrac12+\mu
    holds and \mu is minimal with this property, where |N| is the number of nodes in the subtree rooted at N (including N) and N_l is the left child of N.
  6. In fact, every AVL tree can be colored red-black.
  7. [citation lost in source]
  8. [citation lost in source]
  9. Proof of asymptotic bounds

Further reading

External links