
Advanced Algorithm Design and Analysis

Jiaheng Lu
Renmin University of China
www.jiahenglu.net
Directed Acyclic Graphs
- A directed acyclic graph, or DAG, is a directed graph with no directed cycles.
DFS and DAGs
- Claim: a directed graph G is acyclic iff a DFS of G yields no back edges.
- Forward: if G is acyclic, there will be no back edges.
  - Trivial: a back edge implies a cycle.
- Backward: if there are no back edges, G is acyclic.
  - Argue the contrapositive: G has a cycle ⇒ there is a back edge.
  - Let v be the first vertex on the cycle to be discovered, and let u be the predecessor of v on the cycle.
  - When v is discovered, the whole cycle is white.
  - DFS must visit everything reachable from v before returning from DFS-Visit(v), so it reaches u while v is still yellow (on the DFS stack).
  - Thus, when the edge (u, v) is explored, v is yellow, so (u, v) is a back edge.
Topological Sort
- Topological sort of a DAG: a linear ordering of all vertices in graph G such that vertex u comes before vertex v if edge (u, v) ∈ G.
- Real-world example: getting dressed
Getting Dressed
- Example DAG: the garments Underwear, Socks, Watch, Pants, Shoes, Shirt, Belt, Tie, Jacket, with an edge (u, v) whenever garment u must be put on before garment v (dependency diagram omitted).
- One valid topological order: Socks, Underwear, Pants, Shoes, Watch, Shirt, Belt, Tie, Jacket.
Topological Sort Algorithm
Topological-Sort()
{
    Run DFS
    When a vertex is finished, output it
    Vertices are output in reverse topological order
}
- Time: O(V + E)
- Correctness: want to prove that (u, v) ∈ G ⇒ f[u] > f[v]
Correctness of Topological Sort
- Claim: (u, v) ∈ G ⇒ f[u] > f[v]
  - When (u, v) is explored, u is yellow.
  - v yellow ⇒ (u, v) is a back edge. Contradiction (why? a DAG has no back edges).
  - v white ⇒ v becomes a descendant of u ⇒ f[v] < f[u] (since v must finish before we backtrack and finish u).
  - v black ⇒ v is already finished ⇒ f[v] < f[u].
Minimum Spanning Tree
- Problem: given a connected, undirected, weighted graph, find a spanning tree using edges that minimize the total weight.
  (Example graph with edge weights 6, 4, 5, 9, 14, 2, 10, 15, 3, 8; figure omitted.)
Minimum Spanning Tree
- Which edges form the minimum spanning tree (MST) of the example graph on vertices A-H (edge weights 6, 4, 5, 9, 14, 2, 10, 15, 3, 8)?
- Answer: shown in the original figure (omitted here).
Minimum Spanning Tree
- MSTs satisfy the optimal substructure property: an optimal tree is composed of optimal subtrees.
  - Let T be an MST of G with an edge (u,v) in the middle.
  - Removing (u,v) partitions T into two trees T1 and T2.
  - Claim: T1 is an MST of G1 = (V1,E1), and T2 is an MST of G2 = (V2,E2). (Do V1 and V2 share vertices? Why?)
  - Proof: w(T) = w(u,v) + w(T1) + w(T2). (There can't be a better tree than T1 or T2, or T would be suboptimal.)
Minimum Spanning Tree
- Theorem (cut property):
  - Let T be an MST of G, and let A ⊆ T be a subtree of T.
  - Let (u,v) be a minimum-weight edge connecting A to V - A.
  - Then (u,v) ∈ T.
Prim's Algorithm
MST-Prim(G, w, r)
    Q = V[G];
    for each u ∈ Q
        key[u] = ∞;
    key[r] = 0;
    p[r] = NULL;
    while (Q not empty)
        u = ExtractMin(Q);
        for each v ∈ Adj[u]
            if (v ∈ Q and w(u,v) < key[v])
                p[v] = u;
                key[v] = w(u,v);
Prim's Algorithm (worked example)
- The following slides step MST-Prim through the example graph: pick a start vertex r; red vertices mark vertices already removed from Q; red arrows indicate parent pointers p[v]; the number beside each vertex is its current key[v], initially ∞ (0 for the root).
- At each iteration, the vertex u with minimum key is extracted, and for every neighbour v still in Q with w(u,v) < key[v], key[v] is lowered to w(u,v) and p[v] is set to u. (Step-by-step figures omitted.)
Review: Prim's Algorithm
MST-Prim(G, w, r)
    Q = V[G];
    for each u ∈ Q
        key[u] = ∞;
    key[r] = 0;
    p[r] = NULL;
    while (Q not empty)
        u = ExtractMin(Q);
        for each v ∈ Adj[u]
            if (v ∈ Q and w(u,v) < key[v])
                p[v] = u;
                DecreaseKey(v, w(u,v));
- What is the hidden cost in this code? Lowering key[v] is really a DecreaseKey operation on the priority queue Q, shown explicitly above.
- How often is ExtractMin() called? Once per vertex: O(V) times.
- How often is DecreaseKey() called? At most once per edge: O(E) times.
- What will be the running time? It depends on the queue:
  - binary heap: O(E lg V)
  - Fibonacci heap: O(V lg V + E)
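A minimal Java sketch of MST-Prim (an illustration, not the slides' code): java.util.PriorityQueue has no DecreaseKey, so this version simply re-inserts a vertex whenever its key improves and skips stale queue entries on extraction; with a binary heap the running time is still O(E lg V).

import java.util.*;

public class PrimMST {
    // Undirected weighted graph as adjacency lists of {neighbor, weight} pairs.
    static int[] prim(List<List<int[]>> adj, int r) {
        int n = adj.size();
        int[] key = new int[n], p = new int[n];
        boolean[] inTree = new boolean[n];
        Arrays.fill(key, Integer.MAX_VALUE);   // key[u] = infinity
        Arrays.fill(p, -1);                    // p[u] = NULL
        key[r] = 0;

        // Queue entries are {vertex, key at insertion time}.
        PriorityQueue<int[]> q = new PriorityQueue<>((a, b) -> Integer.compare(a[1], b[1]));
        q.add(new int[]{r, 0});
        while (!q.isEmpty()) {
            int u = q.poll()[0];               // ExtractMin
            if (inTree[u]) continue;           // stale entry: u was extracted before
            inTree[u] = true;
            for (int[] e : adj.get(u)) {
                int v = e[0], w = e[1];
                if (!inTree[v] && w < key[v]) {    // "DecreaseKey" by re-insertion
                    key[v] = w;
                    p[v] = u;
                    q.add(new int[]{v, w});
                }
            }
        }
        return p;    // the MST edges are (v, p[v]) for every v != r
    }
}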
Single-Source Shortest Path
- Problem: given a weighted, directed graph G, find the minimum-weight path from a given source vertex s to another vertex v.
  - "Shortest path" = minimum weight; the weight of a path is the sum of its edge weights.
  - E.g., on a road map: what is the shortest path from Chapel Hill to Charlottesville?
Binary Heap
A special kind of binary tree. It has two properties that are not generally true for other trees:

Completeness
The tree is complete, which means that nodes are added from top to bottom, left to right, without leaving any gaps. A binary tree is completely full if it has height h and 2^(h+1) - 1 nodes.

Heapness
The item in the tree with the highest priority is at the top of the tree, and the same is true for every subtree.
Binary Heap
A binary tree of height h is complete iff:
- it is empty, or
- its left subtree is complete of height h-1 and its right subtree is completely full of height h-2, or
- its left subtree is completely full of height h-1 and its right subtree is complete of height h-1.
Binary Heap
In simple terms:
- A heap is a binary tree in which every parent is greater than its child(ren).
- A reverse heap is one in which the rule is "every parent is less than its child(ren)".
Binary Heap
To build a heap, data is placed in the tree as it arrives. Algorithm:
1. Add a new node at the next leaf position.
2. Use this new node as the current position.
3. While the new data is greater than the data in the parent of the current node:
   - move the parent's data down to the current node
   - make the parent (now vacant) the current node
4. Place the new data in the current node.
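A small array-based Java sketch of this sift-up insertion for a max-heap (1-indexed array); the class layout is my own illustration, not code from the slides.

public class MaxHeap {
    private int[] a = new int[16];   // a[1..n] holds the heap; a[0] is unused
    private int n = 0;

    // Add the new value at the next leaf position, then move smaller
    // parents down until the parent is no longer smaller.
    public void insert(int x) {
        if (++n == a.length) a = java.util.Arrays.copyOf(a, 2 * a.length);
        int i = n;                        // current position = the new leaf
        while (i > 1 && a[i / 2] < x) {   // new data greater than the parent's data
            a[i] = a[i / 2];              // move the parent down to the current node
            i /= 2;                       // the (now vacant) parent becomes current
        }
        a[i] = x;                         // place the data in the current node
    }
}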
Binary Heap
- New nodes are always added on the deepest level in an orderly way, so the tree never becomes unbalanced.
- There is no particular relationship among the data items in the nodes on any given level, even the ones that have the same parent.
- A heap is not a sorted structure; it can be regarded as partially ordered.
- A given set of data can be formed into many different heaps (depending on the order in which the data arrives).
Binary Heap
Example: data arrives to be heaped in the order
54, 87, 27, 67, 19, 31, 29, 18, 32, 56, 7, 12, 31
Binary Heap
Building the heap from 54, 87, 27, 67, 19, 31, 29, 18, 32, 56, 7, 12, 31: each new value is added at the next leaf position and sifted up past smaller ancestors, etc. (step-by-step tree figures omitted).
Binary Heap
To delete an element from the heap. Algorithm:
- If the node is the last "logical node" in the tree, simply delete it.
- Else:
  - replace the node with the last "logical node" in the tree
  - delete the last logical node from the tree
  - re-heapify
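A matching Java sketch of deleting the root from the same kind of array-based max-heap: the last logical node replaces the root and is then sifted down to re-heapify. The constructor that wraps an existing heap array is my own assumption, used only to keep the example self-contained.

public class MaxHeapDelete {
    private int[] a;   // a[1..n] holds a max-heap, same layout as the insertion sketch
    private int n;

    public MaxHeapDelete(int[] heap, int size) { a = heap; n = size; }

    // Remove and return the root (highest-priority element).
    public int deleteMax() {
        int root = a[1];
        int last = a[n--];             // remove the last logical node from the tree
        int i = 1;                     // it will replace the deleted root
        while (2 * i <= n) {           // re-heapify: sift 'last' down past larger children
            int child = 2 * i;
            if (child + 1 <= n && a[child + 1] > a[child]) child++;   // pick the larger child
            if (last >= a[child]) break;
            a[i] = a[child];
            i = child;
        }
        a[i] = last;
        return root;
    }
}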
Binary Heap
Example: deleting the root node, T (step-by-step figures omitted).
- Min-Max Heaps
- Deaps
- Binomial Heaps
- Fibonacci Heaps
MIN-MAX Heaps (1/10)
- Definition
  - A double-ended priority queue is a data structure that supports the following operations:
    - insert an element with arbitrary key
    - delete an element with the largest key
    - delete an element with the smallest key
  - Min heap or max heap: only insertion and one of the two deletion operations are supported.
  - Min-max heap: supports all of the operations just described.
- Definition:
  - A min-max heap is a complete binary tree such that, if it is not empty, each element has a field called key.
  - Alternating levels of this tree are min levels and max levels, respectively.
  - Let x be any node in a min-max heap. If x is on a min (max) level, then the element in x has the minimum (maximum) key among all elements in the subtree with root x. We call this node a min (max) node.
- Insertion into a min-max heap (new node lands on a "max" level)
  - If the new element is smaller than its parent (a "min" node), then it is automatically smaller than every "max" ancestor above, so it only needs to be checked against its "min" ancestors; if it is greater than its parent, it is automatically greater than every "min" ancestor, so check it only against the "max" ancestors.
  - A similar approach applies when the new node lands on a "min" level.
  - Follow the nodes from the new node i toward the root and insert the element into its proper place.
(Figure: inserting an item with key 80 into the min-max heap; the new node's grandparent is checked as described above. Array picture omitted. The accompanying C declarations:)

#define MAX_SIZE 100
#define FALSE 0
#define TRUE 1
#define SWAP(x,y,t) ((t)=(x), (x)=(y), (y)=(t))

typedef struct {
    int key;
    /* other fields */
} element;

- min_max_insert: insert item into the min-max heap; complexity O(log n).
  (Worked example with item.key = 80; figure of the min and max level contents omitted.)
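A compact Java sketch of the insertion rule just described: compare the new element with its parent once, then bubble it up along grandparents on min levels or max levels. The array layout and method names are my own and differ from the textbook's min_max_insert.

public class MinMaxHeap {
    private int[] h = new int[100];   // h[1..n]; the level of node i is floor(log2 i)
    private int n = 0;

    private static boolean onMinLevel(int i) {
        return (31 - Integer.numberOfLeadingZeros(i)) % 2 == 0;   // even levels are min levels
    }
    private void swap(int i, int j) { int t = h[i]; h[i] = h[j]; h[j] = t; }

    public void insert(int key) {
        h[++n] = key;
        int i = n, parent = i / 2;
        if (parent == 0) return;                 // the new node is the root
        if (onMinLevel(i)) {
            if (h[i] > h[parent]) {              // bigger than its max-level parent
                swap(i, parent);
                bubbleUpMax(parent);             // only the max ancestors can be violated
            } else bubbleUpMin(i);               // only the min ancestors can be violated
        } else {
            if (h[i] < h[parent]) {              // smaller than its min-level parent
                swap(i, parent);
                bubbleUpMin(parent);
            } else bubbleUpMax(i);
        }
    }
    // Grandparents are the nearest ancestors on the same kind of level.
    private void bubbleUpMin(int i) {
        while (i / 4 >= 1 && h[i] < h[i / 4]) { swap(i, i / 4); i /= 4; }
    }
    private void bubbleUpMax(int i) {
        while (i / 4 >= 1 && h[i] > h[i / 4]) { swap(i, i / 4); i /= 4; }
    }
}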
- Deletion of the min element
  - If we wish to delete the element with the smallest key, then this element is in the root.
  - In the general situation, we reinsert an element item into a min-max heap whose root is empty.
  - We consider two cases:
    1. The root has no children: item is inserted into the root.
    2. The root has at least one child: the smallest key in the min-max heap is in one of the children or grandchildren of the root. We determine the node k with the smallest key.
  - The following possibilities need to be considered:
    a) item.key ≤ heap[k].key
       - No element in the heap has a key smaller than item.key, so item may be inserted into the root.
    b) item.key > heap[k].key, and k is a child of the root
       - Since k is a max node, it has no descendants with a key larger than heap[k].key, and hence none with a key larger than item.key.
       - heap[k] may be moved to the root and item inserted into node k.
    c) item.key > heap[k].key, and k is a grandchild of the root
       - heap[k] may be moved to the root; node k is now regarded as empty.
       - Let parent be the parent of k. If item.key > heap[parent].key, then interchange them. This ensures that the max node parent contains the largest key in the sub-heap rooted at parent.
       - At this point we face the problem of inserting item into the sub-heap with root k, so we repeat the above process.
- delete_min: delete the minimum element from the min-max heap; complexity O(log n).
  (Worked example on the 12-element heap from the previous figure: the root is removed and the last element is reinserted following the cases above; step-by-step figures omitted.)
- Deletion of the max element
  1. Determine which children of the root are located on the max level, and find the larger one; it holds the largest key in the min-max heap.
  2. Regard that node as the root of a max-min heap.
  3. A similar approach to the one described above then applies, deleting the max element from this max-min heap.
Deaps (1/8)
- Definition
  - The root contains no element.
  - The left subtree is a min-heap.
  - The right subtree is a max-heap.
  - Constraint between the two trees: let i be any node in the left subtree and j the corresponding node in the right subtree; if j does not exist, let j correspond to the parent of i. Then i.key <= j.key.
Deaps (2/8)
- i = min_partner(n) = n - 2^(⌊log₂ n⌋ - 1)
- j = max_partner(n) = n + 2^(⌊log₂ n⌋ - 1)
  - if j > heapsize, then j /= 2
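Hedged Java helpers for the partner formulas above. The names minPartner, maxPartner and inMaxHeap mirror the insert code on the next slide, but these exact signatures (for example, passing the current size n explicitly) are my assumption.

public class DeapPartners {
    // Elements live in deap[2..n]; position 1, the deap root, holds no element.
    static int level(int i) { return 31 - Integer.numberOfLeadingZeros(i); }   // floor(log2 i)

    static boolean inMaxHeap(int i) {
        // On each level, the max heap occupies the right half of the positions.
        return i >= 3 * (1 << (level(i) - 1));
    }
    static int minPartner(int i) {            // corresponding node in the min heap
        return i - (1 << (level(i) - 1));
    }
    static int maxPartner(int i, int n) {     // corresponding node in the max heap
        int j = i + (1 << (level(i) - 1));
        return (j > n) ? j / 2 : j;           // if it does not exist, use its parent
    }
}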
Deaps Insert (3/8)
public void insert(int x) {
    int i;
    if (++n == 2) { deap[2] = x; return; }
    if (inMaxHeap(n)) {
        i = minPartner(n);
        if (x < deap[i]) {
            deap[n] = deap[i];
            minInsert(i, x);
        } else maxInsert(n, x);
    } else {
        i = maxPartner(n);
        if (x > deap[i]) {
            deap[n] = deap[i];
            maxInsert(i, x);
        } else minInsert(n, x);
    }
}
Insert and delete algorithm:
http://www.csie.ntnu.edu.tw/~swanky/ds/chap9.htm
Deaps (4/8)-(6/8)
- Insertion into a deap (step-by-step figures omitted).
Deaps delete min (7/8)
public int deleteMin() {
    int i, j, key = deap[2], x = deap[n--];
    // move smaller child to i
    for (i = 2; 2*i <= n; deap[i] = deap[j], i = j) {
        j = i * 2;
        if (j+1 <= n && deap[j] > deap[j+1]) j++;
    }
    // try to put x at leaf i
    j = maxPartner(i);
    if (x > deap[j]) {
        deap[i] = deap[j];
        maxInsert(j, x);
    } else {
        minInsert(i, x);
    }
    return key;
}
Deaps (8/8) (figure omitted)
Binomial Heaps (1/10)
- Cost amortization (paying off a cost in installments)
  - The actual cost of a delete in a binomial heap can be O(n), but insert and combine are O(1).
  - Cost amortization charges some of the cost of a heavy operation to lightweight operations.
  - The amortized cost of a binomial heap delete is O(log n).
  - A tighter bound can be achieved for a sequence of operations: the actual cost of any sequence of i inserts, c combines, and dm deletes in a binomial heap is O(i + c + dm log i).
Binomial Heaps (2/10)
- Definition of a binomial heap
  - Node fields: degree, child, left_link, right_link, data, parent
  - The roots are kept in a doubly linked circular list.
  - a points to the smallest root.
Binomial Heaps (3/10): example structure (figure omitted)
Binomial Heaps (4/10)
- Insertion into a binomial heap
  - Make a new node and splice it into the doubly linked circular list pointed at by a.
  - Set a to the root with the smallest key.
- Combining two B-heaps a and b
  - Combine the two doubly linked circular lists into one.
  - Set a to the root with the smallest key.
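A hedged sketch of the O(1) combine step: splice the two circular doubly linked root lists and return the smaller of the two roots. The Node class below keeps only a key and the two sibling links (named after the left_link/right_link fields listed earlier); everything else is simplified for illustration.

public class BinomialHeapCombine {
    static class Node {
        int key;
        Node leftLink, rightLink;                  // circular doubly linked list of roots
        Node(int key) { this.key = key; leftLink = rightLink = this; }
    }

    // Combine two B-heaps given by pointers a and b to their smallest roots.
    static Node combine(Node a, Node b) {
        if (a == null) return b;
        if (b == null) return a;
        Node aNext = a.rightLink, bPrev = b.leftLink;
        a.rightLink = b;          b.leftLink = a;          // splice the two rings together
        bPrev.rightLink = aNext;  aNext.leftLink = bPrev;
        return (a.key <= b.key) ? a : b;                   // keep a pointing at the smallest root
    }
}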
Binomial Heaps (5/10)
- Deletion of the min element
Binomial Heaps (6/10)-(9/10): deletion example, step by step (figures omitted)
Binomial Heaps (10/10)
- The trees in a B-heap are binomial trees:
  - B0 has exactly one node.
  - Bk, k > 0, consists of a root with degree k whose subtrees are B0, B1, …, B(k-1).
  - Bk has exactly 2^k nodes.
- The actual cost of a delete is O(log n + s), where s is the number of min-trees in a (the original roots minus 1) and y (the children of the removed node).
Fibonacci Heaps (1/8)
- Definition
  - delete: delete the element in a specified node
  - decrease key
  - These two operations are followed by a cascading cut.
Fibonacci Heaps (2/8)
- Deletion from an F-heap
  - Two cases: the deleted node is the min node, or it is not.
Fibonacci Heaps (3/8)
- Decrease key
  - If the node is not the min and the decreased key is smaller than its parent's key, then the node is cut from its parent, as in delete (followed by a cascading cut).
Fibonacci Heaps (4/8)
- To prevent the amortized cost of delete-min from becoming O(n), each node may lose at most one child by cutting.
- If two children of x were cut, then x itself must be cut and moved to the ring of roots.
- A flag (true or false) on each node x indicates whether one of x's children has been cut.
Fibonacci Heaps(5/8)
Cascading Cut
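The cascading-cut slide is a figure in the original; below is a hedged Java sketch of the procedure it illustrates, using a hypothetical Node with parent, mark flag and child list, and a plain list standing in for the ring of roots.

import java.util.*;

public class CascadingCutSketch {
    static class Node {
        int key;
        Node parent;
        boolean mark;                              // has this node already lost a child?
        List<Node> children = new ArrayList<>();
        Node(int key) { this.key = key; }
    }

    List<Node> roots = new ArrayList<>();          // simplified ring of roots

    // Move x from under its parent to the ring of roots and clear its flag.
    void cut(Node x, Node parent) {
        parent.children.remove(x);
        x.parent = null;
        x.mark = false;
        roots.add(x);
    }

    // Called on the parent y after one of y's children has been cut.
    void cascadingCut(Node y) {
        Node z = y.parent;
        if (z == null) return;                     // roots are never cut further
        if (!y.mark) {
            y.mark = true;                         // y has now lost one child
        } else {
            cut(y, z);                             // y would lose a second child: cut y too
            cascadingCut(z);                       // and continue toward the root
        }
    }
}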
Fibonacci Heaps (6/8)
- Lemma
  - The i-th child of any node x in an F-heap has degree at least i - 2, except that when i = 1 the degree is 0.
- Corollary
  - Let S_k be the minimum possible number of descendants of a node of degree k (counting the node itself); then S_0 = 1 and S_1 = 2. From the lemma above we get
    S_k ≥ 2 + Σ_{i=0}^{k-2} S_i
    (the 2 comes from the 1st child and the root).
Fibonacci Heaps (7/8)
F_{k+2} = 2 + Σ_{i=2}^{k} F_i
S_k ≥ F_{k+2}
- That is why the data structure is called a Fibonacci heap.
Fibonacci Heaps (8/8)
- Applications of F-heaps