Unit 1
Fundamentals of Algorithms
Lecture #1
Copyright © 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin “Introduction to the Design & Analysis of Algorithms,” 2nd ed., Ch. 1
What is an algorithm?
An algorithm is a sequence of unambiguous instructions for solving a problem, i.e., for obtaining a required output for any legitimate input in a finite amount of time.
Algorithm
Notion of algorithm and problem
problem → algorithm → algorithmic solution
(different from a conventional solution)
Example of computational problem: sorting
Statement of problem:
• Input: A sequence of n numbers <a1, a2, …, an>
• Output: A reordering of the input sequence <a´1, a´2, …, a´n> so that a´i ≤ a´j whenever i < j
Algorithms:
• Selection sort
• Insertion sort
• Merge sort
• (many others)
Selection Sort
Algorithm:
for i ← 1 to n do
    swap a[i] with the smallest of a[i], …, a[n]
Some well-known computational problems:
• Sorting
• Searching
• Shortest paths in a graph
• Minimum spanning tree
• Primality testing
• Traveling salesman problem
• Knapsack problem
• Chess
• Towers of Hanoi
• Program termination
Basic issues related to algorithms:
• Proving correctness
• Empirical analysis
• Optimality
Algorithm design strategies
Analysis of Algorithms
What is an algorithm?
Why study algorithms?
Theoretical importance
Practical importance
Euclid’s Algorithm
Two descriptions of Euclid’s algorithm
Step-by-step description:
Step 1  If n = 0, return the value of m as the answer and stop; otherwise, proceed to Step 2.
Step 2  Divide m by n and assign the value of the remainder to r.
Step 3  Assign the value of n to m and the value of r to n. Go to Step 1.
Pseudocode description:
while n ≠ 0 do
    r ← m mod n
    m ← n
    n ← r
return m
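As an illustration (not part of the original slides), the pseudocode above translates directly into Python; the function name `gcd` is just a label:

```python
def gcd(m, n):
    # Euclid's algorithm: repeatedly replace (m, n) with (n, m mod n)
    # until n becomes 0; m is then the greatest common divisor.
    while n != 0:
        m, n = n, m % n
    return m
```

For example, gcd(60, 24) returns 12.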
Other methods for computing gcd(m,n)
Middle-school procedure
Step 1 Find the prime factorization of m
Step 2 Find the prime factorization of n
Step 3 Find all the common prime factors
Step 4 Compute the product of all the common prime factors
and return it as gcd(m,n)
Is this an algorithm? (Not as stated: the steps are not specified unambiguously; finding the prime factorizations itself requires an algorithm, such as the sieve of Eratosthenes for generating the primes needed.)
Example (sieve of Eratosthenes, n = 20): candidates 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20; the surviving primes are 2 3 5 7 11 13 17 19
Time complexity of the sieve: O(n log log n)
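A hedged Python sketch of the sieve of Eratosthenes used in the example above (names are illustrative):

```python
def sieve(n):
    # Sieve of Eratosthenes: mark every multiple of each prime as
    # composite; the indices still marked True at the end are prime.
    prime = [False, False] + [True] * (n - 1)   # indices 0..n
    for p in range(2, int(n ** 0.5) + 1):
        if prime[p]:
            for multiple in range(p * p, n + 1, p):
                prime[multiple] = False
    return [p for p in range(2, n + 1) if prime[p]]
```

sieve(20) yields the primes listed in the example.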
Fundamentals of Algorithmic Problem Solving
Two main issues related to algorithms
Algorithm design techniques/strategies
Analysis of algorithms
Important problem types
sorting
searching
string processing
graph problems
combinatorial problems
geometric problems
numerical problems
Sorting (I)
Sorting (II)
Selection Sort
Algorithm SelectionSort(A[0..n-1])
//The algorithm sorts a given array by selection sort
//Input: An array A[0..n-1] of orderable elements
//Output: Array A[0..n-1] sorted in ascending order
for i ← 0 to n − 2 do
    min ← i
    for j ← i + 1 to n − 1 do
        if A[j] < A[min]
            min ← j
    swap A[i] and A[min]
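The pseudocode above can be rendered in Python one-for-one (the function name is illustrative):

```python
def selection_sort(a):
    # In-place selection sort: for each position i, find the index of
    # the smallest element in a[i..n-1] and swap it into position i.
    n = len(a)
    for i in range(n - 1):
        smallest = i
        for j in range(i + 1, n):
            if a[j] < a[smallest]:
                smallest = j
        a[i], a[smallest] = a[smallest], a[i]
    return a
```

For example, selection_sort([7, 3, 2, 5]) returns [2, 3, 5, 7].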
Searching
• Find a given value (the search key) in a given set.
• In a sorted array this can be done by binary search. Time: O(log n)
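The O(log n) bound corresponds to binary search on a sorted array; a minimal sketch, assuming 0-indexed lists (the function name is illustrative):

```python
def binary_search(a, key):
    # Binary search in a sorted list: halve the candidate range on
    # every comparison, so at most O(log n) comparisons are made.
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == key:
            return mid
        if a[mid] < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # key not present
```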
String Processing
Examples:
(i) searching for a word or phrase on the WWW or in a Word document
(ii) searching for a short read in the reference genomic sequence
CSE408
Fundamentals of Data Structure
Lecture #2
Fundamental data structures
• array
• string
• linked list
• stack
• queue
• priority queue/heap
• graph
• dictionary
Linear Data Structures
Arrays
• A sequence of n items of the same data type that are stored contiguously in computer memory and made accessible by specifying a value of the array’s index.
• fixed length (needs preliminary reservation of memory)
• contiguous memory locations
• direct access by index; insert/delete requires shifting elements
Linked Lists
• A sequence of zero or more nodes, each containing two kinds of information: some data and one or more links called pointers to other nodes of the list.
• Singly linked list (next pointer); doubly linked list (next + previous pointers)
• dynamic length
• arbitrary memory locations
• access by following links; insert/delete requires only pointer updates
[diagram: a1 → a2 → … → an → null]
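A minimal singly linked list along the lines described above might look like this in Python (class and method names are illustrative):

```python
class Node:
    def __init__(self, data, next=None):
        self.data = data
        self.next = next   # pointer to the next node, or None

class LinkedList:
    # Singly linked list: O(1) insertion at the head,
    # access only by following next pointers from the head.
    def __init__(self):
        self.head = None

    def push_front(self, data):
        self.head = Node(data, self.head)

    def to_list(self):
        items, node = [], self.head
        while node is not None:
            items.append(node.data)
            node = node.next
        return items
```

Pushing 3, 2, 1 in that order yields the list 1 → 2 → 3.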
Stacks and Queues
Stacks
• A stack of plates
– insertion/deletion can be done only at the top.
– LIFO
• Two operations (push and pop)
Queues
• A queue of customers waiting for services
– Insertion/enqueue from the rear and deletion/dequeue from the
front.
– FIFO
• Two operations (enqueue and dequeue)
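In Python, a plain list and a collections.deque give these two disciplines directly; a small sketch:

```python
from collections import deque

stack = []                 # LIFO: push/pop at the top (end of the list)
stack.append('a')
stack.append('b')
top = stack.pop()          # removes the last element pushed

queue = deque()            # FIFO: enqueue at the rear, dequeue from the front
queue.append('a')
queue.append('b')
front = queue.popleft()    # removes the first element enqueued
```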
Priority Queue and Heap
A priority queue supports insertion and finding/deleting the element of highest priority; it is typically implemented with a heap.
Example keys: 9 6 8 5 2 3
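Using Python's heapq module on the example keys above (heapq provides a min-heap; a max-heap would store negated keys):

```python
import heapq

keys = [9, 6, 8, 5, 2, 3]       # the example keys above
heap = list(keys)
heapq.heapify(heap)             # build a min-heap in O(n)
smallest = heapq.heappop(heap)  # delete-min in O(log n)
heapq.heappush(heap, 1)         # insert in O(log n)
```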
Graphs
Formal definition
• A graph G = <V, E> is defined by a pair of two sets: a
finite set V of items called vertices and a set E of vertex
pairs called edges.
Undirected and directed graphs (digraphs).
What’s the maximum number of edges in an undirected graph
with |V| vertices?
Complete, dense, and sparse graphs
• A graph with every pair of its vertices connected by an edge is called complete, denoted K|V|.
• A dense graph is a graph in which the number of edges is close to the maximal number of edges; a sparse graph is one in which the number of edges is close to the minimal number of edges.
[diagram: an example graph on vertices 1, 2, 3, 4]
Graph Representation
Adjacency matrix
• n x n boolean matrix if |V| is n.
• The element on the ith row and jth column is 1 if there’s an edge
from ith vertex to the jth vertex; otherwise 0.
• The adjacency matrix of an undirected graph is symmetric.
Adjacency linked lists
• A collection of linked lists, one for each vertex, that contain all the
vertices adjacent to the list’s vertex.
Which data structure would you use if the graph is a 100-node star
shape?
Example (the digraph 1 → 2, 3, 4; 2 → 4; 3 → 4):
Adjacency matrix      Adjacency lists
0 1 1 1               1 → 2 → 3 → 4
0 0 0 1               2 → 4
0 0 0 1               3 → 4
0 0 0 0               4 → (empty)
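Both representations of the example digraph can be sketched as follows (variable names are illustrative):

```python
# Digraph from the example: 1 -> 2, 3, 4; 2 -> 4; 3 -> 4.
n = 4
edges = [(1, 2), (1, 3), (1, 4), (2, 4), (3, 4)]

# Adjacency matrix: matrix[i][j] = 1 iff there is an edge i -> j
# (row/column 0 is unused so vertices can stay 1-indexed).
matrix = [[0] * (n + 1) for _ in range(n + 1)]
for u, v in edges:
    matrix[u][v] = 1

# Adjacency lists: one list per vertex holding its out-neighbours.
adj = {v: [] for v in range(1, n + 1)}
for u, v in edges:
    adj[u].append(v)
```

For the 100-node star asked about above, adjacency lists are the natural choice: almost every row of the matrix would be zeros.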
Weighted Graphs
Weighted graphs
• Graphs or digraphs with numbers assigned to the edges.
[diagram: a weighted graph on vertices 1, 2, 3, 4 with edge weights 5, 6, 7, 8, 9]
Graph Properties -- Paths and Connectivity
Paths
• A path from vertex u to v of a graph G is defined as a sequence of
adjacent (connected by an edge) vertices that starts with u and ends with
v.
• Simple paths: all vertices of the path (and hence all its edges) are distinct.
• Path lengths: the number of edges, or the number of vertices – 1.
Connected graphs
• A graph is said to be connected if for every pair of its vertices u and v
there is a path from u to v.
Connected component
• A maximal connected subgraph of a given graph.
Graph Properties -- Acyclicity
Cycle
• A simple path of positive length that starts and ends at the same vertex.
Acyclic graph
• A graph without cycles
• DAG (Directed Acyclic Graph)
[diagram: an example digraph on vertices 1, 2, 3, 4]
Trees
Trees
• A tree (or free tree) is a connected acyclic graph.
• Forest: a graph that has no cycles but is not necessarily connected.
Properties of trees
• For every two vertices in a tree there always exists exactly one simple
path from one of these vertices to the other. Why?
– Rooted trees: The above property makes it possible to select an
arbitrary vertex in a free tree and consider it as the root of the so
called rooted tree.
– Levels in a rooted tree.
• |E| = |V| − 1
[diagram: a free tree on vertices 1–5 and the same tree rooted at vertex 3]
Rooted Trees (I)
Ancestors
• For any vertex v in a tree T, all the vertices on the simple path
from the root to that vertex are called ancestors.
Descendants
• All the vertices for which a vertex v is an ancestor are said to be
descendants of v.
Parent, child and siblings
• If (u, v) is the last edge of the simple path from the root to
vertex v, u is said to be the parent of v and v is called a child of
u.
• Vertices that have the same parent are called siblings.
Leaves
• A vertex without children is called a leaf.
Subtree
• A vertex v with all its descendants is called the subtree of T
rooted at v.
Rooted Trees (II)
Depth of a vertex
• The length of the simple path from the root to the vertex.
Height of a tree
• The length of the longest simple path from the root to a leaf.
[diagram: a rooted tree with root 3 and leaves on level 2; its height h = 2]
Ordered Trees
Ordered trees
• An ordered tree is a rooted tree in which all the children of each vertex
are ordered.
Binary trees
• A binary tree is an ordered tree in which every vertex has no more than two children and each child is designated as either a left child or a right child of its parent.
Binary search trees
• Each vertex is assigned a number.
• A number assigned to each parental vertex is larger than all the numbers
in its left subtree and smaller than all the numbers in its right subtree.
⌊log2 n⌋ ≤ h ≤ n – 1, where h is the height of a binary tree and n the size.
[diagram: the keys 2, 3, 5, 6, 8, 9 arranged as an arbitrary binary tree and as a binary search tree with root 6]
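A sketch of binary-search-tree insertion and search in Python, using the example keys; the nested-dict node representation is illustrative, not from the slides:

```python
def bst_insert(root, key):
    # Insert key into a BST: smaller keys go left, larger go right.
    if root is None:
        return {'key': key, 'left': None, 'right': None}
    if key < root['key']:
        root['left'] = bst_insert(root['left'], key)
    else:
        root['right'] = bst_insert(root['right'], key)
    return root

def bst_search(root, key):
    # Follow left/right links, as in binary search; O(h) time.
    while root is not None and root['key'] != key:
        root = root['left'] if key < root['key'] else root['right']
    return root is not None

root = None
for k in [6, 3, 9, 2, 5, 8]:   # the example keys, inserted root-first
    root = bst_insert(root, k)
```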
CSE408
String Matching
Algorithm
Lecture # 5&6
String Matching Problem
Motivations: text-editing, pattern matching in DNA sequences
• Naive Algorithm
– Worst-case running time in O((n-m+1) m)
• Rabin-Karp
– Worst-case running time in O((n-m+1) m)
– Better than this on average and in practice
• Knuth-Morris-Pratt
– Worst-case running time in O(n + m)
Notation & Terminology
Rabin-Karp Algorithm
Rabin-Karp Algorithm (continued)
Example: p = 31415; a window of the text whose hash equals the pattern’s hash but whose characters differ is a spurious hit.
Rabin-Karp Algorithm (continued)
• d is the radix and q is the modulus used for hashing.
• Preprocessing (computing the pattern hash and the first window hash) takes Θ(m).
• Matching loop invariant: when line 10 is executed, ts = T[s+1..s+m] mod q.
• The loop tries all possible shifts; each hit must be verified character by character to rule out spurious hits, giving a Θ((n−m+1)m) worst case.
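A hedged Python sketch of Rabin-Karp along these lines; d and q are the radix and modulus, and the default values and function name are illustrative:

```python
def rabin_karp(text, pattern, d=256, q=101):
    # Rolling-hash matcher: compare window hashes, then verify each
    # hit character by character to rule out spurious hits.
    n, m = len(text), len(pattern)
    if m > n:
        return []
    h = pow(d, m - 1, q)              # d^(m-1) mod q, used when sliding
    p_hash = t_hash = 0
    for i in range(m):                # preprocessing: Theta(m)
        p_hash = (d * p_hash + ord(pattern[i])) % q
        t_hash = (d * t_hash + ord(text[i])) % q
    shifts = []
    for s in range(n - m + 1):        # try all possible shifts
        if p_hash == t_hash and text[s:s + m] == pattern:
            shifts.append(s)
        if s < n - m:                 # roll the hash to the next window
            t_hash = (d * (t_hash - ord(text[s]) * h)
                      + ord(text[s + m])) % q
    return shifts
```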
Compute-Prefix-Function(p)
1   m ← length[p]            // p is the pattern to be matched
2   Π[1] ← 0
3   k ← 0
4   for q ← 2 to m
5       do while k > 0 and p[k+1] ≠ p[q]
6           do k ← Π[k]
7       if p[k+1] = p[q]
8           then k ← k + 1
9       Π[q] ← k
10  return Π
Example: compute Π for the pattern ‘p’ below:
q   1 2 3 4 5 6 7
p   a b a b a c a
Initially: m = length[p] = 7, Π[1] = 0, k = 0
Step 1: q = 2, k = 0; p[1] ≠ p[2], so Π[2] = 0
Step 2: q = 3, k = 0; p[1] = p[3], so k = 1 and Π[3] = 1
Step 3: q = 4, k = 1; p[2] = p[4], so k = 2 and Π[4] = 2
Step 4: q = 5, k = 2; p[3] = p[5], so k = 3 and Π[5] = 3
Step 5: q = 6, k = 3; p[4] ≠ p[6] gives k = Π[3] = 1, then p[2] ≠ p[6] gives k = Π[1] = 0, and p[1] ≠ p[6], so Π[6] = 0
Step 6: q = 7, k = 0; p[1] = p[7], so k = 1 and Π[7] = 1
Result:
q   1 2 3 4 5 6 7
p   a b a b a c a
Π   0 0 1 2 3 0 1
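The same computation in Python, 0-indexed, so pi[q] here corresponds to Π[q+1] in the pseudocode (the function name is illustrative):

```python
def compute_prefix_function(p):
    # pi[q] = length of the longest proper prefix of p[0..q]
    # that is also a suffix of p[0..q].
    m = len(p)
    pi = [0] * m
    k = 0
    for q in range(1, m):
        while k > 0 and p[k] != p[q]:
            k = pi[k - 1]          # fall back to a shorter border
        if p[k] == p[q]:
            k += 1                 # extend the current border
        pi[q] = k
    return pi
```

For the pattern ababaca this returns [0, 0, 1, 2, 3, 0, 1], matching the table above.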
The KMP Matcher, with pattern ‘p’, string ‘S’ and prefix function ‘Π’ as input, finds a match of p in S.
The following pseudocode computes the matching component of the KMP algorithm:
KMP-Matcher(S, p)
1   n ← length[S]
2   m ← length[p]
3   Π ← Compute-Prefix-Function(p)
4   q ← 0                        // number of characters matched
5   for i ← 1 to n               // scan S from left to right
6       do while q > 0 and p[q+1] ≠ S[i]
7           do q ← Π[q]          // next character does not match
8       if p[q+1] = S[i]
9           then q ← q + 1       // next character matches
10      if q = m                 // is all of p matched?
11          then print “Pattern occurs with shift” i – m
12          q ← Π[q]             // look for the next match
Note: KMP finds every occurrence of a ‘p’ in ‘S’. That is why KMP does not terminate in step 12, rather it
searches remainder of ‘S’ for any more occurrences of ‘p’.
Illustration: given a string ‘S’ and pattern ‘p’ as follows:
S = b a c b a b a b a b a c a a b
p = a b a b a c a
Let us execute the KMP algorithm to find whether ‘p’ occurs in ‘S’.
For ‘p’ the prefix function Π was computed previously and is as follows:
q   1 2 3 4 5 6 7
p   a b a b a c a
Π   0 0 1 2 3 0 1
Initially: n = size of S = 15, m = size of p = 7
Step 1: i = 1, q = 0. Comparing p[1] with S[1]: no match; ‘p’ is shifted one position to the right.
Step 2: i = 2, q = 0. Comparing p[1] with S[2]: match; p is not shifted.
Step 3: i = 3, q = 1. Comparing p[2] with S[3]: no match. Backtracking on p (q = Π[1] = 0), comparing p[1] with S[3]: no match.
Step 4: i = 4, q = 0. Comparing p[1] with S[4]: no match.
Step 5: i = 5, q = 0. Comparing p[1] with S[5]: match.
Step 6: i = 6, q = 1. Comparing p[2] with S[6]: match.
Step 7: i = 7, q = 2. Comparing p[3] with S[7]: match.
Step 8: i = 8, q = 3. Comparing p[4] with S[8]: match.
Step 9: i = 9, q = 4. Comparing p[5] with S[9]: match.
Step 10: i = 10, q = 5. Comparing p[6] with S[10]: no match. Backtracking on p, comparing p[4] with S[10] because after the mismatch q = Π[5] = 3: match.
Step 11: i = 11, q = 4. Comparing p[5] with S[11]: match.
Step 12: i = 12, q = 5. Comparing p[6] with S[12]: match.
Step 13: i = 13, q = 6. Comparing p[7] with S[13]: match; q = m = 7, so all of p is matched.
Pattern ‘p’ has been found to occur completely in string ‘S’, at shift i – m = 13 – 7 = 6.
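A self-contained, 0-indexed Python sketch of the KMP matcher; it returns every shift at which p occurs in S, matching the trace above (the function name is illustrative):

```python
def kmp_match(S, p):
    # Build the prefix function of p, then scan S left to right.
    m = len(p)
    pi = [0] * m                      # prefix function of p
    k = 0
    for q in range(1, m):
        while k > 0 and p[k] != p[q]:
            k = pi[k - 1]
        if p[k] == p[q]:
            k += 1
        pi[q] = k
    shifts = []
    q = 0                             # number of characters matched
    for i, c in enumerate(S):         # scan S left to right
        while q > 0 and p[q] != c:
            q = pi[q - 1]             # next character does not match
        if p[q] == c:
            q += 1                    # next character matches
        if q == m:                    # all of p matched
            shifts.append(i - m + 1)
            q = pi[q - 1]             # look for the next match
    return shifts
```

On the example above, kmp_match("bacbabababacaab", "ababaca") returns [6].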
Running-time analysis
• Compute-Prefix-Function: the for loop from step 4 to step 10 runs m times, and steps 1 to 3 take constant time. Hence the running time of the prefix-function computation is Θ(m).
• KMP-Matcher: the for loop beginning in step 5 runs n times, i.e., as long as the length of the string ‘S’. Since steps 1 to 4 take constant time, the running time is dominated by this loop; thus the running time of the matching function is Θ(n).
Knuth-Morris-Pratt Algorithm: summary
• Preprocessing takes Θ(m) and matching takes Θ(n), both shown using amortized analysis, for a total of Θ(m + n).
• q counts the number of characters matched while the text is scanned left to right.
• When the next character does not match, q falls back via the prefix function; when it matches, q is incremented; when all of P is matched, the occurrence is reported and the scan continues to look for the next match.
Lecture # 7&8
Brute Force
Examples:
1. Computing a^n (a > 0, n a nonnegative integer)
2. Computing n!
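These two computations translate into brute-force Python one-for-one (function names are illustrative):

```python
def power(a, n):
    # Brute force: multiply a by itself n times.
    result = 1
    for _ in range(n):
        result *= a
    return result

def factorial(n):
    # Brute force: n! = 1 * 2 * ... * n (empty product for n = 0).
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result
```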
Example: sorting the list 7 3 2 5
Analysis of Selection Sort
Stability: yes
Brute-Force String Matching
• pattern: a string of m characters to search for
• text: a (longer) string of n characters to search in
• problem: find a substring in the text that matches the pattern
Brute-force algorithm
Step 1 Align pattern at beginning of text
Step 2 Moving from left to right, compare each character of
pattern to the corresponding character in text until
• all characters are found to match (successful search); or
• a mismatch is detected
Step 3 While pattern is not found and the text is not yet
exhausted, realign pattern one position to the right and
repeat Step 2
Examples of Brute-Force String Matching
1. Pattern: 001011
Text: 10010101101001100101111010
2. Pattern: happy
Text: It is never too late to have a
happy childhood.
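The three steps above can be sketched in Python; brute_force_match is an illustrative name, and it returns the first matching shift (0-indexed) or -1:

```python
def brute_force_match(text, pattern):
    # Align the pattern at every shift of the text and compare
    # character by character: worst case O((n - m + 1) * m).
    n, m = len(text), len(pattern)
    for shift in range(n - m + 1):
        j = 0
        while j < m and text[shift + j] == pattern[j]:
            j += 1
        if j == m:                 # all m characters matched
            return shift
    return -1                      # text exhausted without a match
```

On example 1 above, the pattern 001011 is first found at shift 15.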
Pseudocode and Efficiency
Closest-Pair Brute-Force Algorithm
Compute the distance between every pair of distinct points and return the indexes of the points for which the distance is the smallest.
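A minimal sketch of this brute-force approach, assuming points given as (x, y) tuples (the function name is illustrative):

```python
from math import dist, inf

def closest_pair(points):
    # Brute force: compute the distance between every pair of
    # distinct points and return the indexes of the closest pair.
    best, best_pair = inf, None
    n = len(points)
    for i in range(n - 1):
        for j in range(i + 1, n):
            d = dist(points[i], points[j])
            if d < best:
                best, best_pair = d, (i, j)
    return best_pair
```

This examines all n(n−1)/2 pairs, i.e., Θ(n²) distance computations.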
Closest-Pair Brute-Force Algorithm (cont.)
• Weaknesses
– rarely yields efficient algorithms
– some brute-force algorithms are unacceptably slow
– not as constructive as some other design techniques
Exhaustive Search
A brute force solution to a problem involving
search for an element with a special property,
usually among combinatorial objects such as
permutations, combinations, or subsets of a set.
Method:
– generate a list of all potential solutions to the problem in a
systematic manner (see algorithms in Sec. 5.4)
Example (assignment problem): the cost matrix
C =
9 2 7 8
6 4 3 7
5 8 1 8
7 6 9 4
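Exhaustive search for this instance can be sketched in Python: generate all n! permutations of the jobs and keep the cheapest (the function name is illustrative):

```python
from itertools import permutations

# Cost matrix from above: C[i][j] = cost of assigning person i to job j.
C = [[9, 2, 7, 8],
     [6, 4, 3, 7],
     [5, 8, 1, 8],
     [7, 6, 9, 4]]

def min_assignment(C):
    # Exhaustive search: try every permutation of jobs (n! candidates)
    # and keep the one with the smallest total cost.
    n = len(C)
    best_cost, best_perm = float('inf'), None
    for perm in permutations(range(n)):
        cost = sum(C[i][perm[i]] for i in range(n))
        if cost < best_cost:
            best_cost, best_perm = cost, perm
    return best_cost, best_perm
```

For this matrix the cheapest assignment costs 13 (person 1 → job 2, person 2 → job 1, person 3 → job 3, person 4 → job 4, in 1-based terms).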
Lecture #3
Analysis of algorithms
Issues:
• correctness
• time efficiency
• space efficiency
• optimality
Approaches:
• theoretical analysis
• empirical analysis
Theoretical analysis of time efficiency
T(n) ≈ c_op C(n)
where T(n) is the running time, c_op is the execution time of the basic operation, and C(n) is the number of times the basic operation is executed.
Empirical analysis of time efficiency
Worst case
Best case
Average case
Example
• Exact formula, e.g., C(n) = n(n-1)/2
Example:
• How much faster will the algorithm run on a computer that is twice as fast?
• f(n) ∈ O(f(n))
Examples:
• 10n vs. n^2
• n(n+1)/2 vs. n^2
L’Hôpital’s rule and Stirling’s formula
• All polynomials of the same degree k belong to the same class: a_k n^k + a_(k-1) n^(k-1) + … + a_0 ∈ Θ(n^k)
• order log n < order n^α (α > 0) < order a^n < order n! < order n^n
Basic asymptotic efficiency classes
1        constant
log n    logarithmic
n        linear
n log n  n-log-n
n^2      quadratic
n^3      cubic
2^n      exponential
n!       factorial