
DCIT 204
Data Structures and Algorithms 1

Session 7 – Greedy Technique/Algorithms

Course Writer: Dr Kofi Sarpong Adu-Manu
Contact Information: [email protected]

College of Education
School of Continuing and Distance Education
2020/2021 – 2022/2023
Introduction
• Algorithms for optimization problems typically go through a
sequence of steps, with a set of choices at each step.
• A greedy algorithm always makes the choice that looks best at the
moment (it makes a locally optimal choice in the hope that this
choice will lead to a globally optimal solution).
• Greedy algorithms do not always yield optimal solutions, but for
many problems they do.
• The greedy method is quite powerful and works well for a wide
range of problems.
Greedy Technique
Constructs a solution to an optimization problem piece by
piece through a sequence of choices that are:

• feasible

• locally optimal

• irrevocable

For some problems, yields an optimal solution for every instance.


For most, does not but can be useful for fast approximations.
Applications of the Greedy Strategy
• Optimal solutions:
– change making for “normal” coin denominations
– minimum spanning tree (MST)
– single-source shortest paths
– simple scheduling problems
– Huffman codes

• Approximations:
– traveling salesman problem (TSP)
– knapsack problem
– other combinatorial optimization problems
Change-Making Problem
Given unlimited amounts of coins of denominations d1 > … > dm ,
give change for amount n with the least number of coins

Example: d1 = 25c, d2 = 10c, d3 = 5c, d4 = 1c and n = 48c

Greedy solution: 25c + 10c + 10c + 1c + 1c + 1c (6 coins)

The greedy solution is
• optimal for any amount and “normal” set of denominations
• may not be optimal for arbitrary coin denominations
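The greedy rule can be sketched in a few lines of Python (a minimal illustration, not part of the original slides); the second call shows the known counterexample for an "abnormal" denomination set {1, 3, 4}:

```python
def greedy_change(amount, denominations):
    """Repeatedly take the largest coin that does not exceed
    the remaining amount (the greedy choice)."""
    coins = []
    for d in sorted(denominations, reverse=True):
        while amount >= d:
            coins.append(d)
            amount -= d
    return coins

print(greedy_change(48, [25, 10, 5, 1]))  # [25, 10, 10, 1, 1, 1] -- 6 coins
# For arbitrary denominations the greedy answer can be suboptimal:
# with coins {1, 3, 4} and n = 6, greedy gives 4+1+1 but 3+3 is optimal.
print(greedy_change(6, [4, 3, 1]))        # [4, 1, 1]
```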
Minimum Spanning Tree (MST)
• A spanning tree of an undirected connected graph is its connected
acyclic subgraph (i.e., a tree) that contains all the vertices of the
graph.
• If such a graph has weights assigned to its edges, a minimum
spanning tree is its spanning tree of the smallest weight, where the
weight of a tree is defined as the sum of the weights on all its
edges. The minimum spanning tree problem is the problem of
finding a minimum spanning tree for a given weighted connected
graph.

Example: [figure: a weighted connected graph on vertices a, b, c, d and its spanning trees, with T1 being the minimum spanning tree]
Prim’s MST algorithm
• Start with tree T1 consisting of one (any) vertex and “grow” tree
one vertex at a time to produce MST through a series of
expanding subtrees T1, T2, …, Tn

• On each iteration, construct Ti+1 from Ti by adding vertex not in


Ti that is closest to those already in Ti (this is a “greedy” step!)

• Stop when all vertices are included


Algorithm [Prim’s]
Procedure: Prim’s Algorithm
1. Choose any vertex to start; add it to the partially constructed
spanning tree.
2. Find all the edges that connect the tree to new vertices.
3. Among those edges, find the least-weight edge and add it to the
partially constructed spanning tree.
4. If including that edge would create a cycle, reject it and look
for the next least-cost edge.
5. Keep repeating steps 2–4 until all the vertices are included and
the Minimum Spanning Tree (MST) is obtained.
Implementation
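The procedure above can be sketched in Python with a min-heap as the priority queue. The four-vertex graph at the end is a hypothetical example (not the one from the slides):

```python
import heapq

def prim_mst(graph, start):
    """Prim's algorithm on an adjacency-list graph
    {vertex: [(neighbor, weight), ...]}.
    Returns the MST edges and their total weight."""
    visited = {start}
    fringe = [(w, start, v) for v, w in graph[start]]  # candidate edges
    heapq.heapify(fringe)
    mst, total = [], 0
    while fringe and len(visited) < len(graph):
        w, u, v = heapq.heappop(fringe)
        if v in visited:
            continue              # edge would close a cycle; skip it
        visited.add(v)            # greedy step: closest fringe vertex
        mst.append((u, v, w))
        total += w
        for x, wx in graph[v]:
            if x not in visited:
                heapq.heappush(fringe, (wx, v, x))
    return mst, total

# Hypothetical 4-vertex graph (symmetric adjacency lists):
graph = {
    'a': [('b', 3), ('c', 6), ('d', 4)],
    'b': [('a', 3), ('d', 2)],
    'c': [('a', 6), ('d', 1)],
    'd': [('a', 4), ('b', 2), ('c', 1)],
}
print(prim_mst(graph, 'a'))  # ([('a', 'b', 3), ('b', 'd', 2), ('d', 'c', 1)], 6)
```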
Example - TRY

[figure: a weighted graph on vertices a, b, c, d with edge weights 1, 2, 3, 4, 6 – find its MST]
Notes about Prim’s algorithm
• Correctness: a proof by induction shows that this construction actually
yields an MST

• Needs a priority queue for locating the closest fringe vertex

• Efficiency
– O(n²) for the weight-matrix representation of the graph and an array
implementation of the priority queue
– O(m log n) for the adjacency-list representation of a graph with n
vertices and m edges and a min-heap implementation of the priority
queue
Kruskal's Algorithm
• Kruskal's algorithm is a greedy algorithm in graph theory that finds
a minimum spanning tree for a connected weighted graph.
• It finds a subset of the edges that forms a tree that includes every
vertex, where the total weight of all the edges in the tree is
minimized.
• This algorithm is directly based on the MST (minimum spanning
tree) property.
Kruskal's Algorithm
• MST-KRUSKAL(G, w)
1. A ← Ø
2. for each vertex v ∈ V[G]
3.     do MAKE-SET(v)
4. sort the edges of E into nondecreasing order by weight w
5. for each edge (u, v) ∈ E, taken in nondecreasing order by weight
6.     do if FIND-SET(u) ≠ FIND-SET(v)
7.         then A ← A ∪ {(u, v)}
8.             UNION(u, v)
9. return A
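A runnable Python sketch of MST-KRUSKAL, with MAKE-SET/FIND-SET/UNION implemented as a simple union-find with path compression; the sample edge list is a hypothetical graph, not one from the slides:

```python
def kruskal_mst(vertices, weighted_edges):
    """MST-KRUSKAL: scan edges in nondecreasing weight order and keep
    an edge iff its endpoints lie in different components."""
    parent = {v: v for v in vertices}       # MAKE-SET(v) for every vertex

    def find(v):                            # FIND-SET with path compression
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    A = []
    for w, u, v in sorted(weighted_edges):  # nondecreasing order by weight
        ru, rv = find(u), find(v)
        if ru != rv:                        # no cycle is created
            A.append((u, v, w))
            parent[ru] = rv                 # UNION(u, v)
    return A

# Hypothetical example (edges given as (weight, u, v)):
edges = [(3, 'a', 'b'), (6, 'a', 'c'), (4, 'a', 'd'), (2, 'b', 'd'), (1, 'c', 'd')]
print(kruskal_mst('abcd', edges))  # [('c', 'd', 1), ('b', 'd', 2), ('a', 'b', 3)]
```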
Another greedy algorithm for MST: Kruskal’s

• Sort the edges in nondecreasing order of lengths

• “Grow” tree one edge at a time to produce MST through a series of


expanding forests F1, F2, …, Fn-1

• On each iteration, add the next edge on the sorted list unless this
would create a cycle. (If it would, skip the edge.)
In class - Example
Procedure for finding Minimum Spanning Tree
Step 1. Edges are sorted in ascending order by weight.

Edge No.   Vertex Pair   Edge Weight
E1         (0,2)         1
E2         (3,5)         2
E3         (0,1)         3
E4         (1,2)         3
E5         (2,5)         4
E6         (1,2)         5
E7         (2,3)         5
E8         (0,3)         6
E9         (2,2)         6
E10        (2,5)         6
KA-Implementation

[figures: the graph, then the growing forest after adding edges E1, E2, E3, E4, and E5 in turn]

Total cost = 1 + 2 + 3 + 3 + 4 = 13


Example - TRY

[figure: the same weighted graph on vertices a, b, c, d with edge weights 1, 2, 3, 4, 6 – find its MST with Kruskal’s algorithm]
Notes about Kruskal’s algorithm
• Algorithm looks easier than Prim’s but is harder to implement
(checking for cycles!)

• Cycle checking: a cycle is created iff added edge connects vertices


in the same connected component

• Union-find algorithms – see section 9.2


Shortest paths – Dijkstra’s algorithm
Single Source Shortest Paths Problem: Given a weighted
connected graph G, find shortest paths from source vertex s
to each of the other vertices

Dijkstra’s algorithm: Similar to Prim’s MST algorithm, with


a different way of computing numerical labels: Among vertices
not already in the tree, it finds vertex u with the smallest sum
dv + w(v,u)
where
v is a vertex for which shortest path has been already found
on preceding iterations (such vertices form a tree)
dv is the length of the shortest path from source to v
w(v,u) is the length (weight) of edge from v to u
Example

[figure: a weighted graph on vertices a, b, c, d, e; snapshots show the tree after each iteration]

Tree vertices   Remaining vertices
a(-,0)          b(a,3)    c(-,∞)    d(a,7)    e(-,∞)
b(a,3)          c(b,3+4)  d(b,3+2)  e(-,∞)
d(b,5)          c(b,7)    e(d,5+4)
c(b,7)          e(d,9)
e(d,9)
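The labeling scheme above can be sketched in Python with a min-heap. The edge list used at the end is inferred from the trace (a–b 3, a–d 7, b–c 4, b–d 2, c–e 6, d–e 4) and should be treated as an assumption, since the original figure is not reproduced here:

```python
import heapq

def dijkstra(graph, source):
    """Dijkstra's algorithm on an adjacency-list graph
    {vertex: [(neighbor, weight), ...]} with nonnegative weights.
    Returns the shortest distance from source to every reached vertex."""
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, v = heapq.heappop(pq)
        if d > dist.get(v, float('inf')):
            continue                        # stale entry; v already finalized
        for u, w in graph[v]:
            nd = d + w                      # dv + w(v, u)
            if nd < dist.get(u, float('inf')):
                dist[u] = nd
                heapq.heappush(pq, (nd, u))
    return dist

# Edge list inferred from the trace above (an assumption):
graph = {
    'a': [('b', 3), ('d', 7)],
    'b': [('a', 3), ('c', 4), ('d', 2)],
    'c': [('b', 4), ('e', 6)],
    'd': [('a', 7), ('b', 2), ('e', 4)],
    'e': [('c', 6), ('d', 4)],
}
print(dijkstra(graph, 'a'))  # a:0, b:3, c:7, d:5, e:9 -- matching the trace
```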
Notes on Dijkstra’s algorithm
• Doesn’t work for graphs with negative weights

• Applicable to both undirected and directed graphs

• Efficiency
– O(|V|²) for graphs represented by a weight matrix and an array
implementation of the priority queue
– O(|E| log |V|) for graphs represented by adjacency lists and a min-heap
implementation of the priority queue

• Don’t mix up Dijkstra’s algorithm with Prim’s algorithm!


Coding Problem
Coding: assignment of bit strings to alphabet characters

Codewords: bit strings assigned for characters of alphabet

Two types of codes:


• fixed-length encoding (e.g., ASCII)
• variable-length encoding (e.g., Morse code)

Prefix-free codes: no codeword is a prefix of another codeword

Problem: If frequencies of the character occurrences are


known, what is the best binary prefix-free code?
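The prefix-free property is easy to check mechanically (a small helper, not from the slides): in sorted order, a codeword that is a prefix of another immediately precedes a word it prefixes, so checking adjacent pairs suffices.

```python
def is_prefix_free(codewords):
    """True iff no codeword is a prefix of another.
    Sorting puts any prefix right before a word it prefixes,
    so only adjacent pairs need checking."""
    words = sorted(codewords)
    return all(not b.startswith(a) for a, b in zip(words, words[1:]))

print(is_prefix_free(['11', '100', '00', '01', '101']))  # True
print(is_prefix_free(['0', '01', '11']))                 # False: '0' prefixes '01'
```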
Huffman codes
• Any binary tree with edges labeled with 0’s and 1’s yields a prefix-
free code of characters assigned to its leaves
• Optimal binary tree minimizing the expected (weighted average)
length of a codeword can be constructed as follows

Huffman’s algorithm
Initialize n one-node trees with alphabet characters and the tree
weights with their frequencies.
Repeat the following step n-1 times: join two binary trees with smallest
weights into one (as left and right subtrees) and make its weight
equal the sum of the weights of the two trees.
Mark edges leading to left and right subtrees with 0’s and 1’s,
respectively.
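The three steps above can be sketched in Python with a min-heap. The tie-breaking counter is an implementation detail (the heap must never compare two trees of equal weight directly); with this particular insertion order the resulting codewords happen to match the worked example that follows, but other tie-breaks yield different, equally optimal codes.

```python
import heapq
from itertools import count

def huffman_code(freq):
    """Huffman's algorithm: start with one-node trees, repeatedly merge
    the two smallest-weight trees, then label left/right edges 0/1."""
    tick = count()    # tie-breaker so the heap never compares trees
    heap = [(w, next(tick), ch) for ch, w in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)
        w2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, next(tick), (left, right)))
    root = heap[0][2]

    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):         # internal node
            walk(node[0], prefix + '0')     # left edge labeled 0
            walk(node[1], prefix + '1')     # right edge labeled 1
        else:
            codes[node] = prefix            # leaf: a character
    walk(root, '')
    return codes

freq = {'A': 0.35, 'B': 0.1, 'C': 0.2, 'D': 0.2, '_': 0.15}
codes = huffman_code(freq)
print(codes)          # {'C': '00', 'D': '01', 'B': '100', '_': '101', 'A': '11'}
avg = sum(freq[ch] * len(codes[ch]) for ch in freq)
print(round(avg, 2))  # 2.25 bits per character
```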
Example

character    A      B     C     D     _
frequency    0.35   0.1   0.2   0.2   0.15
codeword     11     100   00    01    101

average bits per character: 2.25
for fixed-length encoding: 3
compression ratio: (3 − 2.25)/3 × 100% = 25%

[figures: the stages of Huffman’s algorithm – starting from the one-node trees B (0.1), _ (0.15), C (0.2), D (0.2), A (0.35), the two smallest-weight trees are merged at each step until a single tree of weight 1.0 is obtained; edges to left and right subtrees are labeled 0 and 1]
Reference
Levitin, A. (2012). Introduction to the Design and Analysis of
Algorithms (3rd ed.). Harlow: Addison-Wesley.

Acknowledgement
Pearson Education, Inc., Upper Saddle River, NJ. All Rights Reserved.
