Digital Image Processing
Segmentation-7
Instructor Name
Dr. Muhammad Sharif
Material Reference
Images and material from Rafael C. Gonzalez and Richard E. Woods,
Digital Image Processing, 2nd Edition, and Internet resources.
Contents
Recap
Example of K-Means Clustering
Hierarchical Clustering
K-Means: Example 2
Q: Suppose there are eight points and we need to
make three clusters out of them:
A1(2, 10), A2(2, 5), A3(8, 4), A4(5, 8),
A5(7, 5), A6(6, 4), A7(1, 2), A8(4, 9)
Let us randomly choose three points as the initial centroids:
A1(2, 10), A4(5, 8), A7(1, 2)
The distance function (Manhattan distance) is given as:
ρ(a, b) = |x2 – x1| + |y2 – y1|
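As a minimal Python sketch of this distance function (the name manhattan is ours, not from the slides):

    def manhattan(a, b):
        # rho(a, b) = |x2 - x1| + |y2 - y1|
        return abs(b[0] - a[0]) + abs(b[1] - a[1])

    print(manhattan((2, 10), (5, 8)))  # 5: the distance from A1 to A4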
Example 2: Cont…
We need to calculate the distance of all points to each of the
three centroids and assign each point to nearest centroid to
make a cluster.
Centroids: Mean 1 = (2, 10), Mean 2 = (5, 8), Mean 3 = (1, 2)

Points      Dist. Mean 1   Dist. Mean 2   Dist. Mean 3   Cluster
A1 (2, 10)
A2 (2, 5)
A3 (8, 4)
A4 (5, 8)
A5 (7, 5)
A6 (6, 4)
A7 (1, 2)
A8 (4, 9)
Example 2: Cont…
For A1:
Point: A1 (2, 10)
Mean 1 = (2, 10), Mean 2 = (5, 8), Mean 3 = (1, 2)
Using the distance function ρ(a, b) = |x2 – x1| + |y2 – y1|:
A1–Mean 1: |2 – 2| + |10 – 10| = 0
A1–Mean 2: |5 – 2| + |8 – 10| = 5
A1–Mean 3: |1 – 2| + |2 – 10| = 9
Example 2: Cont…
For A2:
Point: A2 (2, 5)
Mean 1 = (2, 10), Mean 2 = (5, 8), Mean 3 = (1, 2)
Using the distance function ρ(a, b) = |x2 – x1| + |y2 – y1|:
A2–Mean 1: |2 – 2| + |10 – 5| = 5
A2–Mean 2: |5 – 2| + |8 – 5| = 6
A2–Mean 3: |1 – 2| + |2 – 5| = 4
Example 2: Cont…
Similarly, find the distances of A3, A4, A5, A6, A7 and A8 to all
three centroids and assign each point to its nearest centroid. The
table becomes:

Centroids: Mean 1 = (2, 10), Mean 2 = (5, 8), Mean 3 = (1, 2)

Points      Dist. Mean 1   Dist. Mean 2   Dist. Mean 3   Cluster
A1 (2, 10)  0              5              9              1
A2 (2, 5)   5              6              4              3
A3 (8, 4)   12             7              9              2
A4 (5, 8)   5              0              10             2
A5 (7, 5)   10             5              9              2
A6 (6, 4)   10             5              7              2
A7 (1, 2)   9              10             0              3
A8 (4, 9)   3              2              10             2
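The assignment step that produced this table can be reproduced with a short Python sketch (reusing the manhattan() helper defined earlier; assign() is a hypothetical name):

    points = [(2, 10), (2, 5), (8, 4), (5, 8), (7, 5), (6, 4), (1, 2), (4, 9)]
    means = [(2, 10), (5, 8), (1, 2)]

    def assign(points, means):
        # Label each point with the (1-based) index of its nearest mean.
        return [min(range(len(means)), key=lambda i: manhattan(p, means[i])) + 1
                for p in points]

    print(assign(points, means))  # [1, 3, 2, 2, 2, 2, 3, 2]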
Example 2: Cont…
Now we get three clusters around the three centroids:

Cluster 1: (2, 10)
Cluster 2: (8, 4), (5, 8), (7, 5), (6, 4), (4, 9)
Cluster 3: (2, 5), (1, 2)

New centroids:
Cluster 1: only one point, so the centroid remains (2, 10).
Cluster 2: ( (8+5+7+6+4)/5, (4+8+5+4+9)/5 ) = (6, 6)
Cluster 3: ( (2+1)/2, (5+2)/2 ) = (1.5, 3.5)
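The update step, recomputing each centroid as the coordinate-wise mean of its cluster's points, might look like this (again a sketch built on the names introduced above):

    def update(points, labels, k):
        # New centroid = mean x and mean y of all points with that label.
        means = []
        for c in range(1, k + 1):
            members = [p for p, lab in zip(points, labels) if lab == c]
            means.append((sum(p[0] for p in members) / len(members),
                          sum(p[1] for p in members) / len(members)))
        return means

    print(update(points, [1, 3, 2, 2, 2, 2, 3, 2], 3))
    # [(2.0, 10.0), (6.0, 6.0), (1.5, 3.5)]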
Example 2: Mapping
[Figure: the 8 selected data points, with the 3 initial centroids shown in yellow.]
Example 2: Mapping
[Figure: clusters formed around the 3 centroids; clusters marked with red x symbols.]
Example 2: Cont…
Iteration 1 is completed.
Now moving on to Iteration 2 using the newly computed centroids.

Centroids: Mean 1 = (2, 10), Mean 2 = (6, 6), Mean 3 = (1.5, 3.5)

Points      Dist. Mean 1   Dist. Mean 2   Dist. Mean 3   Cluster
A1 (2, 10)
A2 (2, 5)
A3 (8, 4)
A4 (5, 8)
A5 (7, 5)
A6 (6, 4)
A7 (1, 2)
A8 (4, 9)
Example 2: Cont…
For A1:
Point: A1 (2, 10)
Mean 1 = (2, 10), Mean 2 = (6, 6), Mean 3 = (1.5, 3.5)
Using the distance function ρ(a, b) = |x2 – x1| + |y2 – y1|:
A1–Mean 1: |2 – 2| + |10 – 10| = 0
A1–Mean 2: |6 – 2| + |6 – 10| = 8
A1–Mean 3: |1.5 – 2| + |3.5 – 10| = 7
Example 2: Cont…
For A2:
Point: A2 (2, 5)
Mean 1 = (2, 10), Mean 2 = (6, 6), Mean 3 = (1.5, 3.5)
Using the distance function ρ(a, b) = |x2 – x1| + |y2 – y1|:
A2–Mean 1: |2 – 2| + |10 – 5| = 5
A2–Mean 2: |6 – 2| + |6 – 5| = 5
A2–Mean 3: |1.5 – 2| + |3.5 – 5| = 2
Example 2: Cont…
Similarly, find the distances of A3, A4, A5, A6, A7 and A8 to all
three centroids and assign each point to its nearest centroid. The
table becomes:

Centroids: Mean 1 = (2, 10), Mean 2 = (6, 6), Mean 3 = (1.5, 3.5)

Points      Dist. Mean 1   Dist. Mean 2   Dist. Mean 3   Cluster
A1 (2, 10)  0              8              7              1
A2 (2, 5)   5              5              2              3
A3 (8, 4)   12             4              7              2
A4 (5, 8)   5              3              8              2
A5 (7, 5)   10             2              7              2
A6 (6, 4)   10             2              5              2
A7 (1, 2)   9              9              2              3
A8 (4, 9)   3              5              8              1
Example 2: Cont…
Now we get three clusters around the three centroids:

Cluster 1: (2, 10), (4, 9)
Cluster 2: (8, 4), (5, 8), (7, 5), (6, 4)
Cluster 3: (2, 5), (1, 2)

New centroids:
Cluster 1: ( (2+4)/2, (10+9)/2 ) = (3, 9.5)
Cluster 2: ( (8+5+7+6)/4, (4+8+5+4)/4 ) = (6.5, 5.25)
Cluster 3: ( (2+1)/2, (5+2)/2 ) = (1.5, 3.5)
Example 2: Mapping
[Figure: new cluster formation after the 2nd iteration.]
Example 2: Cont…
Iteration 2 is completed.
Now moving on to Iteration 3 using the newly computed centroids.

Centroids: Mean 1 = (3, 9.5), Mean 2 = (6.5, 5.25), Mean 3 = (1.5, 3.5)

Points      Dist. Mean 1   Dist. Mean 2   Dist. Mean 3   Cluster
A1 (2, 10)
A2 (2, 5)
A3 (8, 4)
A4 (5, 8)
A5 (7, 5)
A6 (6, 4)
A7 (1, 2)
A8 (4, 9)
Example 2: Cont…
For A1:
Point: A1 (2, 10)
Mean 1 = (3, 9.5), Mean 2 = (6.5, 5.25), Mean 3 = (1.5, 3.5)
Using the distance function ρ(a, b) = |x2 – x1| + |y2 – y1|:
A1–Mean 1: |3 – 2| + |9.5 – 10| = 1.5
A1–Mean 2: |6.5 – 2| + |5.25 – 10| = 9.25
A1–Mean 3: |1.5 – 2| + |3.5 – 10| = 7
Example 2: Cont...
For A2:
Point: A2 (2, 5)
Mean 1 = (3, 9.5), Mean 2 = (6.5, 5.25), Mean 3 = (1.5, 3.5)
Using the distance function ρ(a, b) = |x2 – x1| + |y2 – y1|:
A2–Mean 1: |3 – 2| + |9.5 – 5| = 5.5
A2–Mean 2: |6.5 – 2| + |5.25 – 5| = 4.75
A2–Mean 3: |1.5 – 2| + |3.5 – 5| = 2
Example 2: Cont...
Similarly, find the distances of A3, A4, A5, A6, A7 and A8 to all
three centroids and assign each point to its nearest centroid. The
table becomes:

Centroids: Mean 1 = (3, 9.5), Mean 2 = (6.5, 5.25), Mean 3 = (1.5, 3.5)

Points      Dist. Mean 1   Dist. Mean 2   Dist. Mean 3   Cluster
A1 (2, 10)  1.5            9.25           7              1
A2 (2, 5)   5.5            4.75           2              3
A3 (8, 4)   10.5           2.75           7              2
A4 (5, 8)   3.5            4.25           8              1
A5 (7, 5)   8.5            0.75           7              2
A6 (6, 4)   8.5            1.75           5              2
A7 (1, 2)   9.5            8.75           2              3
A8 (4, 9)   1.5            6.25           8              1
Example 2: Cont...
Now again, we get three clusters at three centroids:
Cluster 1: (2, 10), (5, 8), (4, 9)
Cluster 2: (8, 4), (7, 5), (6, 4)
Cluster 3: (2, 5), (1, 2)

Next, we compute the new means (centroids) by taking the mean of all
points in each cluster:
Cluster 1: ( (2+5+4)/3, (10+8+9)/3 ) = (3.67, 9)
Cluster 2: ( (8+7+6)/3, (4+5+4)/3 ) = (7, 4.33)
Cluster 3: ( (2+1)/2, (5+2)/2 ) = (1.5, 3.5)
Example 2: Mapping
[Figure: updated cluster formation after the 3rd iteration.]
Example 2: Cont…
Now, if we take the newly computed centroids and recompute the table,
every point keeps its cluster assignment, so the clustering has
become stable:

Centroids: Mean 1 = (3.67, 9), Mean 2 = (7, 4.33), Mean 3 = (1.5, 3.5)

Points      Dist. Mean 1   Dist. Mean 2   Dist. Mean 3   Cluster
A1 (2, 10)  2.67           10.67          7              1
A2 (2, 5)   5.67           5.67           2              3
A3 (8, 4)   9.33           1.33           7              2
A4 (5, 8)   2.33           5.67           8              1
A5 (7, 5)   7.33           0.67           7              2
A6 (6, 4)   7.33           1.33           5              2
A7 (1, 2)   9.67           8.33           2              3
A8 (4, 9)   0.33           7.67           8              1

The distances change with the new centroids, but the cluster column
does not, which is the k-means stopping condition.
Example 2: Summary
The cycle stops after 3 iterations, since the cluster assignments no
longer change.
The final formation has 3 clusters around the 3 final centroids.
This is how k-means clustering works on unlabeled data, through
repeated iterations (epochs).
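Putting the assignment and update steps together, the whole walkthrough can be reproduced with one loop (a sketch using the manhattan(), assign() and update() helpers above; the stopping test, unchanged assignments, is the one the slides describe):

    means = [(2, 10), (5, 8), (1, 2)]   # initial centroids A1, A4, A7
    labels = None
    while True:
        new_labels = assign(points, means)
        if new_labels == labels:        # assignments stable -> converged
            break
        labels = new_labels
        means = update(points, labels, 3)
        print(means)
    # Stops after 3 updates, ending at roughly (3.67, 9), (7, 4.33), (1.5, 3.5)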
Results Summary
[Figures: initial data with the 3 centroids, and the final cluster formation after the 3rd iteration.]
Hierarchical Clustering
Hierarchical clustering, also known as hierarchical cluster analysis,
is an algorithm that groups similar objects into groups called
clusters. The endpoint is a set of clusters, where each cluster is
distinct from every other cluster, and the objects within each
cluster are broadly similar to each other.
The output of hierarchical clustering is a dendrogram, which shows
the hierarchical relationship between the clusters.
Hierarchical Clustering: Steps
Hierarchical clustering starts by treating each observation
as a separate cluster.
Then, it repeatedly executes the following two steps:
1. Identify the two clusters that are closest together, and
2. Merge the two most similar clusters.
This iterative process continues until all the clusters are merged
together, as in the sketch below.
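A minimal sketch of these steps using SciPy on the same eight points (the single-linkage choice and the city-block metric are our assumptions, not prescribed by the slides):

    from scipy.cluster.hierarchy import linkage, dendrogram, fcluster
    from scipy.spatial.distance import pdist
    import matplotlib.pyplot as plt

    points = [(2, 10), (2, 5), (8, 4), (5, 8), (7, 5), (6, 4), (1, 2), (4, 9)]

    # Condensed matrix of pairwise Manhattan (city-block) distances.
    dists = pdist(points, metric='cityblock')

    # Repeatedly merge the two closest clusters until one remains.
    Z = linkage(dists, method='single')

    # Cut the merge tree into 3 clusters, for comparison with k-means.
    print(fcluster(Z, t=3, criterion='maxclust'))

    # The merge history Z is exactly what a dendrogram visualizes.
    dendrogram(Z, labels=['A%d' % i for i in range(1, 9)])
    plt.show()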
Hierarchical Clustering: Example
[Figure: worked example of hierarchical merging.]
Dendrogram Formation
[Figure: dendrogram.]
Limitations and Advantages
The dendrogram makes it easy to understand the cluster relationships
at a glance.
The results correlate strongly with the characteristics of the
original data.
The focus is on computing the distances between clusters rather than
to centroids.
Because a lot of detail is involved, the method can be
computationally expensive.
K-Means vs Hierarchical Clustering
K-Means:
Low memory usage
Essentially O(n) compute time
Results are sensitive to random initialization
Number of clusters is pre-defined
K-Means vs Hierarchical Clustering…
Hierarchical Clustering:
Deterministic algorithm
Dendrogram shows us clusterings for various choices
of K
Requires only a distance matrix, quantifying how
dissimilar observations are from one another
We can use a dissimilarity measure that gracefully handles
categorical variables, missing values, etc.
Memory-heavy, more computationally intensive than
K-means
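A small side-by-side sketch with scikit-learn (note that sklearn's KMeans uses Euclidean distance and random initialization, unlike the Manhattan-distance walkthrough above):

    import numpy as np
    from sklearn.cluster import AgglomerativeClustering, KMeans

    X = np.array([(2, 10), (2, 5), (8, 4), (5, 8), (7, 5), (6, 4), (1, 2), (4, 9)])

    # K-means: fast and memory-light, but results depend on initialization.
    print(KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X))

    # Hierarchical: deterministic, builds the full merge tree, so it is
    # heavier in memory and compute, but needs no initial centroids.
    print(AgglomerativeClustering(n_clusters=3).fit_predict(X))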
Summary
Clustering
K-Means Clustering
Examples
Next Lecture
Region Based Segmentation
Region Growing Segmentation
Region Splitting
Region Merging
Slide Credits and References
Wilhelm Burger and Mark J. Burge, Digital Image Processing, Springer, 2008
University of Utah, CS 4640: Image Processing Basics, Spring 2012
Rutgers University, CS 334, Introduction to Imaging and Multimedia, Fall 2012
https://www.slideshare.net/VikasGupta24/image-segmentation-66118502
https://www.slideshare.net/tawosetimothy/image-segmentation-34430371?next_slideshow=1
https://www.ques10.com/p/34966/explain-image-seg
https://en.wikipedia.org/wiki/Image_segmentation
https://www.slideshare.net/guest49d49/segmentation-presentation
http://www.labbookpages.co.uk/software/imgProc/otsuThreshold.html
https://webdocs.cs.ualberta.ca/~zaiane/courses/cmput695/F07/exercises/Exercises695Clus-solution.pdf
http://disp.ee.ntu.edu.tw/meeting/%E6%98%B1%E7%BF%94/Segmentation%20tutorial.pdf
https://www.youtube.com/watch?v=4b5d3muPQmA
https://www.analyticsvidhya.com/blog/2019/04/introduction-image-segmentation-techniques-python/
http://people.csail.mit.edu/dsontag/courses/ml12/slides/lecture14.pdf
THANK YOU