Computer Vision 3 Segmentation 1 Students
Image Segmentation
Contents
Cat/dog Classifier
Ex: Image Segmentation
Image Segmentation
Object Detection
Why Image Segmentation?
• Object detection draws a bounding box around each detected object in the image
• Image segmentation creates a pixel-wise mask for each object in the image
• Segmentation therefore provides finer detail of the object boundaries
K-means clustering
• Choose k initial centroid locations
• Assign each data point to the closest centroid
• Recalculate the locations of the centroids: each centroid’s coordinate is the mean value of all points in its cluster
• Reassign each point to the closest new centroid
• Repeat the recalculation of centroids until a stopping condition is satisfied
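These steps can be sketched in Python with NumPy (a minimal illustration, not a production implementation; the sample data and initial centroids are the ones used in the worked example below):

```python
import numpy as np

# Sample pixels (R, G, B) from the worked example below
pixels = np.array([[10, 200, 5], [29, 150, 30], [11, 200, 4],
                   [11, 200, 6], [50, 98, 50], [51, 98, 50],
                   [11, 199, 6], [9, 198, 5], [49, 99, 50]], dtype=float)

def kmeans(points, centroids, max_iters=100, tol=1e-6):
    """Basic K-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its cluster."""
    centroids = np.array(centroids, dtype=float)
    for _ in range(max_iters):
        # Euclidean distance of every point to every centroid
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)              # (re)assign points
        new_centroids = np.array([points[labels == k].mean(axis=0)
                                  for k in range(len(centroids))])
        if np.linalg.norm(new_centroids - centroids) < tol:  # stopping condition
            break
        centroids = new_centroids
    return labels, centroids

labels, centroids = kmeans(pixels, [[11, 200, 4], [51, 98, 50]])
```

With these starting centroids the assignment stabilizes after one update, which is the “data points remain in the same cluster” stopping condition in action.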
K-means clustering
(Scatter plot of the sample pixels in the R-G feature space)
Sr No   R    G    B
1      10  200    5
2      29  150   30
3      11  200    4
4      11  200    6
5      50   98   50
6      51   98   50
7      11  199    6
8       9  198    5
9      49   99   50
Ex: K-means clustering
Iteration 1
• Choose centroid locations: Centroid 1 = (11, 200, 4) and Centroid 2 = (51, 98, 50)
• Calculate the distance of each sample from each centroid
Ex: K-means clustering
Iteration 1
• Assign centroid labels to the sample pixels (Euclidean distances, rounded to one decimal)

Sr No   R    G    B   Dist from centroid 1   Dist from centroid 2   Centroid label
1      10  200    5          1.4                  118.8                  1
2      29  150   30         59.2                   59.9                  1
3      11  200    4          0.0                  118.8                  1
4      11  200    6          2.0                  118.1                  1
5      50   98   50        118.5                    1.0                  2
6      51   98   50        118.8                    0.0                  2
7      11  199    6          2.2                  117.2                  1
8       9  198    5          3.0                  117.4                  1
9      49   99   50        117.3                    2.2                  2
Ex: K-means clustering
Iteration 2
• Updated centroids: Centroid 1 = (13.5, 191.2, 9.3), Centroid 2 = (50, 98.3, 50)
• Calculate the distance of each sample from the updated centroids and reassign labels

Sr No   R    G    B   Label (Iteration 1)   Label (Iteration 2)
1      10  200    5           1                     1
2      29  150   30           1                     1
3      11  200    4           1                     1
4      11  200    6           1                     1
5      50   98   50           2                     2
6      51   98   50           2                     2
7      11  199    6           1                     1
8       9  198    5           1                     1
9      49   99   50           2                     2

• No point changes its cluster between the two iterations, so the algorithm has converged
Ex: K-means clustering
• Assign different colors to two groups of pixels in image
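The distance columns and the centroid update between the two iterations can be verified with a short NumPy sketch (same sample pixels and initial centroids as above):

```python
import numpy as np

pixels = np.array([[10, 200, 5], [29, 150, 30], [11, 200, 4],
                   [11, 200, 6], [50, 98, 50], [51, 98, 50],
                   [11, 199, 6], [9, 198, 5], [49, 99, 50]], dtype=float)

# Iteration 1: Euclidean distances from the initial centroids
c1, c2 = np.array([11.0, 200.0, 4.0]), np.array([51.0, 98.0, 50.0])
d1 = np.linalg.norm(pixels - c1, axis=1)
d2 = np.linalg.norm(pixels - c2, axis=1)
labels = np.where(d1 <= d2, 1, 2)          # centroid label column

# Iteration 2: each centroid moves to the mean of its cluster
new_c1 = pixels[labels == 1].mean(axis=0)  # ≈ (13.5, 191.2, 9.3)
new_c2 = pixels[labels == 2].mean(axis=0)  # ≈ (50, 98.3, 50)
```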
Image Classification using K-means clustering
• Samples are pixels
• Features of each pixel are the intensities of red, green, and blue
• The color of each pixel depends on these 3 features
Image Classification using K-means clustering
• The image has 426 rows, 640 columns, and 3 channels
• It has 426 × 640 = 272,640 pixels
• It has 172,388 unique colors
Image Classification using K-means clustering
• Each centroid has 3 features
• K-means allocates one centroid (label) to each pixel
• Each label and its corresponding pixels are given a unique color
(Segmentation results for K = 40 and K = 5)
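Applied to a full image, this means reshaping the H × W × 3 array into a list of pixels, clustering, and writing each centroid color back. A rough sketch with a small synthetic image (the lecture’s 426 × 640 photo is not reproduced here, and the loop below is a bare-bones K-means, not a library call):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in image: 40 x 60 pixels, 3 channels
image = rng.integers(0, 256, size=(40, 60, 3)).astype(float)
pixels = image.reshape(-1, 3)          # samples are pixels; features are R, G, B

K = 5
# Initialize centroids from K distinct pixels
centroids = pixels[rng.choice(len(pixels), size=K, replace=False)]
for _ in range(20):
    dists = np.linalg.norm(pixels[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)      # one centroid (label) per pixel
    # Update step; keep the old centroid if a cluster ever becomes empty
    centroids = np.array([pixels[labels == k].mean(axis=0)
                          if np.any(labels == k) else centroids[k]
                          for k in range(K)])

# Recolor every pixel with its centroid: at most K unique colors remain
segmented = centroids[labels].reshape(image.shape)
```

This is the effect shown in the K = 40 versus K = 5 comparison: a smaller K leaves fewer distinct colors in the output.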
K-means clustering
• Some common stopping conditions for K-means clustering are:
• No significant change in the centroids in the next iteration
• Data points remain in the same cluster in the next iteration
• A set number of iterations has been completed
Ex: Color Segmentation with K-means Clustering
• Apply the K-means algorithm to cluster balls of the same color
• Count the number of balls in each color cluster
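Once the pixels of one color are isolated as a binary mask, counting the balls reduces to counting connected components in that mask. A minimal pure-Python sketch (the flood-fill helper and the toy mask are illustrative, not part of the lecture):

```python
from collections import deque

def count_components(mask):
    """Count 4-connected components of truthy cells in a 2-D grid."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                count += 1                      # found a new ball
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:                    # flood-fill this component
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count

# Toy mask for one color cluster: two separate "balls"
mask = [[0, 1, 1, 0, 0, 0],
        [0, 1, 1, 0, 0, 0],
        [0, 0, 0, 0, 1, 1],
        [0, 0, 0, 0, 1, 1]]
n_balls = count_components(mask)   # -> 2
```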
K-means clustering (pros and cons)
• Pros
• Fast unsupervised machine learning algorithm
• Useful for large datasets
• Cons
• Need to choose the value of K
• Converges to a local minimum
• Sensitive to the initialization of the centroids
• Sensitive to the scaling of the dataset and images
• Sensitive to outliers
• Only finds “spherical” clusters; does not work if clusters have a complex geometric shape
Mean Shift Algorithm
• Place a window at an initial data point and compute the mean of the data points within the window
• Center the window at the new mean value and recalculate the mean of the data points within the window
• Repeat until there is no significant change (less than a threshold) in the mean value
Mean Shift Algorithm
• Assign the converged mean value to its cluster
• Assign the data points within the window to the corresponding cluster
• Test for other points near the window
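The iteration above can be sketched as follows; a minimal NumPy version with a flat (uniform) kernel and Euclidean distance, run on two toy 2-D blobs (assumed data, not from the lecture):

```python
import numpy as np

def mean_shift_step(mean, data, bandwidth):
    """One mean-shift update with a flat kernel: the average of all
    points whose distance to the current mean is <= bandwidth."""
    in_window = np.linalg.norm(data - mean, axis=1) <= bandwidth
    return data[in_window].mean(axis=0)

def mean_shift_mode(start, data, bandwidth, tol=1e-9, max_iters=100):
    mean = np.asarray(start, dtype=float)
    for _ in range(max_iters):
        new_mean = mean_shift_step(mean, data, bandwidth)
        if np.linalg.norm(new_mean - mean) < tol:   # no significant change: stop
            break
        mean = new_mean
    return mean

# Two 2-D blobs; a window started near either blob climbs to that blob's mode
data = np.array([[0.0, 0.0], [0.2, 0.1], [-0.1, 0.2],
                 [5.0, 5.0], [5.2, 4.9], [4.9, 5.1]])
mode_a = mean_shift_mode([0.5, 0.5], data, bandwidth=1.5)
mode_b = mean_shift_mode([4.5, 4.5], data, bandwidth=1.5)
```

Each starting window converges to the mean of its own blob, so the number of distinct converged means is the number of clusters found.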
Segmentation using Mean Shift Algorithm
Mean Shift Algorithm with weighted mean
• Weighted mean: instead of a flat window, each data point contributes according to a weight
• Gaussian weighted mean: for each data point within the window, apply a Gaussian function of standard deviation σ as the weight
(Effect of bandwidth: small, optimum, and large bandwidth)
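One Gaussian-weighted update might look like this (a sketch; the data points are assumed for illustration). Note that as σ grows the weights flatten out and the update approaches the plain mean, which mirrors the large-bandwidth behavior above:

```python
import numpy as np

def gaussian_weighted_mean(center, data, sigma):
    """Weighted mean with Gaussian weights exp(-||x - center||^2 / (2*sigma^2)).
    Nearby points influence the new mean more than distant ones."""
    sq_dists = np.sum((data - center) ** 2, axis=1)
    weights = np.exp(-sq_dists / (2.0 * sigma ** 2))
    return (weights[:, None] * data).sum(axis=0) / weights.sum()

data = np.array([[0.0, 0.0], [1.0, 0.0], [10.0, 10.0]])
# Small sigma: the distant point (10, 10) gets almost zero weight
near = gaussian_weighted_mean(np.array([0.5, 0.0]), data, sigma=1.0)
# Very large sigma: all weights become nearly equal -> plain mean
flat = gaussian_weighted_mean(np.array([0.5, 0.0]), data, sigma=1e6)
```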
Ex: Mean Shift Algorithm
• A 3-bit color image has the following pixel values:
(5,4,5), (1,3,3), (1,3,2), (2,3,2), (3,3,3), (3,3,3), (4,5,4), (5,4,4), (5,4,4), (5,4,4), (5,4,4), (5,4,4)
• Apply mean shift clustering with a bandwidth of 2 and a flat kernel (uniform window)
Ex: Mean Shift Algorithm
• Consider the initial point (2, 3, 2)
• Use a window centered at the initial point
• Select points within a window of bandwidth 2
• Use Manhattan distance

Sr no   R   G   B
1       5   4   5
2       1   3   3
3       1   3   2
4       2   3   2
5       3   3   3
6       3   3   3
7       4   5   4
8       5   4   4
9       5   4   4
10      5   4   4
11      5   4   4
12      5   4   4
Ex: Mean Shift Algorithm
• Initial mean of window 1 = (2, 3, 2)
• Points within bandwidth 2 of window 1 (Manhattan distance): (1,3,3), (1,3,2), (2,3,2), (3,3,3), (3,3,3)
• New mean of window 1 = (2, 3, 2.6)
Ex: Mean Shift Algorithm
• Mean of the points within window 1 = (2, 3, 2.6)
• There is no further change in the mean of window 1, so window 1 has converged
• Choose a center for window 2: pick an initial point among the remaining points, e.g. (5, 4, 4)
• Select the points within window 2
Ex: Mean Shift Algorithm
• Initial mean of window 2 = (5, 4, 4)
• Points within window 2 (Manhattan distance ≤ 2): (5,4,5), (4,5,4), and the five (5,4,4) points
• New mean of window 2 = (34/7, 29/7, 29/7) ≈ (4.9, 4.1, 4.1)
• There is no change in the allocation of points to window 2, so window 2 has converged
• All the points are now covered; if not, choose a center for window 3 to address the remaining points
• If any point in window 1 has a smaller distance from the mean of window 2, include it in window 2
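The whole worked example can be replayed with a flat kernel and Manhattan distance; with these numbers the two windows converge to (2, 3, 2.6) and (34/7, 29/7, 29/7) ≈ (4.9, 4.1, 4.1):

```python
import numpy as np

pixels = np.array([[5, 4, 5], [1, 3, 3], [1, 3, 2], [2, 3, 2],
                   [3, 3, 3], [3, 3, 3], [4, 5, 4], [5, 4, 4],
                   [5, 4, 4], [5, 4, 4], [5, 4, 4], [5, 4, 4]], dtype=float)

def shift_to_mode(start, data, bandwidth=2.0, tol=1e-9, max_iters=100):
    """Mean shift with a flat kernel and Manhattan (L1) distance."""
    mean = np.asarray(start, dtype=float)
    for _ in range(max_iters):
        in_window = np.abs(data - mean).sum(axis=1) <= bandwidth
        new_mean = data[in_window].mean(axis=0)
        if np.abs(new_mean - mean).sum() < tol:   # converged
            break
        mean = new_mean
    return mean

mode1 = shift_to_mode([2, 3, 2], pixels)   # window 1
mode2 = shift_to_mode([5, 4, 4], pixels)   # window 2

# Assign every pixel to the nearest mode (Manhattan distance)
d1 = np.abs(pixels - mode1).sum(axis=1)
d2 = np.abs(pixels - mode2).sum(axis=1)
cluster = np.where(d1 <= d2, 1, 2)
```

The final assignment puts 5 pixels in cluster 1 and 7 pixels in cluster 2, matching the two windows in the example.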
Mean Shift Algorithm for color images
(Image and its feature space: the color channels)
Object Tracking using Mean Shift Algorithm
• Pros:
• Finds the number of modes automatically, depending on the data values
• Robust to outliers
• Does not assume any prior shape (spherical, elliptical, etc.) for the data clusters
• Cons:
• Clustering depends on the window size (bandwidth)
• Computationally more expensive than K-means
• Can identify noisy pixels as clusters
• Finds an arbitrary number of clusters, which cannot be set directly
Comparison of K Means and Mean Shift Algorithm