08_Lecture -Chapter 10- Image Segmentation_Part I_Edge Detection
Digital Image Processing
Chapter 10 – Image Segmentation (Part I)
• Chapter 10 from R.C. Gonzalez and R.E. Woods, Digital Image Processing
(3rd Edition), Prentice Hall, 2008 [Sections 10.1 and 10.2, excluding 10.2.7]
What does Image Segmentation do?
❑ Divide the image into different regions.
❑ Separate objects from background and give them
individual ID numbers (labels).
❑ The purpose is to partition an image into regions that are
meaningful for a particular application.
For example, it allows us to:
• Count the number of objects of a certain type.
• Measure geometric properties (e.g., area, perimeter) of
objects in the image.
• Study properties of an individual object (intensity,
texture, etc.); a labeling sketch follows this list.
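As a concrete illustration of counting labeled objects and measuring their areas, here is a minimal sketch (not from the slides): the binary mask, which a segmentation step would normally produce, is a made-up example, and SciPy's ndimage.label is assumed to be available.

```python
import numpy as np
from scipy import ndimage

# Hypothetical binary segmentation result: 1 = object pixel, 0 = background.
binary_mask = np.array([
    [0, 1, 1, 0, 0],
    [0, 1, 1, 0, 1],
    [0, 0, 0, 0, 1],
    [1, 1, 0, 0, 0],
], dtype=np.uint8)

# Give each connected object its own ID number (label).
labels, num_objects = ndimage.label(binary_mask)

# Count the objects and measure a simple geometric property (area in pixels).
areas = ndimage.sum(binary_mask, labels, index=range(1, num_objects + 1))
print("number of objects:", num_objects)  # 3 for the mask above
print("object areas (pixels):", areas)    # [4. 2. 2.]
```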
Principal Approaches to Segmentation
Segmentation algorithms are based on one of two basic
properties of intensity values:
❑ Similarity
⮚ Partitioning an image into regions that are similar
according to a set of predefined criteria.
❑ Discontinuity
⮚ Partitioning an image based on sharp changes in
intensity (such as edges in an image).
Types of Segmentation Algorithms
❑ Similarity
⮚ Thresholding – based on pixel intensities (see the sketch below)
⮚ Region based – grouping similar pixels
⮚ Match based – comparison to a given template
❑ Discontinuity
⮚ Edge based – detection of edges that separate regions
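For example, the simplest similarity-based method, global thresholding, takes only a few lines. This is a sketch only; the image values and the threshold of 100 are arbitrary assumptions.

```python
import numpy as np

def global_threshold(image, T=100):
    """Similarity-based segmentation: label pixels brighter than T as
    foreground (1) and the rest as background (0)."""
    return (image > T).astype(np.uint8)

# Hypothetical 8-bit grayscale patch.
img = np.array([[ 10,  20, 200],
                [ 15, 220, 210],
                [ 12,  18,  25]], dtype=np.uint8)
print(global_threshold(img))
# [[0 0 1]
#  [0 1 1]
#  [0 0 0]]
```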
[Figure: a 3×3 mask with center weight 8 and −1 elsewhere; a 1-D intensity profile spanning regions R1–R4, its 1st derivative, and the zero crossing marking each edge.]
Characteristics of First and Second Order Derivatives
▪ The 2nd derivative passes through zero (changes its sign) at the center
of the edge (see the numerical sketch below).
Zero-crossing Feature
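A quick numerical check of this zero-crossing property, as a sketch using a made-up 1-D profile with a ramp edge (not the exact profile from the figure):

```python
import numpy as np

# 1-D intensity profile containing a ramp (blurred) edge from 6 down to 0.
profile = np.array([6, 6, 6, 5, 3, 1, 0, 0, 0], dtype=float)

d1 = np.diff(profile)       # 1st derivative: a "thick" response along the whole ramp
d2 = np.diff(profile, n=2)  # 2nd derivative: negative at the onset, positive at the end

print("profile:       ", profile)
print("1st derivative:", d1)   # [ 0.  0. -1. -2. -2. -1.  0.  0.]
print("2nd derivative:", d2)   # [ 0. -1. -1.  0.  1.  1.  0.]
# d2 goes from negative to positive across the ramp, so it passes through zero
# (changes sign) at the center of the edge.
```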
1. Filtering (Smoothing)
In this stage the image is passed through a filter to remove
noise.
2. Differentiation (Edge sharpening using derivatives)
This stage highlights the locations in the image where the
intensity changes, i.e., it detects discontinuities.
3. Detection (Thresholding)
This stage decides which pixels are edge pixels, i.e., where
the changes are significant.
4. Localization
Determine the exact location of an edge (a sketch of all four
stages follows below).
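A compact sketch of these four stages using OpenCV (an illustration only; the 5×5 Gaussian kernel, the sigma, and the threshold of 60 are assumed parameters, not values from the slides):

```python
import cv2
import numpy as np

def simple_edge_detector(image, threshold=60):
    # 1. Filtering (smoothing): suppress noise with a Gaussian filter.
    smoothed = cv2.GaussianBlur(image, (5, 5), sigmaX=1.5)

    # 2. Differentiation: highlight intensity changes with Sobel derivatives.
    gx = cv2.Sobel(smoothed, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(smoothed, cv2.CV_64F, 0, 1, ksize=3)
    magnitude = np.sqrt(gx ** 2 + gy ** 2)

    # 3. Detection (thresholding): keep only the significant changes.
    edges = (magnitude > threshold).astype(np.uint8) * 255

    # 4. Localization: here the edge position is simply the set of thresholded
    #    pixels; Canny (later slides) refines it with non-maximum suppression.
    return edges

# Usage (assumes a file named 'input.png' exists):
# img = cv2.imread('input.png', cv2.IMREAD_GRAYSCALE)
# cv2.imwrite('edges.png', simple_edge_detector(img))
```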
Gradient-based Edge Detection

Prewitt masks (left: responds to vertical edges; right: responds to
horizontal edges):

-1  0  1      -1 -1 -1
-1  0  1       0  0  0
-1  0  1       1  1  1
Gradient – Sobel Operator

-1  0  1      -1 -2 -1
-2  0  2       0  0  0
-1  0  1       1  2  1
Basic Edge Detection by Sobel

-1  0  1
-2  0  2
-1  0  1

-1 -2 -1
 0  0  0
 1  2  1
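A sketch that applies the two Sobel masks above directly (an illustration only; the tiny test image with a vertical step edge is made up, and SciPy's convolve is assumed to be available):

```python
import numpy as np
from scipy.ndimage import convolve

# Sobel masks as shown above.
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)
sobel_y = np.array([[-1, -2, -1],
                    [ 0,  0,  0],
                    [ 1,  2,  1]], dtype=float)

# Hypothetical 6x6 grayscale image with a vertical step edge.
img = np.tile([0, 0, 0, 10, 10, 10], (6, 1)).astype(float)

gx = convolve(img, sobel_x)            # strong response at the vertical edge
gy = convolve(img, sobel_y)            # ~0 everywhere: no horizontal edges
magnitude = np.abs(gx) + np.abs(gy)    # common approximation of the gradient magnitude
print(magnitude)
```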
Gradient-based Edge Detection (cont.)
Edge Detection Example
[Figure: original image and its horizontal gradient component; labels f(x,y) and fs(x,y).]
Canny Edge Detection Algorithm
(step 2: finding the gradient)
❑ Compute the derivative of smoothed image fs(x,y)
❑ Calculate the Gradient Magnitude and Direction.
❑ Any of the filter mask pairs can be used to get the derivatives (see the
sketch below).
[Figure: a derivative mask pair, gy and gx.]
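A sketch of this step, assuming the Sobel pair from the earlier slides is used for the derivatives (fs is the hypothetical smoothed image; function and variable names are my own):

```python
import numpy as np
import cv2

def gradient_magnitude_direction(fs):
    """Step 2: derivatives of the smoothed image fs(x,y), then the
    gradient magnitude M(x,y) and gradient direction (in degrees)."""
    gx = cv2.Sobel(fs, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(fs, cv2.CV_64F, 0, 1, ksize=3)
    magnitude = np.hypot(gx, gy)                 # M(x,y) = sqrt(gx^2 + gy^2)
    direction = np.degrees(np.arctan2(gy, gx))   # gradient angle, in (-180, 180]
    return magnitude, direction
```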
Result after step 2
[Figure: smoothed image fs(x,y) and gradient magnitude M(x,y).]
Canny Edge Detection Algorithm
(step 3: Non-Max Suppression)
▪ For each pixel, the neighboring pixels are located in the horizontal, vertical,
and diagonal directions (0°, 45°, 90°, and 135°).
▪ Thus we need to round off the gradient direction at every pixel to one of
these directions as shown below.
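One way to code this rounding (a sketch; the function name and the use of NumPy are my own choices):

```python
import numpy as np

def quantize_direction(direction_deg):
    """Round each gradient angle to the nearest of 0, 45, 90, 135 degrees."""
    angle = np.mod(direction_deg, 180.0)          # directions repeat every 180 degrees
    return (np.round(angle / 45.0) * 45.0) % 180.0

print(quantize_direction(np.array([17.0, 50.0, 100.0, 170.0])))
# -> [ 0. 45. 90.  0.]   (17 degrees rounds to 0, as in the example below)
```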
Example:
▪ Suppose that for a pixel 'A' the gradient direction comes out to be 17 degrees.
▪ Since 17 is nearer to 0, we round it to 0 degrees.
▪ Then we select the two neighboring pixels along the rounded gradient
direction (call them B and C).
▪ If the magnitude value M(x,y) of A is greater than that of B and C, A is
retained as an edge pixel; otherwise it is suppressed (see the sketch below).
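A sketch of non-maximum suppression built on the rounded directions (an illustrative implementation, not the textbook's code; magnitude is M(x,y) and quantized_dir holds the angles 0/45/90/135 from the previous sketch):

```python
import numpy as np

def non_max_suppression(magnitude, quantized_dir):
    """Keep a pixel only if M(x,y) is at least as large as both of its
    neighbors along the (rounded) gradient direction; otherwise set it to 0."""
    rows, cols = magnitude.shape
    gN = np.zeros_like(magnitude)
    # Offsets of the two neighbors for each rounded direction.
    offsets = {0:   ((0, 1),  (0, -1)),
               45:  ((-1, 1), (1, -1)),
               90:  ((-1, 0), (1, 0)),
               135: ((-1, -1), (1, 1))}
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            (dr1, dc1), (dr2, dc2) = offsets[int(quantized_dir[r, c])]
            neighbor1 = magnitude[r + dr1, c + dc1]
            neighbor2 = magnitude[r + dr2, c + dc2]
            if magnitude[r, c] >= neighbor1 and magnitude[r, c] >= neighbor2:
                gN[r, c] = magnitude[r, c]   # pixel A survives into gN(x,y)
    return gN
```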
Result after step 3
[Figure: gradient magnitude M(x,y) and its non-maximum-suppressed version gN(x,y).]
Canny Edge Detection Algorithm
(step 4: Hysteresis Thresholding)
[Figure: worked numerical example of hysteresis thresholding, showing the high-threshold image gnH(x,y), the low-threshold image gnL(x,y), and the resulting edge map g(x,y).]
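A sketch of this step (an illustrative implementation: pixels above the high threshold are kept as strong edges, and pixels between the two thresholds are kept only if they are 8-connected to a strong edge; the usage thresholds at the end are assumptions):

```python
import numpy as np
from scipy import ndimage

def hysteresis_threshold(gN, t_low, t_high):
    """Step 4: double thresholding of gN(x,y) plus connectivity analysis."""
    strong = gN >= t_high                   # gnH(x,y): definite edge pixels
    weak = (gN >= t_low) & (gN < t_high)    # gnL(x,y): candidate edge pixels

    # Label 8-connected components of all candidates, then keep only the
    # components that contain at least one strong pixel.
    labels, n = ndimage.label(strong | weak, structure=np.ones((3, 3), dtype=bool))
    keep = np.zeros(n + 1, dtype=bool)
    keep[np.unique(labels[strong])] = True
    keep[0] = False                         # label 0 is the background
    return keep[labels]                     # final edge map g(x,y), boolean

# Usage with ratio-based thresholds (an assumption, not from the slides):
# g = hysteresis_threshold(gN, t_low=0.05 * gN.max(), t_high=0.15 * gN.max())
```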
Final Result after step 5
[Figure: non-maximum-suppressed image gN(x,y) and final edge map g(x,y).]
Canny Edge Detection: Summary
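In practice the whole pipeline is available as a single call; a minimal usage sketch with OpenCV (the file name, kernel size, sigma, and threshold values are hypothetical):

```python
import cv2

# Step 1: cv2.Canny does not smooth internally, so apply the Gaussian filter first.
img = cv2.imread('building.png', cv2.IMREAD_GRAYSCALE)
smoothed = cv2.GaussianBlur(img, (5, 5), sigmaX=1.4)

# Steps 2-4: gradient computation, non-maximum suppression, and hysteresis
# thresholding are performed inside cv2.Canny; threshold1/threshold2 are the
# low and high hysteresis thresholds.
edges = cv2.Canny(smoothed, threshold1=50, threshold2=150)
cv2.imwrite('building_edges.png', edges)
```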