Lec - PR Own
4. Mathematical morphology
What is pattern recognition?
• Pattern recognition is the act of taking in a raw image
and taking an action based on the “category” of the patterns
shown in the data
• Patterns are elements or formations that occur in a repeated manner
• Example: sorting incoming fish on a conveyor according to species
using optical sensing
How?
1. Set up a camera
2. Take some sample images
3. Process the images
4. Extract features
– Length
– Lightness
– Width
– Number and shape of fins
– Position of the mouth, etc…
Pattern Classification System
• Preprocessing
– Image processing
– Segment (isolate) the objects of interest (fish) from one another
and from the background
• Feature Extraction
– Reduce the data by measuring certain features
• Classification
– Divide the feature space into decision regions
– Pattern representation and description
How to do Classification
• Example: use the length of the fish as a possible
feature for discrimination
Feature Selection
• The length is a poor feature alone!
• Select the lightness as a possible feature
Threshold decision
• Move decision boundary toward smaller values of
lightness in order to minimize the cost (reduce the
number of sea bass that are classified as salmon!)
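A small sketch of this threshold search (hypothetical lightness values and misclassification costs, chosen only for illustration): sweep the decision boundary along the lightness axis and keep the value with the lowest total cost. Raising the cost of calling a sea bass "salmon" pushes the chosen threshold toward smaller lightness values, as described above.

```python
import numpy as np

# Hypothetical, overlapping lightness measurements for the two classes
# (made-up numbers, for illustration only).
salmon_lightness   = np.array([2.1, 2.8, 3.4, 4.0, 4.9, 5.6])
sea_bass_lightness = np.array([3.8, 4.5, 5.0, 5.7, 6.3, 7.0])

# Assumed costs: calling a sea bass "salmon" is more expensive.
COST_BASS_AS_SALMON = 2.0
COST_SALMON_AS_BASS = 1.0

def total_cost(threshold):
    """Classify as salmon if lightness < threshold, else as sea bass."""
    bass_as_salmon = np.sum(sea_bass_lightness < threshold)
    salmon_as_bass = np.sum(salmon_lightness >= threshold)
    return (COST_BASS_AS_SALMON * bass_as_salmon +
            COST_SALMON_AS_BASS * salmon_as_bass)

candidates = np.linspace(2.0, 7.0, 101)
best = min(candidates, key=total_cost)
print("best threshold:", best, "cost:", total_cost(best))
```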
The Design Cycle
1. Data collection
2. Feature choice
3. Model choice
4. Training
5. Evaluation
6. Computational complexity
• Data Collection
– How do we know when we have collected an adequately large and
representative set of examples for training and testing the system?
• Choice of Features
– Depends on the characteristics of the problem domain
– Simple to extract, invariant to irrelevant transformations, insensitive to noise
• Model Choice
– If we are unsatisfied with the performance of our fish classifier, we may want
to jump to another class of model
• Training
– Use data to determine the classifier
– (Many different procedures for training classifiers and choosing models)
• Evaluation
– Measure the error rate (or performance)
– Possibly switch from one set of features to another one
• Computational Complexity
– What is the trade-off between computational ease and performance?
– How does an algorithm scale as a function of the number of features,
patterns, or categories?
Learning and Adaptation
• Supervised learning
– A teacher provides a category label for each pattern in the
training set
• Unsupervised learning
– The system forms clusters or “natural groupings” of the
unlabeled input patterns
Bayesian Decision Theory
• Fundamental statistical approach
• Assumes the relevant probabilities are known; compute the probability
of the observed event, then make the optimal decision
• Bayes’ Theorem: P(A | B) = P(B | A) P(A) / P(B)
• Example:
Suppose at Laurier, 50% of the students are girls and 30% are science
students; among science students, 20% are girls. If one meets a girl
student at Laurier, what is the probability that she is a science student?
Let B – girl students, A – science students. Then
P(A) = 30%, P(B) = 50%, P(B | A) = 20%
P(A | B) = P(B | A) P(A) / P(B) = (0.2 × 0.3) / 0.5 = 0.06 / 0.5 = 0.12
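A two-line check of the arithmetic above (plain Python; no assumptions beyond the stated probabilities):

```python
# Posterior P(A|B) = P(B|A) * P(A) / P(B) for the Laurier example.
p_A, p_B, p_B_given_A = 0.30, 0.50, 0.20
print(p_B_given_A * p_A / p_B)   # 0.12
```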
Unsupervised Learning
• Often called clustering. The system is not given a set of
labeled patterns for training. Instead the system
establishes the classes itself based on the regularities of
the patterns
• Clustering Separate Clouds
– Methods work fine when clusters form well-separated, compact
clouds
– They work less well when there are great differences in the number
of samples in different clusters
Hierarchical Clustering
• Sometimes clusters are not disjoint, but may have
subclusters, which in turn have sub-subclusters, etc.
• Consider partitioning n samples into clusters
• Start with n clusters, each one containing exactly one sample
• Then partition into n-1 clusters, then into n-2, etc.
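A minimal sketch of bottom-up (agglomerative) hierarchical clustering with SciPy, on made-up 2-D points; cutting the merge tree at different levels yields the n, n-1, n-2, ... partitions described above.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Toy 2-D samples forming two small clouds (made-up points).
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
              [5.0, 5.0], [5.1, 5.2], [4.9, 5.1]])

# Agglomerative clustering: start with n singleton clusters and
# repeatedly merge the two closest clusters.
Z = linkage(X, method='single')          # single-link cluster distance

# Cut the merge tree to obtain a chosen number of clusters, e.g. 2.
labels = fcluster(Z, t=2, criterion='maxclust')
print(labels)                            # e.g. [1 1 1 2 2 2]
```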
Image pattern recognition
• To identify objects represented by images
• Mathematical morphology
• Duality of opening and closing:
(A • B)^c = A^c ∘ B̂
(A ∘ B)^c = A^c • B̂
• Properties of opening:
A ∘ B ⊆ A
if C ⊆ D, then C ∘ B ⊆ D ∘ B
(A ∘ B) ∘ B = A ∘ B
• Properties of closing:
A ⊆ A • B
if C ⊆ D, then C • B ⊆ D • B
(A • B) • B = A • B
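These properties can be checked numerically with SciPy's binary morphology on an arbitrary small test image (the data below is made up for illustration):

```python
import numpy as np
from scipy import ndimage

# A small binary image A and a 3x3 structuring element B.
A = np.zeros((14, 14), dtype=bool)
A[4:10, 4:10] = True
A[6, 10:12] = True            # a thin protrusion that opening removes
B = np.ones((3, 3), dtype=bool)

opened = ndimage.binary_opening(A, structure=B)
closed = ndimage.binary_closing(A, structure=B)

assert np.all(opened <= A)    # A ∘ B ⊆ A   (opening is anti-extensive)
assert np.all(A <= closed)    # A ⊆ A • B   (closing is extensive)

# Idempotence: applying the same filter twice changes nothing.
assert np.array_equal(ndimage.binary_opening(opened, structure=B), opened)
assert np.array_equal(ndimage.binary_closing(closed, structure=B), closed)
```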
Constructing filters by morphological operations
Hit or miss transformation
• A basic tool for shape detection
A ⊛ B = (A ⊖ D) ∩ [A^c ⊖ (W - D)]
where D is the shape to be detected and W is a small window containing D
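A sketch with SciPy's hit-or-miss transform, here used to detect isolated single pixels (D is the centre pixel, W - D its 8-neighbourhood):

```python
import numpy as np
from scipy import ndimage

A = np.zeros((7, 7), dtype=bool)
A[1, 1] = True              # isolated pixel  -> should be detected
A[4, 4] = A[4, 5] = True    # 2-pixel blob    -> should not be detected

hit  = np.array([[0, 0, 0], [0, 1, 0], [0, 0, 0]], dtype=bool)   # D
miss = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=bool)   # W - D

# A hit-or-miss B = erosion(A, D) AND erosion(complement(A), W - D)
detected = ndimage.binary_hit_or_miss(A, structure1=hit, structure2=miss)
print(np.argwhere(detected))    # -> [[1 1]]
```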
Basic morphological algorithms
• Extracting boundaries, connected components, the
convex hull, and the skeleton of a region
• Boundary extraction:
β(A) = A - (A ⊖ B)
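A direct translation of the boundary formula, assuming a 3x3 structuring element:

```python
import numpy as np
from scipy import ndimage

def boundary(A, B=np.ones((3, 3), dtype=bool)):
    """Boundary extraction: beta(A) = A - (A eroded by B)."""
    return A & ~ndimage.binary_erosion(A, structure=B)

# The boundary of a filled square is its 1-pixel-thick outline.
A = np.zeros((8, 8), dtype=bool)
A[2:6, 2:6] = True
print(boundary(A).astype(int))
```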
Hole filling
• Let X_0 be an array of 0's, except for one point inside each hole, which is
set to 1. Then compute
X_k = (X_{k-1} ⊕ B) ∩ A^c,  k = 1, 2, 3, ...
until X_k = X_{k-1}; the filled set is A ∪ X_k
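A sketch of the hole-filling iteration (the helper name and the 4-connected SE are my choices; the SE must not let the marker leak through 1-pixel-thick walls):

```python
import numpy as np
from scipy import ndimage

def fill_holes(A, seed_points, B=None):
    """X_k = (X_{k-1} dilated by B) AND complement(A), iterated to stability;
    the filled set is A OR X_k. seed_points: one (row, col) per hole."""
    if B is None:
        B = ndimage.generate_binary_structure(2, 1)   # 4-connected cross
    X = np.zeros_like(A, dtype=bool)
    for r, c in seed_points:
        X[r, c] = True
    while True:
        X_next = ndimage.binary_dilation(X, structure=B) & ~A
        if np.array_equal(X_next, X):
            return A | X
        X = X_next

# A square ring with a hole in the middle.
A = np.zeros((9, 9), dtype=bool)
A[2:7, 2:7] = True
A[3:6, 3:6] = False             # the hole
print(fill_holes(A, seed_points=[(4, 4)]).astype(int))
```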
Extraction of connected components
• Let X_0 be an array of 0's, except for one point in each connected
component, which is set to 1. Then compute
X_k = (X_{k-1} ⊕ B) ∩ A,  k = 1, 2, 3, ...
until X_k = X_{k-1}
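The connected-component iteration is the same with A in place of A^c; this sketch grows one component from a given seed (scipy.ndimage.label labels all components at once):

```python
import numpy as np
from scipy import ndimage

def extract_component(A, seed, B=None):
    """Grow the component containing `seed` via
    X_k = (X_{k-1} dilated by B) AND A, iterated to stability."""
    if B is None:
        B = np.ones((3, 3), dtype=bool)   # 8-connectivity
    X = np.zeros_like(A, dtype=bool)
    X[seed] = True
    while True:
        X_next = ndimage.binary_dilation(X, structure=B) & A
        if np.array_equal(X_next, X):
            return X
        X = X_next

# Two separate blobs; a seed in the first one returns only that blob.
A = np.zeros((8, 8), dtype=bool)
A[1:3, 1:3] = True
A[5:7, 5:7] = True
print(extract_component(A, seed=(1, 1)).astype(int))
```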
Convex hull
• Let X_0^i = A, i = 1, 2, 3, 4. Then compute
X_k^i = (X_{k-1}^i ⊛ B^i) ∪ A,  i = 1, 2, 3, 4,  k = 1, 2, 3, ...
until X_k^i = X_{k-1}^i, then let D^i = X_k^i
• The convex hull is
C(A) = ∪_{i=1}^{4} D^i
Thinning and Thickening
• The thinning of A by SE B:
A ⊗ B = A - (A ⊛ B) = A ∩ (A ⊛ B)^c
• Thinning by a sequence of SEs
{B} = {B^1, B^2, B^3, ..., B^n}:
A ⊗ {B} = ((...((A ⊗ B^1) ⊗ B^2)...) ⊗ B^n)
Thickening
• The thickening of A by SE B:
A ⊙ B = A ∪ (A ⊛ B)
• Thickening by a sequence of SEs
{B} = {B^1, B^2, B^3, ..., B^n}:
A ⊙ {B} = ((...((A ⊙ B^1) ⊙ B^2)...) ⊙ B^n)
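One thinning / thickening step follows directly from the hit-or-miss transform. The sketch below uses a single hand-coded "north" element pair; a full thinning pass would cycle through all eight rotated pairs.

```python
import numpy as np
from scipy import ndimage

def thin_once(A, hit, miss):
    """One thinning step: A minus hit_or_miss(A, B)."""
    return A & ~ndimage.binary_hit_or_miss(A, structure1=hit, structure2=miss)

def thicken_once(A, hit, miss):
    """One thickening step: A union hit_or_miss(A, B)."""
    return A | ndimage.binary_hit_or_miss(A, structure1=hit, structure2=miss)

# "North" pair: top row background, centre and bottom row foreground,
# middle-left and middle-right are don't-care.
hit  = np.array([[0, 0, 0], [0, 1, 0], [1, 1, 1]], dtype=bool)
miss = np.array([[1, 1, 1], [0, 0, 0], [0, 0, 0]], dtype=bool)

A = np.zeros((7, 9), dtype=bool)
A[2:5, 1:8] = True                            # a thick horizontal bar
print(thin_once(A, hit, miss).astype(int))    # peels the bar's top edge
                                              # (except its two end pixels)
```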
Skeletons
• The skeleton of A:
S(A) = ∪_{k=0}^{K} S_k(A)
S_k(A) = (A ⊖ kB) - (A ⊖ kB) ∘ B
where (A ⊖ kB) = ((...((A ⊖ B) ⊖ B)...) ⊖ B) denotes k successive erosions of A, and
K = max{k | (A ⊖ kB) ≠ ∅}
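A direct implementation of the skeleton formula (the helper name and the 3x3 SE are assumptions):

```python
import numpy as np
from scipy import ndimage

def morphological_skeleton(A, B=None):
    """S(A) = union over k of S_k(A), with
    S_k(A) = (A eroded k times by B) minus its opening by B."""
    if B is None:
        B = np.ones((3, 3), dtype=bool)
    skeleton = np.zeros_like(A, dtype=bool)
    eroded = A.copy()
    while eroded.any():                       # stop once (A erode kB) is empty
        opened = ndimage.binary_opening(eroded, structure=B)
        skeleton |= eroded & ~opened          # accumulate S_k(A)
        eroded = ndimage.binary_erosion(eroded, structure=B)
    return skeleton

A = np.zeros((9, 12), dtype=bool)
A[3:6, 1:11] = True                           # a 3x10 rectangle
print(morphological_skeleton(A).astype(int))  # its 1-pixel medial line
```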
Pruning
• Pruning is done by the following four steps:
X_1 = A ⊗ {B}
X_2 = ∪_{k=1}^{8} (X_1 ⊛ B^k)
X_3 = (X_2 ⊕ H) ∩ A     (H: a 3×3 SE of 1's)
X_4 = X_1 ∪ X_3
Morphological Reconstruction
• Geodesic dilation:
D_G^(1)(F) = (F ⊕ B) ∩ G
D_G^(n)(F) = D_G^(1)[D_G^(n-1)(F)]
• Geodesic erosion:
E_G^(1)(F) = (F ⊖ B) ∪ G
E_G^(n)(F) = E_G^(1)[E_G^(n-1)(F)]
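Iterating the geodesic dilation until it stops changing gives morphological reconstruction (by dilation) of the mask G from the marker F. A minimal binary sketch, with the helper name and 3x3 SE assumed:

```python
import numpy as np
from scipy import ndimage

def reconstruct_by_dilation(F, G, B=None):
    """Repeat D_G^(1)(F) = (F dilated by B) AND G until stability."""
    if B is None:
        B = np.ones((3, 3), dtype=bool)
    F = F & G                                  # marker must lie inside the mask
    while True:
        F_next = ndimage.binary_dilation(F, structure=B) & G
        if np.array_equal(F_next, F):
            return F
        F = F_next

# Mask with two blobs; a marker inside the first blob
# reconstructs that blob and discards the other.
G = np.zeros((10, 10), dtype=bool)
G[1:4, 1:4] = True
G[6:9, 6:9] = True
F = np.zeros_like(G)
F[2, 2] = True
print(reconstruct_by_dilation(F, G).astype(int))
```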
• Opening: f ∘ b = (f ⊖ b) ⊕ b
• Closing: f • b = (f ⊕ b) ⊖ b