
Pattern Classification

02. Training Patterns

AbdElMoniem Bayoumi, PhD

Spring 2023
Acknowledgment
• These slides are based on the lecture notes of Prof. Dr. Amir Atiya
Recap: OCR
• $z_i$ = sum of black pixels along column $i$

[Figure: plots of $z_i$ versus column index $i$ for the letters 'A' and 'I']
• Feature: $z = \max_i(z_i)$ (see the sketch below)
– $z = 3$ for 'A'
– $z = 10$ for 'I'

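A minimal sketch of this feature extraction, assuming the character is given as a binary NumPy array (1 = black pixel, 0 = white); the 5x5 'A'-like glyph is invented purely for illustration:

```python
import numpy as np

def max_column_sum(img):
    """Feature z = max_i(z_i), where z_i is the number of black
    pixels in column i of a binary glyph image (1 = black)."""
    z = img.sum(axis=0)   # z_i for every column i
    return z.max()

# Hypothetical 5x5 'A'-like glyph, just for illustration
A = np.array([[0, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [1, 1, 1, 1, 1],
              [0, 1, 0, 1, 0],
              [1, 0, 0, 0, 1]])
print(max_column_sum(A))  # -> 3 for this toy glyph
```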
Recap: OCR
• Construct a histogram based on the feature
values of the training patterns

$z$ values for the 'A' training patterns: 2, 3, 1, 4, 3

$z$ values for the 'I' training patterns: 8, 10, 9, 5, 7

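A minimal sketch of the histogram step, using the feature values from the slide; `Counter` simply tallies how many training patterns share each value of $z$:

```python
from collections import Counter

z_A = [2, 3, 1, 4, 3]    # z values of the 'A' training patterns
z_I = [8, 10, 9, 5, 7]   # z values of the 'I' training patterns

hist_A = Counter(z_A)    # two patterns with z = 3, one each with z = 1, 2, 4
hist_I = Counter(z_I)
print(hist_A[3])         # number of 'A' patterns with z = 3  ->  2
```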
Recap: OCR
• Construct a histogram based on the feature values of the training patterns

[Figure: histograms over $z$; the bar at $z = 3$ shows the number of 'A' training patterns with $z = 3$, and the bar at $z = 10$ shows the number of 'I' training patterns with $z = 10$]

Recap: OCR
• Estimate density functions

[Figure: estimated class-conditional densities $P(z|\text{'A'})$ and $P(z|\text{'I'})$ plotted over $z$]

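One simple way to turn the histogram into a density estimate is to normalize the counts so they sum to one; this is only a sketch of that idea, not necessarily the estimator the course develops later:

```python
from collections import Counter

z_A = [2, 3, 1, 4, 3]                  # 'A' training feature values
hist_A = Counter(z_A)

def density_from_counts(hist):
    """Estimate P(z | class) by normalizing histogram counts to sum to 1."""
    total = sum(hist.values())
    return {z: count / total for z, count in hist.items()}

print(density_from_counts(hist_A)[3])  # P(z = 3 | 'A') -> 2/5 = 0.4
```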
Recap: Feature Space
• Multiple features reduce the classification
error
[Figure: 2-D feature space for features $y$ and $z$; the training patterns for 'A' and the training patterns for 'I' form separate clusters]

Feature Vector
• Let $X(m) = [X_1(m), X_2(m), \ldots, X_N(m)]^T$ be the feature vector of the $m$th training pattern

• $N$ is the number of features, i.e., the dimension of $X(m)$

• $M$ is the number of training patterns

Example

Three training patterns with features $z$ and $y$:

Pattern #1: $z = 3$, $y = 6$
Pattern #2: $z = 4$, $y = 7$
Pattern #3: $z = 9$, $y = 3$

$X(1) = [3, 6]^T$, $X(2) = [4, 7]^T$, $X(3) = [9, 3]^T$

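A common convention (an assumption here, not something the slides prescribe) is to stack the $M$ feature vectors as the rows of an $M \times N$ array:

```python
import numpy as np

# The three training patterns from the example, one row per pattern;
# columns are the features (z, y), so the shape is (M, N) = (3, 2)
X = np.array([[3, 6],
              [4, 7],
              [9, 3]], dtype=float)

M, N = X.shape
print(M, N)   # 3 training patterns, 2 features
print(X[0])   # feature vector X(1) -> [3. 6.]
```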
Decision Regions
• Features of patterns from the same class tend to be similar

• The data points (patterns) of each class occur in groupings or clusters in the feature-space plot

• Utilize this fact to design the classifier by detecting the regions where the patterns of each class are grouped

Decision Regions

[Figure: feature space $(X_1, X_2)$ divided into a decision region for C1 (patterns there are classified as C1) and a decision region for C2 (patterns there are classified as C2); the curve separating the regions is the decision boundary, or classification boundary]
Decision Boundary
• If the decision boundary is linear (a line in 2D, a plane in 3D, or a hyperplane in more than 3 dimensions), its equation has the form:

$W_0 + W_1 X_1 + \cdots + W_N X_N = 0$

• $W_0$ and the $W_i$ are the constants that determine the position of the hyperplane

• Example in 2D (with $x = X_1$ and $y = X_2$): the boundary is a line, which can be written either as $y = mx + b$ or equivalently as $W_1 x + W_2 y + W_0 = 0$, i.e., $W_1 X_1 + W_2 X_2 + W_0 = 0$
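As a quick worked example (numbers invented for illustration): the line $y = 2x + 1$ is put into hyperplane form by moving everything to one side,

$$y = 2x + 1 \;\Longrightarrow\; -1 - 2X_1 + X_2 = 0,$$

so $W_0 = -1$, $W_1 = -2$, $W_2 = 1$. Note that any non-zero multiple of these weights describes the same boundary.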
Decision Boundary
[Figure: feature space $(X_1, X_2)$ with classes C1 and C2 on opposite sides of a linear decision boundary]

$W_0 + \sum_{i=1}^{N} W_i X_i(m)$ is $> 0$ in the classification region for C1, $= 0$ on the decision boundary, and $< 0$ in the classification region for C2

Decision Boundary
• Let $X(m)$ be a feature vector from the training set

• We want to find weights such that:

$W_0 + W^T X(m) > 0$ for most $X(m)$ of C1
$W_0 + W^T X(m) < 0$ for most $X(m)$ of C2

• where $W = [W_1, W_2, \ldots, W_N]^T$ and $W^T X(m) = \sum_{i=1}^{N} W_i X_i(m)$

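A minimal sketch of evaluating this rule over a training set; the weights here are hypothetical, chosen only to make the computation concrete:

```python
import numpy as np

X = np.array([[3, 6], [4, 7], [9, 3]], dtype=float)  # M x N training patterns
w0 = -5.0                                            # hypothetical W0
w = np.array([1.0, 0.5])                             # hypothetical W = [W1, W2]^T

g = w0 + X @ w                        # g[m] = W0 + W^T X(m) for each pattern
labels = np.where(g > 0, 'C1', 'C2')  # the sign of g decides the class
print(g)        # [1.  2.5 5.5]
print(labels)   # ['C1' 'C1' 'C1']
```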
Types of Problems
• A problem is said to be linearly separable if there is a hyperplane that can separate the training data points of class C1 from those of C2

• Otherwise, it is said to be not linearly separable

[Figure: two example data sets, one not linearly separable and one linearly separable]


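One practical way to check linear separability (not covered on the slide itself) is to run the perceptron algorithm: it is guaranteed to reach zero training errors when a separating hyperplane exists, while a finite epoch budget can only suggest non-separability. A minimal sketch on invented toy data:

```python
import numpy as np

def perceptron(X, y, epochs=100):
    """Return (weights, True) if a separating hyperplane is found,
    otherwise (last weights, False). Labels y must be +1 or -1."""
    Xb = np.hstack([np.ones((len(X), 1)), X])  # prepend a bias input for W0
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        errors = 0
        for x, t in zip(Xb, y):
            if np.sign(w @ x) != t:            # misclassified -> update
                w += t * x
                errors += 1
        if errors == 0:
            return w, True
    return w, False

# Toy linearly separable data: C1 (+1) on the right, C2 (-1) on the left
X = np.array([[2.0, 1.0], [3.0, 2.0], [-2.0, -1.0], [-3.0, 0.5]])
y = np.array([1, 1, -1, -1])
print(perceptron(X, y))   # converges -> (..., True)
```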
Types of Problems
[Figure: feature space $(X_1, X_2)$ with a non-linear decision boundary $f(X) = 0$; $f(X) > 0$ on the C1 side and $f(X) < 0$ on the C2 side]

• For problems like the one above, a non-linear decision boundary (i.e., a non-linear classifier) is more appropriate

Types of Problems
[Figure: the same feature space, comparing a linear decision boundary with the non-linear boundary $f(X) = 0$]

• The linear classifier gives 14 errors, while the non-linear classifier gives zero errors → choose the non-linear classifier

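A sketch of the comparison being made, on invented data: count each classifier's training errors and keep the one with fewer. Here a circular boundary stands in for "the non-linear classifier", and by construction it matches the labeling rule exactly:

```python
import numpy as np

def error_count(predict, X, y):
    """Number of training patterns the classifier gets wrong."""
    return int(np.sum(predict(X) != y))

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = np.where((X ** 2).sum(axis=1) < 1.0, 1, -1)   # C1 lies inside a circle

linear = lambda X: np.where(X[:, 0] > 0, 1, -1)                    # line X1 = 0
nonlinear = lambda X: np.where((X ** 2).sum(axis=1) < 1.0, 1, -1)  # circle

print(error_count(linear, X, y), error_count(nonlinear, X, y))  # many vs 0
```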