06b Discriminant Analysis

This document is a slide presentation on the topic of discriminant analysis and classification. It introduces discriminant analysis and Fisher's linear discriminant analysis (LDA) criterion for obtaining an optimal projection of data that maximizes separation between classes. It discusses using LDA to perform classification by selecting a threshold to compare projected values. It also covers decision theory principles for classification and the MATLAB classify() function for performing classification via discriminant analysis, with options for modeling class covariances as linear, diagonal linear, quadratic, or diagonal quadratic.


Classification

Discriminant Analysis

slides thanks to Greg Shakhnarovich (CS195-5, Brown Univ., 2006)

Jeff Howbert Introduction to Machine Learning Winter 2012 1


Example of applying Fisher's LDA

[figure: left, projection that maximizes separation of the class means; right, projection that maximizes Fisher's LDA criterion, giving better class separation]
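The projection direction in the example above can be sketched in code. This is a minimal numpy sketch on synthetic data (the data and variable names are illustrative, not from the slides): Fisher's criterion is maximized by w proportional to S_W^{-1}(m2 - m1), where S_W is the within-class scatter matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic 2-D classes; the second feature has much smaller spread,
# so Fisher's direction should weight it heavily.
X1 = rng.normal([0, 0], [1.0, 0.3], size=(100, 2))
X2 = rng.normal([2, 1], [1.0, 0.3], size=(100, 2))

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)

# Within-class scatter matrix S_W: sum of the per-class scatter matrices
Sw = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)

# Fisher's solution: w is proportional to S_W^{-1} (m2 - m1)
w = np.linalg.solve(Sw, m2 - m1)
w /= np.linalg.norm(w)

# Projected class means, measured against the within-class spread,
# separate better than under the naive direction (m2 - m1) alone.
z1, z2 = X1 @ w, X2 @ w
print(abs(z1.mean() - z2.mean()) / np.sqrt(z1.var() + z2.var()))
```

Solving the linear system instead of explicitly inverting S_W is the usual numerically safer choice.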


Using LDA for classification in one dimension

- Fisher's LDA gives an optimal choice of w, the vector for projection down to one dimension.
- For classification, we still need to select a threshold to compare projected values to. Two possibilities:
  - No explicit probabilistic assumptions. Find the threshold which minimizes empirical classification error.
  - Make assumptions about the data distributions of the classes, and derive the theoretically optimal decision boundary. The usual choice for the class distributions is multivariate Gaussian.
- We also will need a bit of decision theory.
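The first option, picking the threshold that minimizes empirical classification error, can be sketched as follows on hypothetical projected values (the two Gaussian samples stand in for projected class-1 and class-2 data):

```python
import numpy as np

rng = np.random.default_rng(0)
z1 = rng.normal(0.0, 1.0, 200)   # projected values for class 1 (illustrative)
z2 = rng.normal(3.0, 1.0, 200)   # projected values for class 2 (illustrative)

# Classify as class 1 when z < t, class 2 otherwise; try every observed
# value as a candidate threshold and keep the one with fewest errors.
candidates = np.sort(np.concatenate([z1, z2]))
errors = [int(np.sum(z1 >= t) + np.sum(z2 < t)) for t in candidates]
threshold = candidates[int(np.argmin(errors))]
print(threshold)
```

For these symmetric classes the selected threshold should land near the midpoint of the two means; with skewed or unequal-variance classes it would shift accordingly.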


Decision theory

To minimize classification error: at a given point x in feature space, choose as the predicted class the class that has the greatest probability at x:

y = arg max_C p( C | x )


Decision theory

At a given point x in feature space, choose as the predicted class the class that has the greatest probability at x:

y = arg max_C p( C | x )

[figure: left, probability densities for classes C1 and C2; right, relative probabilities for classes C1 and C2]
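The decision rule above can be made concrete for two 1-D Gaussian classes. A minimal sketch, assuming hypothetical means, variances, and priors (since p(C | x) is proportional to p(x | C) p(C), the shared normalizer p(x) can be dropped inside the argmax):

```python
import numpy as np

# Hypothetical class-conditional Gaussians and priors (illustrative values)
mu = {"C1": 0.0, "C2": 2.0}
sigma = {"C1": 1.0, "C2": 1.0}
prior = {"C1": 0.5, "C2": 0.5}

def gauss_pdf(x, m, s):
    # Univariate Gaussian density
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

def predict(x):
    # y = arg max_C p(C | x), with p(C | x) proportional to p(x | C) p(C)
    return max(prior, key=lambda c: gauss_pdf(x, mu[c], sigma[c]) * prior[c])

print(predict(0.9), predict(1.1))  # C1 C2
```

With equal priors and equal variances the two posteriors cross at the midpoint of the means (x = 1 here), which is exactly the threshold the decision rule implies.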


MATLAB interlude

Classification via discriminant analysis, using the classify() function. Data for each class is modeled as multivariate Gaussian.

matlab_demo_06.m

class = classify( sample, training, group, type )

- class: predicted test labels
- sample: test data
- training: training data
- group: training labels
- type: model for class covariances


MATLAB classify() function

Models for class covariances:

- linear: all classes have the same covariance matrix; linear decision boundary
- diaglinear: all classes have the same diagonal covariance matrix; linear decision boundary
- quadratic: classes have different covariance matrices; quadratic decision boundary
- diagquadratic: classes have different diagonal covariance matrices; quadratic decision boundary
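The four covariance models can be mimicked outside MATLAB. The following is a simplified numpy stand-in for classify(), not the real implementation: it fits a Gaussian per class, pools or diagonalizes the covariance according to the type option, and assumes equal class priors.

```python
import numpy as np

def classify(sample, training, group, ctype="linear"):
    # Simplified sketch of MATLAB's classify(): Gaussian class models,
    # prediction by largest log posterior (equal priors assumed here).
    group = np.asarray(group)
    classes = sorted(set(group.tolist()))
    params = {}
    if ctype in ("linear", "diaglinear"):
        # Pooled covariance shared by all classes -> linear boundary
        centered = [training[group == c] - training[group == c].mean(0)
                    for c in classes]
        pooled = sum(Xc.T @ Xc for Xc in centered) / (len(training) - len(classes))
        if ctype == "diaglinear":
            pooled = np.diag(np.diag(pooled))
        for c in classes:
            params[c] = (training[group == c].mean(0), pooled)
    else:
        # "quadratic" / "diagquadratic": per-class covariance -> quadratic boundary
        for c in classes:
            Xc = training[group == c]
            cov = np.cov(Xc, rowvar=False)
            if ctype == "diagquadratic":
                cov = np.diag(np.diag(cov))
            params[c] = (Xc.mean(0), cov)

    def log_density(x, mean, cov):
        # Gaussian log density up to an additive constant
        diff = x - mean
        return -0.5 * (diff @ np.linalg.solve(cov, diff)
                       + np.log(np.linalg.det(cov)))

    return [max(classes, key=lambda c: log_density(x, *params[c]))
            for x in sample]

# Usage on two well-separated synthetic classes
rng = np.random.default_rng(1)
training = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
group = [0] * 50 + [1] * 50
print(classify(np.array([[0.0, 0.0], [4.0, 4.0]]), training, group, "quadratic"))
```

In practice the standard Python equivalents are scikit-learn's LinearDiscriminantAnalysis and QuadraticDiscriminantAnalysis estimators, which also handle priors and regularization.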


MATLAB classify() function

Example with quadratic model of class covariances

