
UNIVERSITY OF AGRICULTURAL SCIENCES,

BANGALORE-560065
DEPARTMENT OF AGRICULTURAL STATISTICS,
APPLIED MATHEMATICS AND COMPUTER SCIENCE

PRESENTATION ON FISHER LINEAR DISCRIMINANT FUNCTION

SUBMITTED TO:
Mr. Ranganath H. K.
Assistant Professor
Dept. of AS, AM & CS
UAS, GKVK
Bengaluru-65

SUBMITTED BY:
NANDHINI LAXMAN NAIK
PAMB 2147
FISHER'S LINEAR DISCRIMINANT FUNCTION

• Fisher's Linear Discriminant (FLD) is a statistical dimensionality reduction technique that is widely used in Machine Learning.
• Basically, it is a preprocessing step for pattern classification and machine learning
applications.
• It can be used as a supervised learning classifier.
• Fisher’s linear discriminant attempts to find the vector that maximizes the
separation between classes of the projected data.
• It facilitates the modeling of distinctions between groups, effectively separating
two or more classes.
Brief History and Background
• In 1936, Ronald A. Fisher formulated the Linear Discriminant for the first time and showed some practical uses as a classifier; it was described for a 2-class problem.
• Linear Discriminant Analysis (LDA) is a generalized form of FLD. In his paper, Fisher used a discriminant function to classify between two plant species, Iris Setosa and Iris Versicolor.
• It was later generalized as 'Multi-class Linear Discriminant Analysis' or 'Multiple Discriminant Analysis' by C. R. Rao in 1948.
Why Fisher Linear Discriminant Analysis (FLDA)?
• Whenever two or more classes with multiple features must be separated efficiently, the Fisher Linear Discriminant Analysis model is one of the most common techniques for solving such classification problems.
• For example, if we have two classes with multiple features and classify them using a single feature alone, the two classes may overlap.
Let's assume we have to classify two different classes, given as two sets of data points in a 2-dimensional plane, as shown in the image below:

• However, it is impossible to draw a straight line in the 2-D plane that can separate these data points efficiently.
• But using Linear Discriminant Analysis, we can reduce the 2-D plane to a 1-D plane.
• Using this technique, we can also maximize the separability between multiple classes.
How does Fisher Linear Discriminant Analysis (FLDA) work?

• FLDA takes the data plotted on the X-Y axes and creates a new axis, separating the classes with a straight line and projecting the data onto that new axis.
• Hence, we can maximize the separation between these classes and reduce the 2-D plane to 1-D.
After generating this new axis using the criteria given below, all the data points of the classes are plotted on this new axis, as shown in the figure given below.

Two criteria are used by LDA to create the new axis (a brief code sketch follows this list):

1. Maximize the distance between the means of the two classes.
2. Minimize the variation within each class.
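As a rough illustration of these two criteria, here is a minimal NumPy sketch (the data and the helper name fisher_direction are made up for illustration, not part of the original slides): it computes the axis w = S_W⁻¹(u₁ - u₂), which simultaneously pushes the projected means apart and keeps each projected class tight.

import numpy as np

def fisher_direction(X1, X2):
    # Criterion 1: separate the projected class means.
    # Criterion 2: minimize the projected within-class scatter.
    # Both are achieved by w = Sw^-1 (m1 - m2).
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    Sw = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
    return np.linalg.solve(Sw, m1 - m2)

# Two made-up 2-D classes:
rng = np.random.default_rng(0)
A = rng.normal([0, 0], 1.0, size=(50, 2))
B = rng.normal([4, 3], 1.0, size=(50, 2))
w = fisher_direction(A, B)
yA, yB = A @ w, B @ w  # the 2-D points reduced to the new 1-D axis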
PURPOSE:

• Dimensionality Reduction: FLDA aims to reduce the dimensionality of the feature space while preserving as much class-discriminatory information as possible.
• Classification: Once the dimensionality is reduced, FLDA constructs linear decision boundaries to classify data points into different classes.
• Optimal Separation: FLDA maximizes the distance between the means of different classes while minimizing the variance within each class.
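Put together, these goals amount to the standard Fisher criterion (our restatement; the slides do not spell it out here): FLDA chooses the projection vector V that maximizes

J(V) = (Vᵀ S_B V) / (Vᵀ S_W V)

where S_B is the between-class scatter matrix and S_W is the within-class scatter matrix, both computed explicitly in the worked example at the end of this presentation.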
Assumptions in FLD:

Normality:
• Assumption: FLDA assumes that the feature vectors of each class follow a
multivariate normal distribution.
• Implication: If the data significantly deviates from a normal distribution,
the performance of FLDA may degrade.

Linear Separability:
• Assumption: FLDA assumes that the classes are linearly separable in the
transformed feature space.
• Implication: If the classes are not linearly separable, FLDA may not
produce effective discriminant functions, leading to misclassification.
APPLICATIONS:
1. Image Recognition and Classification:
FLDA is used to reduce the dimensionality of image data while
preserving class separability, making it valuable in tasks like face
recognition, object detection, and image classification.
2. Biometric Identification:
FLDA helps in analyzing biometric data such as fingerprints, iris patterns,
and voiceprints.
3. Healthcare and Medical Diagnosis:
FLDA assists in medical diagnosis by analyzing patient data to classify
diseases or predict medical outcomes.
4. Quality Control and Manufacturing:
FLDA is used in quality control processes to classify products based on
various attributes.
5. Financial Fraud Detection:
FLDA is employed in detecting fraudulent transactions or activities in
financial systems.
6. Remote Sensing and Environmental Monitoring:
FLDA aids in classifying remote sensing data for land cover classification,
environmental monitoring, and resource management.
MERITS OF FLD

• Minimizes the within-class variance and maximizes the distance between the class means.
• Can be used for classification tasks.
• FLDA reduces dimensionality while preserving most of the class
discrimination.
• It provides a clear decision rule for classification.
• It's relatively simple and computationally efficient.
DEMERITS OF FLD
• It performs poorly when the discriminative information is not in the means of the classes.
• It suffers from the small sample size problem: with fewer samples than features, the within-class scatter matrix becomes singular (see the sketch after this list).
• It is not suitable for non-linear problems.
• Works with Labelled Data only (Supervised in nature).
• FLDA assumes linearity and Gaussian distribution of data within classes.
• It can be sensitive to outliers and class imbalance.
• FLDA may not perform well when classes are highly overlapping or non-
linearly separable.
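For the small sample size problem flagged above, a common workaround is to shrink S_W toward the identity so it stays invertible. A minimal sketch, assuming NumPy (the ridge value 1e-3 is an arbitrary illustration):

import numpy as np

def fld_direction_regularized(Sw, m1, m2, eps=1e-3):
    # When samples < features, Sw is singular; adding eps*I
    # (or using np.linalg.pinv) makes the inverse well defined.
    Sw_reg = Sw + eps * np.eye(Sw.shape[0])
    return np.linalg.solve(Sw_reg, m1 - m2)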
CONCLUSION
• FLDA is a powerful technique for dimensionality reduction and
classification, especially when the underlying assumptions are met and the
classes are well-separated.
• It's particularly useful when dealing with high-dimensional data and when
interpretability of the classification model is important.
• FLDA plays a crucial role in extracting discriminative information from
high-dimensional datasets, facilitating accurate classification, and improving
decision-making processes.
EXAMPLE:

• Consider the data of 2 classes:
• Class 1 has 5 samples (C1): [(4,1), (2,4), (2,3), (3,6), (4,4)]
• Class 2 has 5 samples (C2): [(9,10), (6,8), (9,5), (8,7), (10,8)]

Sl. No.   X    Y    CLASS
1         4    1    C1
2         2    4    C1
3         2    3    C1
4         3    6    C1
5         4    4    C1
6         9    10   C2
7         6    8    C2
8         9    5    C2
9         8    7    C2
10        10   8    C2

Step 1: Compute the mean (u) of each class:
u = (x₁ + x₂ + ... + xₙ)/n
u₁ = [3, 3.6]
u₂ = [8.4, 7.6]
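As a quick check of Step 1, a verification sketch in NumPy:

import numpy as np

X1 = np.array([[4, 1], [2, 4], [2, 3], [3, 6], [4, 4]])   # class C1
X2 = np.array([[9, 10], [6, 8], [9, 5], [8, 7], [10, 8]])  # class C2
print(X1.mean(axis=0))  # [3.  3.6] -> u1
print(X2.mean(axis=0))  # [8.4 7.6] -> u2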
Contd...

• Step 2: Compute the within-class scatter matrix (S_W):
• S_W = S₁ + S₂
• where S₁ and S₂ are the covariance matrices of class C1 and C2 respectively.
• S₁ = (1/5) * Σ (x - u₁)(x - u₁)ᵀ, and similarly for S₂.

Contd...

• Going one by one through the five points of C1 leads us to:
1. (x₁ - u₁)(x₁ - u₁)ᵀ = [[1, -2.6], [-2.6, 6.76]]
2. (x₂ - u₁)(x₂ - u₁)ᵀ = [[1, -0.4], [-0.4, 0.16]]
3. (x₃ - u₁)(x₃ - u₁)ᵀ = [[1, 0.6], [0.6, 0.36]]
4. (x₄ - u₁)(x₄ - u₁)ᵀ = [[0, 0], [0, 5.76]]
5. (x₅ - u₁)(x₅ - u₁)ᵀ = [[1, 0.4], [0.4, 0.16]]

Contd...

Adding matrices 1, 2, 3, 4, 5 gives [[4, -2], [-2, 13.2]], and then multiplying each element by (1/5) leads to

S₁ = [[0.8, -0.4], [-0.4, 2.64]]

Similarly,
S₂ = [[1.84, -0.04], [-0.04, 2.64]]
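These matrices agree with NumPy's biased covariance (a quick check, using the same 1/5 normalization as the slides):

import numpy as np

X1 = np.array([[4, 1], [2, 4], [2, 3], [3, 6], [4, 4]])
X2 = np.array([[9, 10], [6, 8], [9, 5], [8, 7], [10, 8]])
S1 = np.cov(X1.T, bias=True)  # [[0.8, -0.4], [-0.4, 2.64]]
S2 = np.cov(X2.T, bias=True)  # [[1.84, -0.04], [-0.04, 2.64]]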
Contd...

Therefore,
• S_W = S₁ + S₂ = [[2.64, -0.44], [-0.44, 5.28]]

Step 3: Now we need to compute the between-class scatter matrix (S_B):
• S_B = (u₁ - u₂)(u₁ - u₂)ᵀ
• = [(-5.4, -4)ᵀ * (-5.4, -4)]
• = [[29.16, 21.6], [21.6, 16]]
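Both scatter matrices from Steps 2 and 3 can be confirmed in a couple of lines (verification sketch):

import numpy as np

S1 = np.array([[0.8, -0.4], [-0.4, 2.64]])
S2 = np.array([[1.84, -0.04], [-0.04, 2.64]])
Sw = S1 + S2                          # [[2.64, -0.44], [-0.44, 5.28]]
u1, u2 = np.array([3.0, 3.6]), np.array([8.4, 7.6])
SB = np.outer(u1 - u2, u1 - u2)       # [[29.16, 21.6], [21.6, 16.0]]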
Contd...

• Step 4: Finding the Fisher linear discriminant function using S_W and the means.
• V = S_W⁻¹ * (u₁ - u₂)
• S_W⁻¹ = (1/|S_W|) * adj(S_W)
• |S_W| = [(2.64*5.28) - (-0.44*-0.44)]
• = 13.7456
• S_W⁻¹ = (1/13.7456) * [[5.28, 0.44], [0.44, 2.64]]
Contd...

Therefore, the Fisher linear discriminant function V is:

• V = S_W⁻¹ * (-5.4, -4)ᵀ
• V = (1/13.7456) * [[5.28, 0.44], [0.44, 2.64]] * (-5.4, -4)ᵀ
• V = [-2.2023, -0.9411]ᵀ
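Step 4 checks out numerically (verification sketch; decimals rounded to four places):

import numpy as np

Sw = np.array([[2.64, -0.44], [-0.44, 5.28]])
u1, u2 = np.array([3.0, 3.6]), np.array([8.4, 7.6])
V = np.linalg.inv(Sw) @ (u1 - u2)
print(V)  # approximately [-2.2023, -0.9411]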
Step 5: Finally, we can calculate the actual 1-D data with the help of y = Vᵀx.
Contd...

We know, (C1): [(4,1), (2,4), (2,3), (3,6), (4,4)]
(C2): [(9,10), (6,8), (9,5), (8,7), (10,8)]
Also, V = [-2.2023, -0.9411]ᵀ
Therefore, y₁ and y₂ are:
• y₁ = Vᵀ * (each point of C1) = [-9.75, -8.17, -7.23, -12.25, -12.57]
• y₂ = Vᵀ * (each point of C2) = [-29.23, -20.74, -24.53, -24.21, -29.55]
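The projections can be reproduced end to end; note that every C2 value lands below every C1 value, so a single threshold on the new 1-D axis separates the two classes (verification sketch; values rounded):

import numpy as np

X1 = np.array([[4, 1], [2, 4], [2, 3], [3, 6], [4, 4]])   # class C1
X2 = np.array([[9, 10], [6, 8], [9, 5], [8, 7], [10, 8]])  # class C2
V = np.array([-2.2023, -0.9411])
y1, y2 = X1 @ V, X2 @ V
print(y1)  # ≈ [ -9.75  -8.17  -7.23 -12.25 -12.57]
print(y2)  # ≈ [-29.23 -20.74 -24.53 -24.21 -29.55]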
THANK YOU
