Fisher's Linear Discriminant Function
DEPARTMENT OF AGRICULTURAL STATISTICS,
APPLIED MATHEMATICS AND COMPUTER SCIENCE
BANGALORE-560065
• Here, FLDA starts from data plotted on the X-Y plane and creates a new axis, onto which all the data points are projected, so that the classes can be separated by a straight line.
• Hence, we maximize the separation between these classes while reducing the 2-D plane to 1-D.
After generating this new axis using the above-mentioned criterion, all the data points of the classes are projected onto it, as shown in the figure given below.
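The projection idea above can be sketched in a few lines of plain Python. The sample points and the axis direction w = (1, 1) here are illustrative assumptions, not values from the lecture:

```python
# Project 2-D points onto a 1-D axis with direction w.
# Each projection is the scalar dot(x, w) / ||w|| -- the coordinate
# of the point along the new axis.
import math

def project(points, w):
    norm = math.sqrt(w[0] ** 2 + w[1] ** 2)
    return [(x * w[0] + y * w[1]) / norm for (x, y) in points]

# Two illustrative classes and an assumed axis direction (hypothetical data).
class1 = [(1, 2), (2, 3), (2, 2)]
class2 = [(6, 7), (7, 8), (8, 7)]
w = (1, 1)

p1 = project(class1, w)
p2 = project(class2, w)
# On this axis every class-1 coordinate lies below every class-2
# coordinate, i.e. the two classes separate in 1-D.
print(max(p1) < min(p2))  # True
```

FLDA's job is precisely to choose the direction w that makes this 1-D separation as large as possible.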
Normality:
• Assumption: FLDA assumes that the feature vectors of each class follow a
multivariate normal distribution.
• Implication: If the data significantly deviates from a normal distribution,
the performance of FLDA may degrade.
Linear Separability:
• Assumption: FLDA assumes that the classes are linearly separable in the
transformed feature space.
• Implication: If the classes are not linearly separable, FLDA may not
produce effective discriminant functions, leading to misclassification.
APPLICATIONS:
1. Image Recognition and Classification:
FLDA is used to reduce the dimensionality of image data while
preserving class separability, making it valuable in tasks like face
recognition, object detection, and image classification.
2. Biometric Identification:
FLDA helps in analyzing biometric data such as fingerprints, iris patterns,
and voiceprints.
3. Healthcare and Medical Diagnosis:
FLDA assists in medical diagnosis by analyzing patient data to classify
diseases or predict medical outcomes.
4. Quality Control and Manufacturing:
FLDA is used in quality control processes to classify products based on
various attributes.
5. Financial Fraud Detection:
FLDA is employed in detecting fraudulent transactions or activities in
financial systems.
6. Remote Sensing and Environmental Monitoring:
FLDA aids in classifying remote sensing data for land cover classification,
environmental monitoring, and resource management.
MERITS OF FLD
• Minimizes the within-class variance and maximizes the distance between the class means.
• Can be used for classification tasks.
• FLDA reduces dimensionality while preserving most of the class
discrimination.
• It provides a clear decision rule for classification.
• It's relatively simple and computationally efficient.
DEMERITS OF FLD
• Fails when the discriminative information lies in the variances rather than in the means of the classes.
• Suffers from the small-sample-size problem (when there are fewer samples than features, the scatter matrix becomes singular).
• Not suitable for non-linear problems.
• Works with Labelled Data only (Supervised in nature).
• FLDA assumes linearity and Gaussian distribution of data within classes.
• It can be sensitive to outliers and class imbalance.
• FLDA may not perform well when classes are highly overlapping or non-linearly separable.
CONCLUSION
• FLDA is a powerful technique for dimensionality reduction and
classification, especially when the underlying assumptions are met and the
classes are well-separated.
• It's particularly useful when dealing with high-dimensional data and when
interpretability of the classification model is important.
• FLDA plays a crucial role in extracting discriminative information from
high-dimensional datasets, facilitating accurate classification, and improving
decision-making processes.
EXAMPLE:
• Consider the data of 2 classes as:
• Class 1 has 5 samples (C1): [(4,1), (2,4), (2,3), (3,6), (4,4)]
• Class 2 has 5 samples (C2): [(9,10), (6,8), (9,5), (8,7), (10,8)]

Sl. No.   X    Y    CLASS
1         4    1    C1
2         2    4    C1
3         2    3    C1
4         3    6    C1
5         4    4    C1
6         9    10   C2
7         6    8    C2
8         9    5    C2
9         8    7    C2
10        10   8    C2

Step 1: Compute the mean (u) of each class:
u = (x₁ + x₂ + ... + xₙ)/n
u₁ = [3, 3.6]
u₂ = [8.4, 7.6]
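Step 1 can be verified directly in plain Python (no external libraries):

```python
# Per-class mean vector u = (sum of samples) / n, for the example data.
C1 = [(4, 1), (2, 4), (2, 3), (3, 6), (4, 4)]
C2 = [(9, 10), (6, 8), (9, 5), (8, 7), (10, 8)]

def mean_vector(samples):
    n = len(samples)
    return [sum(x for x, _ in samples) / n,
            sum(y for _, y in samples) / n]

u1 = mean_vector(C1)
u2 = mean_vector(C2)
print(u1)  # [3.0, 3.6]
print(u2)  # [8.4, 7.6]
```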
• Step 2: Compute the within-class scatter matrix (Sw):
• Sw = S₁ + S₂
• where S₁ and S₂ are the scatter (covariance) matrices of classes C1 and C2 respectively.
• S₁ = (1/5) * Σ (x - u₁)(x - u₁)ᵀ, and similarly for S₂.
• Going one by one over the samples of C1 (deviation x - u₁, then its outer product) leads us to:
1. (4,1) - (3,3.6) = (1, -2.6)   →  [1.00 -2.60; -2.60 6.76]
2. (2,4) - (3,3.6) = (-1, 0.4)   →  [1.00 -0.40; -0.40 0.16]
3. (2,3) - (3,3.6) = (-1, -0.6)  →  [1.00 0.60; 0.60 0.36]
4. (3,6) - (3,3.6) = (0, 2.4)    →  [0.00 0.00; 0.00 5.76]
5. (4,4) - (3,3.6) = (1, 0.4)    →  [1.00 0.40; 0.40 0.16]
Adding equations 1, 2, 3, 4, 5 we'll get
[4.0 -2.0; -2.0 13.2]
and then multiplying each element by (1/5) will lead to
S₁ = [0.80 -0.40; -0.40 2.64]
Similarly,
S₂ = [1.84 -0.04; -0.04 2.64]
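The Step 2 scatter matrices can be cross-checked with a short script (2×2 matrices as nested lists, plain Python):

```python
# Within-class scatter S = (1/n) * sum of (x - u)(x - u)^T over the samples.
C1 = [(4, 1), (2, 4), (2, 3), (3, 6), (4, 4)]
C2 = [(9, 10), (6, 8), (9, 5), (8, 7), (10, 8)]

def mean_vector(samples):
    n = len(samples)
    return [sum(p[0] for p in samples) / n,
            sum(p[1] for p in samples) / n]

def scatter(samples):
    u = mean_vector(samples)
    n = len(samples)
    S = [[0.0, 0.0], [0.0, 0.0]]
    for x, y in samples:
        dx, dy = x - u[0], y - u[1]       # deviation from the class mean
        S[0][0] += dx * dx / n
        S[0][1] += dx * dy / n
        S[1][0] += dy * dx / n
        S[1][1] += dy * dy / n
    return S

S1 = scatter(C1)
S2 = scatter(C2)
print(S1)  # ~[[0.8, -0.4], [-0.4, 2.64]]
print(S2)  # ~[[1.84, -0.04], [-0.04, 2.64]]
```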
Therefore,
• Sw = S₁ + S₂ = [2.64 -0.44; -0.44 5.28]
Step 3: Now we need to compute the between-class scatter matrix (S_B).
• S_B = (u₁ - u₂)(u₁ - u₂)ᵀ
• = [(-5.4, -4)ᵀ * (-5.4, -4)]
• = [29.16 21.60; 21.60 16.00]
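Sw and the between-class scatter S_B from Steps 2 and 3 can be reproduced the same way, starting from the means and scatter matrices already computed:

```python
# Sw = S1 + S2;  S_B = (u1 - u2)(u1 - u2)^T, for the example data.
u1 = [3.0, 3.6]
u2 = [8.4, 7.6]
S1 = [[0.8, -0.4], [-0.4, 2.64]]
S2 = [[1.84, -0.04], [-0.04, 2.64]]

Sw = [[S1[i][j] + S2[i][j] for j in range(2)] for i in range(2)]
d = [u1[0] - u2[0], u1[1] - u2[1]]              # (-5.4, -4.0)
Sb = [[d[i] * d[j] for j in range(2)] for i in range(2)]

print(Sw)  # ~[[2.64, -0.44], [-0.44, 5.28]]
print(Sb)  # ~[[29.16, 21.6], [21.6, 16.0]]
```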
• Step 4: Find the Fisher linear discriminant direction using Sw⁻¹ and the means.
• w = Sw⁻¹ * (u₁ - u₂)
• det(Sw) = [(2.64*5.28) - (-0.44*-0.44)]
• = 13.7456
• Sw⁻¹ = (1/13.7456) * [5.28 0.44; 0.44 2.64]
• w = (1/13.7456) * [5.28 0.44; 0.44 2.64] * [-5.4, -4]ᵀ ≈ [-2.2023, -0.9411]ᵀ
• (Only the direction of w matters; flipping its sign gives the same discriminant axis.)
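Finally, Step 4 can be checked numerically, writing out the closed-form 2×2 inverse by hand:

```python
# w = Sw^{-1} (u1 - u2) for a 2x2 Sw, using the closed-form inverse
# [[a, b], [c, d]]^{-1} = (1/det) * [[d, -b], [-c, a]].
Sw = [[2.64, -0.44], [-0.44, 5.28]]
d = [-5.4, -4.0]                       # u1 - u2

det = Sw[0][0] * Sw[1][1] - Sw[0][1] * Sw[1][0]
inv = [[Sw[1][1] / det, -Sw[0][1] / det],
       [-Sw[1][0] / det, Sw[0][0] / det]]
w = [inv[0][0] * d[0] + inv[0][1] * d[1],
     inv[1][0] * d[0] + inv[1][1] * d[1]]

print(round(det, 4))   # 13.7456
print(w)               # ~[-2.2023, -0.9411]
```

Projecting every sample onto w (i.e. taking the dot product of each point with w) then yields the 1-D coordinates on which C1 and C2 are maximally separated.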