LDA Two Classes - Example: Compute The Linear Discriminant Projection For The Following Two-Dimensional Dataset

Machine learning notes using the linear discriminant analysis technique for feature extraction

Uploaded by

Abdu.aljammal

LDA … Two Classes - Example

• Compute the Linear Discriminant projection for the following two-dimensional dataset.
  – Samples for class ω1: X1 = (x1, x2) = {(4,2), (2,4), (2,3), (3,6), (4,4)}
  – Samples for class ω2: X2 = (x1, x2) = {(9,10), (6,8), (9,5), (8,7), (10,8)}


[Figure: scatter plot of the two classes in the (x1, x2) plane, both axes running from 0 to 10]
LDA … Two Classes - Example
• The class means are:

\mu_1 = \frac{1}{N_1} \sum_{x \in \omega_1} x = \frac{1}{5}\left[ \begin{pmatrix} 4 \\ 2 \end{pmatrix} + \begin{pmatrix} 2 \\ 4 \end{pmatrix} + \begin{pmatrix} 2 \\ 3 \end{pmatrix} + \begin{pmatrix} 3 \\ 6 \end{pmatrix} + \begin{pmatrix} 4 \\ 4 \end{pmatrix} \right] = \begin{pmatrix} 3 \\ 3.8 \end{pmatrix}

\mu_2 = \frac{1}{N_2} \sum_{x \in \omega_2} x = \frac{1}{5}\left[ \begin{pmatrix} 9 \\ 10 \end{pmatrix} + \begin{pmatrix} 6 \\ 8 \end{pmatrix} + \begin{pmatrix} 9 \\ 5 \end{pmatrix} + \begin{pmatrix} 8 \\ 7 \end{pmatrix} + \begin{pmatrix} 10 \\ 8 \end{pmatrix} \right] = \begin{pmatrix} 8.4 \\ 7.6 \end{pmatrix}
LDA … Two Classes - Example
• Covariance matrix of the first class:

S_1 = \frac{1}{n_1 - 1} \sum_{x \in \omega_1} (x - \mu_1)(x - \mu_1)^T

With \mu_1 = (3, 3.8)^T, the deviations x - \mu_1 are (1, -1.8)^T, (-1, 0.2)^T, (-1, -0.8)^T, (0, 2.2)^T, (1, 0.2)^T, so

S_1 = \frac{1}{4}\left[ \begin{pmatrix} 1 \\ -1.8 \end{pmatrix}\begin{pmatrix} 1 \\ -1.8 \end{pmatrix}^T + \begin{pmatrix} -1 \\ 0.2 \end{pmatrix}\begin{pmatrix} -1 \\ 0.2 \end{pmatrix}^T + \begin{pmatrix} -1 \\ -0.8 \end{pmatrix}\begin{pmatrix} -1 \\ -0.8 \end{pmatrix}^T + \begin{pmatrix} 0 \\ 2.2 \end{pmatrix}\begin{pmatrix} 0 \\ 2.2 \end{pmatrix}^T + \begin{pmatrix} 1 \\ 0.2 \end{pmatrix}\begin{pmatrix} 1 \\ 0.2 \end{pmatrix}^T \right]

= \frac{1}{4}\begin{pmatrix} 4 & -1 \\ -1 & 8.8 \end{pmatrix} = \begin{pmatrix} 1 & -0.25 \\ -0.25 & 2.2 \end{pmatrix}
LDA … Two Classes - Example
• Covariance matrix of the second class:

S_2 = \frac{1}{n_2 - 1} \sum_{x \in \omega_2} (x - \mu_2)(x - \mu_2)^T

With \mu_2 = (8.4, 7.6)^T, the deviations x - \mu_2 are (0.6, 2.4)^T, (-2.4, 0.4)^T, (0.6, -2.6)^T, (-0.4, -0.6)^T, (1.6, 0.4)^T, so

S_2 = \frac{1}{4}\left[ \begin{pmatrix} 0.6 \\ 2.4 \end{pmatrix}\begin{pmatrix} 0.6 \\ 2.4 \end{pmatrix}^T + \begin{pmatrix} -2.4 \\ 0.4 \end{pmatrix}\begin{pmatrix} -2.4 \\ 0.4 \end{pmatrix}^T + \begin{pmatrix} 0.6 \\ -2.6 \end{pmatrix}\begin{pmatrix} 0.6 \\ -2.6 \end{pmatrix}^T + \begin{pmatrix} -0.4 \\ -0.6 \end{pmatrix}\begin{pmatrix} -0.4 \\ -0.6 \end{pmatrix}^T + \begin{pmatrix} 1.6 \\ 0.4 \end{pmatrix}\begin{pmatrix} 1.6 \\ 0.4 \end{pmatrix}^T \right]

= \frac{1}{4}\begin{pmatrix} 9.2 & -0.2 \\ -0.2 & 13.2 \end{pmatrix} = \begin{pmatrix} 2.3 & -0.05 \\ -0.05 & 3.3 \end{pmatrix}
LDA … Two Classes - Example
• Within-class scatter matrix:

S_w = S_1 + S_2 = \begin{pmatrix} 1 & -0.25 \\ -0.25 & 2.2 \end{pmatrix} + \begin{pmatrix} 2.3 & -0.05 \\ -0.05 & 3.3 \end{pmatrix} = \begin{pmatrix} 3.3 & -0.3 \\ -0.3 & 5.5 \end{pmatrix}
LDA … Two Classes - Example
• Between-class scatter matrix:

S_B = (\mu_1 - \mu_2)(\mu_1 - \mu_2)^T = \left[ \begin{pmatrix} 3 \\ 3.8 \end{pmatrix} - \begin{pmatrix} 8.4 \\ 7.6 \end{pmatrix} \right]\left[ \begin{pmatrix} 3 \\ 3.8 \end{pmatrix} - \begin{pmatrix} 8.4 \\ 7.6 \end{pmatrix} \right]^T

= \begin{pmatrix} -5.4 \\ -3.8 \end{pmatrix}\begin{pmatrix} -5.4 & -3.8 \end{pmatrix} = \begin{pmatrix} 29.16 & 20.52 \\ 20.52 & 14.44 \end{pmatrix}
LDA … Two Classes - Example
• The LDA projection is then obtained as the solution of the generalized eigenvalue problem:

S_W^{-1} S_B w = \lambda w \quad\Rightarrow\quad \left| S_W^{-1} S_B - \lambda I \right| = 0

\left| \begin{pmatrix} 3.3 & -0.3 \\ -0.3 & 5.5 \end{pmatrix}^{-1} \begin{pmatrix} 29.16 & 20.52 \\ 20.52 & 14.44 \end{pmatrix} - \lambda \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} \right| = 0

\left| \begin{pmatrix} 0.3045 & 0.0166 \\ 0.0166 & 0.1827 \end{pmatrix} \begin{pmatrix} 29.16 & 20.52 \\ 20.52 & 14.44 \end{pmatrix} - \lambda I \right| = \begin{vmatrix} 9.2213 - \lambda & 6.489 \\ 4.2339 & 2.9794 - \lambda \end{vmatrix} = 0

(9.2213 - \lambda)(2.9794 - \lambda) - 6.489 \cdot 4.2339 = 0

\lambda^2 - 12.2007\,\lambda = 0 \;\Rightarrow\; \lambda(\lambda - 12.2007) = 0

\lambda_1 = 0, \quad \lambda_2 = 12.2007
LDA … Two Classes - Example

• Hence:

\begin{pmatrix} 9.2213 & 6.489 \\ 4.2339 & 2.9794 \end{pmatrix} w_1 = \lambda_1 w_1 = 0 \cdot w_1 \;\Rightarrow\; w_1 = \begin{pmatrix} -0.5755 \\ 0.8178 \end{pmatrix}

and

\begin{pmatrix} 9.2213 & 6.489 \\ 4.2339 & 2.9794 \end{pmatrix} w_2 = \lambda_2 w_2 = 12.2007\, w_2 \;\Rightarrow\; w_2 = \begin{pmatrix} 0.9088 \\ 0.4173 \end{pmatrix} = w^*

• The optimal projection is the one that gives the maximum λ = J(w), so w* = w_2.
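The optimal direction is the eigenvector of the largest eigenvalue; a minimal sketch (the name w_star is my own; the sign flip is only for comparison, since eigenvector signs are arbitrary):

```python
import numpy as np

Sw = np.array([[3.3, -0.3], [-0.3, 5.5]])
SB = np.array([[29.16, 20.52], [20.52, 14.44]])

eigvals, eigvecs = np.linalg.eig(np.linalg.inv(Sw) @ SB)
w_star = eigvecs[:, np.argmax(eigvals)]   # eigenvector for the largest eigenvalue
w_star = w_star / np.linalg.norm(w_star)  # unit length
if w_star[0] < 0:                         # fix the arbitrary sign for comparison
    w_star = -w_star
# w_star ~ [0.9088, 0.4173]
```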
LDA … Two Classes - Example
Or directly:

w^* = S_W^{-1}(\mu_1 - \mu_2) = \begin{pmatrix} 3.3 & -0.3 \\ -0.3 & 5.5 \end{pmatrix}^{-1} \left[ \begin{pmatrix} 3 \\ 3.8 \end{pmatrix} - \begin{pmatrix} 8.4 \\ 7.6 \end{pmatrix} \right]

= \begin{pmatrix} 0.3045 & 0.0166 \\ 0.0166 & 0.1827 \end{pmatrix} \begin{pmatrix} -5.4 \\ -3.8 \end{pmatrix} = \begin{pmatrix} -1.7074 \\ -0.7839 \end{pmatrix}

After normalization, i.e. dividing by \sqrt{w_1^2 + w_2^2} \approx 1.879:

w^* = \pm \begin{pmatrix} 0.9088 \\ 0.4173 \end{pmatrix}

This is the same direction as the dominant eigenvector above; the overall sign does not matter, since w and −w define the same projection axis.
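The direct two-class shortcut avoids the eigendecomposition entirely; a minimal sketch (the name w is my own; `np.linalg.solve` computes Sw^{-1}(mu1 − mu2) without forming the inverse explicitly):

```python
import numpy as np

Sw = np.array([[3.3, -0.3], [-0.3, 5.5]])
mu1 = np.array([3.0, 3.8])
mu2 = np.array([8.4, 7.6])

w = np.linalg.solve(Sw, mu1 - mu2)  # Sw^{-1} (mu1 - mu2), ~ [-1.7074, -0.7839]
w = w / np.linalg.norm(w)           # unit length; sign is immaterial
# |w| componentwise ~ [0.9088, 0.4173], matching the dominant eigenvector
```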