Neural Network

This document describes performing principal component analysis (PCA) on a dataset with 15 samples across 5 variables (x1, x2, x3, x4, x5). It involves calculating the covariance matrix, finding the eigenvalues and eigenvectors, choosing principal components based on eigenvalues, and deriving a new dataset in the principal component space. Several steps are shown, including calculating covariances between variables, arranging eigenvalues by magnitude, and selecting the component with the highest eigenvalue to form the feature vector.

Uploaded by Ukesh Shrestha

Principal Component Analysis

Perform PCA on the given data set.


Step 1: Get the data and subtract the mean

Original data (15 samples, 5 variables):

Sample     x1        x2      x3        x4     x5
1          0.679     64.0    97.8      74     1850
2          0.671     67.7    96.2      62     1510
3          0.667     67.8    80.0      62     2665
4          0.663     67.3    69.1      61     4148
5          0.659     68.6    74.0      58     2944
6          0.659     69.8    55.6      74     3950
7          0.658     48.4    82.4      78     10346
8          0.655     43.3    84.2      65     19780
9          0.652     63.6    99.5      76     1106
10         0.635     54.5    71.0      74     6397
11         0.631     69.7    50.7      58     4004
12         0.627     48.3    85.0      71     6180
13         0.604     63.0    83.1      62     1231
14         0.602     63.3    61.0      60     2892
15         0.594     62.3    76.6      52     1753
sum        9.656     921.6   1166.2    987    70756
average    0.64373   61.44   77.7467   65.8   4717.07

Mean-adjusted data (x" = x - average); the x4" and x5" columns are tabulated further below:

Sample     x1"        x2"       x3"
1           0.03527    2.56     20.0533
2           0.02727    6.26     18.4533
3           0.02327    6.36      2.2533
4           0.01927    5.86     -8.6467
5           0.01527    7.16     -3.7467
6           0.01527    8.36    -22.1467
7           0.01427  -13.04      4.6533
8           0.01127  -18.14      6.4533
9           0.00827    2.16     21.7533
10         -0.00873   -6.94     -6.7467
11         -0.01273    8.26    -27.0467
12         -0.01673  -13.14      7.2533
13         -0.03973    1.56      5.3533
14         -0.04173    1.86    -16.7467
15         -0.04973    0.86     -1.1467

Check: each mean-adjusted column sums to approximately zero (3.33E-06 and -3.3E-05 here, rounding residue only), confirming the centering.
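The mean-subtraction step can be sketched in Python with NumPy (a minimal sketch; only the x1 column is shown here, the other columns work the same way):

```python
import numpy as np

# x1 column from the table above (15 samples)
x1 = np.array([0.679, 0.671, 0.667, 0.663, 0.659, 0.659, 0.658,
               0.655, 0.652, 0.635, 0.631, 0.627, 0.604, 0.602, 0.594])

# Step 1: subtract the column mean so the data is centred at zero
x1_adj = x1 - x1.mean()

print(round(float(x1.mean()), 5))   # 0.64373 (the sheet's average)
print(round(float(x1_adj[0]), 5))   # 0.03527 (first mean-adjusted value)
```

The centred column sums to zero up to floating-point rounding, which is the same check the sheet performs.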

Step 2: Calculation of the covariance matrix

Pair (1,2) — products x1"·x2" per sample:
0.0902912, 0.1707102, 0.1479972, 0.1129222, 0.1093332, 0.1276572, -0.1860808, -0.2044378, 0.0178632, 0.0605862, -0.1051498, 0.2198322, -0.0619788, -0.0776178, -0.0427678
sum = 0.37916;  cov(x1, x2) = 0.37916 / 14 = 0.027083
Pair (1,3) — products x1"·x3" per sample:
0.70728, 0.503221, 0.052434, -0.166622, -0.057212, -0.33818, 0.066403, 0.072729, 0.1799, 0.058899, 0.344304, -0.121348, -0.212687, 0.69884, 0.057025
sum = 1.844987;  cov(x1, x3) = 1.844987 / 14 = 0.131785

Pair (1,4) — products x1"·x4" per sample:
0.289214, -0.103626, -0.088426, -0.092496, -0.119106, 0.125214, 0.174094, -0.009016, 0.084354, -0.071586, 0.099294, -0.086996, 0.150974, 0.242034, 0.686274
sum = 1.2802;  cov(x1, x4) = 1.2802 / 14 = 0.091443

Pair (1,5) — products x1"·x5" per sample:
-101.1216, -87.4568, -47.75167, -10.96598, -27.07478, -11.71316, 80.32483, 169.7592, -29.86355, -14.66579, 9.077381, -24.47482, 138.5016, 76.16017, 147.4032
sum = 266.1383;  cov(x1, x5) = 266.1383 / 14 = 19.00988

Pairs (2,3), (2,4), (2,5), (3,4), (4,5) — products per sample:

Sample    x2"·x3"      x2"·x4"     x2"·x5"       x3"·x4"      x4"·x5"
1          51.33645     20.992      -7339.699     164.4371    -23509.97
2         115.5177     -23.788     -20076.26      -70.12254    12186.87
3          14.33099    -24.168     -13051.17       -8.56254     7797.866
4         -50.66966    -28.128      -3334.75       41.50416     2731.536
5         -26.82637    -55.848     -12695.18       29.22426    13829.95
6        -185.1464      68.552      -6412.705    -181.6029     -6289.974
7         -60.67903   -159.088     -73401.25       56.77026    68672.95
8        -117.0629      14.512    -273241.6        -5.16264   -12050.34
9          46.98713     22.032      -7799.911     221.8837    -36832.91
10         46.8221     -56.908     -11658.71      -55.32294    13775.43
11       -223.4057     -64.428      -5889.958     210.9643     5561.946
12        -95.30836    -68.328     -19222.9        37.71716     7607.236
13          8.351148    -5.928      -5438.269     -20.34254    13247.07
14        -31.14886    -10.788      -3394.63       97.13086    10585.41
15         -0.986162   -11.868      -2549.1        15.82446    40904.17
sum      -507.888     -383.18    -465506         534.34      118217.2

Dividing each sum by 14:
cov(x2, x3) = -36.278;  cov(x2, x4) = -27.37;  cov(x2, x5) = -33250.4;
cov(x3, x4) = 38.1671;  cov(x4, x5) = 8444.09

Covariance matrix (symmetric, 5×5):

          x1          x2           x3           x4           x5
x1       0.00072     0.027083     0.131785     0.091443       19.00988
x2       0.027083   74.36686    -36.27771    -27.37       -33250.43
x3       0.131785  -36.27771    214.5427      38.16714     -1002.66
x4       0.091443  -27.37        38.16714     63.88571      8444.086
x5      19.00988  -33250.43   -1002.66      8444.086     23437786
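The whole matrix can be reproduced with NumPy's sample-covariance routine (a sketch; np.cov divides by n − 1 by default, matching the hand calculation above):

```python
import numpy as np

# Rows = variables x1..x5, columns = the 15 samples (data from step 1)
data = np.array([
    [0.679, 0.671, 0.667, 0.663, 0.659, 0.659, 0.658, 0.655, 0.652,
     0.635, 0.631, 0.627, 0.604, 0.602, 0.594],
    [64, 67.7, 67.8, 67.3, 68.6, 69.8, 48.4, 43.3, 63.6, 54.5, 69.7,
     48.3, 63, 63.3, 62.3],
    [97.8, 96.2, 80, 69.1, 74, 55.6, 82.4, 84.2, 99.5, 71, 50.7, 85,
     83.1, 61, 76.6],
    [74, 62, 62, 61, 58, 74, 78, 65, 76, 74, 58, 71, 62, 60, 52],
    [1850, 1510, 2665, 4148, 2944, 3950, 10346, 19780, 1106, 6397,
     4004, 6180, 1231, 2892, 1753],
])

C = np.cov(data)    # 5x5 sample covariance matrix
print(C[0, 1])      # cov(x1, x2), ~0.027083
print(C[4, 4])      # var(x5), ~23437786
```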

Step 4: Calculation of eigenvalues and eigenvectors

(The decomposition here is for the three-variable subset x1, x3, x5: the eigenvector components below match cov(x1,x5)/var(x5) and cov(x3,x5)/var(x5) from the covariance matrix.)

Eigenvalue matrix (diagonal):

 2.34E+07    0           0
 0          -2.74E-05    0
 0           0           215

Eigenvector matrix (columns, in the same order as the eigenvalues):

-8.38E-07    1          -6.18E-04
-4.28E-05   -6.18E-04   -1
 1          -8.38E-07   -4.28E-05
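This step can be checked numerically (a sketch assuming the decomposition is over the 3×3 covariance matrix of x1, x3 and x5; eigenvector signs are arbitrary, so the vector is rescaled to make its x5 component 1):

```python
import numpy as np

# 3x3 covariance matrix of (x1, x3, x5), values from the matrix above
C3 = np.array([
    [0.00072,    0.131785,      19.00988],
    [0.131785,  214.5427,     -1002.66],
    [19.00988, -1002.66,    23437786.0],
])

evals, evecs = np.linalg.eigh(C3)   # eigenvalues in ascending order
v = evecs[:, -1]                    # eigenvector of the largest eigenvalue
v = v / v[2]                        # scale so the x5 component is 1

print(evals[-1])   # dominant eigenvalue, ~2.34E+07
print(v[1])        # x3 component, ~-4.28E-05
```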

Step 5: Choosing components and forming a feature vector

Arranging the eigenvectors according to eigenvalue, largest first:

  first  (λ = 2.34E+07):   (-8.38E-07, -4.28E-05, 1)
  second (λ = 215):        (-6.18E-04, -1, -4.28E-05)
  third  (λ = -2.74E-05):  (1, -6.18E-04, -8.38E-07)

The second and third eigenvalues are negligible compared with the first, so only the significant component is kept, and the feature vector is:

  -8.38E-07
  -4.28E-05
   1

Step 6: Deriving the new data set

New data = feature vector × row-adjusted data:

Sample   new value
1        -0.286707
2        -0.320707
3        -0.205207
4        -0.056907
5        -0.177307
6        -0.076707
7         0.562893
8         1.506293
9        -0.361107
10        0.167993
11       -0.071307
12        0.146293
13       -0.348607
14       -0.182507
15       -0.296407
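The projection can be sketched as follows (using the three mean-adjusted columns x1", x3", x5" that match the feature vector's components; the x5 deviation dominates each score because the feature vector's x5 weight is 1, and the sheet's step-6 values appear to be these scores scaled by 10^-4):

```python
import numpy as np

# Mean-adjusted columns from step 1 (x1", x3", x5")
x1d = np.array([0.03527, 0.02727, 0.02327, 0.01927, 0.01527, 0.01527,
                0.01427, 0.01127, 0.00827, -0.00873, -0.01273, -0.01673,
                -0.03973, -0.04173, -0.04973])
x3d = np.array([20.0533, 18.4533, 2.2533, -8.6467, -3.7467, -22.1467,
                4.6533, 6.4533, 21.7533, -6.7467, -27.0467, 7.2533,
                5.3533, -16.7467, -1.1467])
x5d = np.array([-2867.07, -3207.07, -2052.07, -569.07, -1773.07, -767.07,
                5628.93, 15062.93, -3611.07, 1679.93, -713.07, 1462.93,
                -3486.07, -1825.07, -2964.07])

f = np.array([-8.38e-07, -4.28e-05, 1.0])     # feature vector from step 5
scores = f @ np.vstack([x1d, x3d, x5d])       # one score per sample

print(scores[0])   # ~-2867.07 for sample 1
```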

Step 1 (continued): mean-adjusted x4" and x5" columns:

Sample    x4"       x5"
1          8.2      -2867.07
2         -3.8      -3207.07
3         -3.8      -2052.07
4         -4.8       -569.07
5         -7.8      -1773.07
6          8.2       -767.07
7         12.2       5628.93
8         -0.8      15062.93
9         10.2      -3611.07
10         8.2       1679.93
11        -7.8       -713.07
12         5.2       1462.93
13        -3.8      -3486.07
14        -5.8      -1825.07
15       -13.8      -2964.07

Check: the x5" column sums to -0.00333, i.e. zero up to rounding.

Pair (3,5) — products x3"·x5" per sample:
-57494.21, -59181.02, -4623.929, 4920.578, 6643.161, 16988.07, 26193.1, 97205.61, -78552.69, -11333.98, 19286.19, 10611.07, -18661.98, 30563.9, 3398.899
sum = -14037.25;  cov(x3, x5) = -14037.25 / 14 = -1002.66

Not necessary, just to check (for plotting): each original x1 value is paired with the mean 0.64373, and the mean-adjusted columns sum to approximately zero (5E-05, 1.42109E-14, 4.26326E-14, -0.05), confirming the centering.

Diagonal entries (1,1), (2,2), (3,3) — squared deviations per sample; the (4,4) and (5,5) columns are tabulated further below:

Sample   x1"²         x2"²        x3"²
1        0.001244       6.5536    402.1348
2        0.000744      39.1876    340.5243
3        0.000541      40.4496      5.0774
4        0.000371      34.3396     74.7654
5        0.000233      51.2656     14.0378
6        0.000233      69.8896    490.4763
7        0.000204     170.0416     21.6532
8        0.000127     329.0596     41.6451
9        6.84E-05       4.6656    473.2061
10       7.62E-05      48.1636     45.5180
11       0.000162      68.2276    731.5240
12       0.00028      172.6596     52.6104
13       0.001578       2.4336     28.6578
14       0.001741       3.4596    280.4520
15       0.002473       0.7396      1.3149
sum      0.010077    1041.136    3003.5973

var(x1) = 0.010077 / 14 = 0.00072;  var(x2) = 1041.136 / 14 = 74.3669;  var(x3) = 3003.5973 / 14 = 214.5427


(The eigen step above used the three-variable data set x1, x3, x5.)

Checking with the example — a standard two-variable data set, already mean-adjusted:

Sample    x'       y'
1          0.69     0.49
2         -1.31    -1.21
3          0.39     0.99
4          0.09     0.29
5          1.29     1.09
6          0.49     0.79
7          0.19    -0.31
8         -0.81    -0.81
9         -0.31    -0.31
10        -0.71    -1.01

Diagonal entries (4,4), (5,5) — squared deviations per sample:

Sample   x4"²       x5"²
1         67.24      8220090.4
2         14.44     10285298.0
3         14.44      4210991.3
4         23.04       323840.7
5         60.84      3143777.2
6         67.24       588396.4
7        148.84     31684852.9
8          0.64    226891860.2
9        104.04     13039826.5
10        67.24      2822164.8
11        60.84       508468.8
12        27.04      2140164.2
13        14.44     12152684.0
14        33.64      3330880.5
15       190.44      8785711.0
sum      894.4     328129006.9

var(x4) = 894.4 / 14 = 63.885714;  var(x5) = 328129006.9 / 14 = 23437786.2



Squared deviations x'² for the two-variable example:
0.4761, 1.7161, 0.1521, 0.0081, 1.6641, 0.2401, 0.0361, 0.6561, 0.0961, 0.5041
sum = 5.549;  var(x) = 5.549 / 9 = 0.616556


Squared deviations y'² and products x'·y' for the two-variable example:

y'²:   0.2401, 1.4641, 0.9801, 0.0841, 1.1881, 0.6241, 0.0961, 0.6561, 0.0961, 1.0201   (sum = 6.449)
x'·y': 0.3381, 1.5851, 0.3861, 0.0261, 1.4061, 0.3871, -0.0589, 0.6561, 0.0961, 0.7171  (sum = 5.539)

var(y) = 6.449 / 9 = 0.716556 (not 6.449/10 = 0.6449);  cov(x, y) = 5.539 / 9 = 0.615444
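The two-variable check can be reproduced in a few lines (a sketch; note that sample variance and covariance divide by n − 1 = 9, so var(y) is 6.449/9 ≈ 0.7166 rather than 0.6449):

```python
import numpy as np

# Mean-adjusted x' and y' of the two-variable example
x = np.array([0.69, -1.31, 0.39, 0.09, 1.29, 0.49, 0.19, -0.81, -0.31, -0.71])
y = np.array([0.49, -1.21, 0.99, 0.29, 1.09, 0.79, -0.31, -0.81, -0.31, -1.01])

n = len(x)
var_x  = np.dot(x, x) / (n - 1)   # 5.549 / 9 ~ 0.616556
var_y  = np.dot(y, y) / (n - 1)   # 6.449 / 9 ~ 0.716556
cov_xy = np.dot(x, y) / (n - 1)   # 5.539 / 9 ~ 0.615444

print(round(float(var_x), 6), round(float(var_y), 6), round(float(cov_xy), 6))
```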


[Plot: original x1 data and its deviation from the mean versus sample index]