Pinv For Modern ML

Figure: Understanding Modern ML with SVD and Pseudo Inverse

Linear/Non-linear Regression and Pseudo Inverse

Figure: Fit y = mx + c for the data

Model

The equation of the line is y = mx + c. Writing one equation per data point gives A*w = y with

A = [1 1; 2 1; 2 1; 3 1], w = [m; c], y = [1; 1; 2; 2].

Let A = U*S*V' be the SVD of A. We have U (4x4), S (4x2), and V (2x2); their numerical values appear in the [U,S,V] = svd(A) output below.

Rank of A is 2.

The first two columns of U span the column space of A. Let this basis be U_C. The last two columns of U span the left null space of A. Let this basis be U_LNS.

The vector y is a 4-tuple. It can be expressed as a linear combination of the column vectors in U, because the columns of U span the entire space:

y = U_C*(U_C'*y) + U_LNS*(U_LNS'*y) ----------(1)

Note that U_C'*U_C = I, U_LNS'*U_LNS = I, and U_C'*U_LNS = 0.

How will you visualize the following: U_C*(U_C'*y)?

It is a linear combination of the column-space basis vectors, so it is in the column space of A.

Figure 1: Dot product followed by linear combination. (See also Figure 3 at the bottom.)

Figure 2: Subspace projection. In our problem the left null space is also of dimension 2.

How will you visualize the following: U_LNS*(U_LNS'*y)?

It is about projecting y onto the left null space of A.

We have U_LNS = U(:,[3 4]); it is the last two columns of U: the basis set for the left null space.

U_LNS*(U_LNS'*y) is a linear combination of u3 and u4.

On computation (see the code below), we get

y_LNS = (0, -0.5, 0.5, 0)'

It is the left-null-space component of y.

This vector is what we call the error vector e.

What is the column-space component of y?

y_CS = y - y_LNS = (1, 1.5, 1.5, 2)'. This is what we call the fitted y.

This vector is in the column space.

------------------------------------------

y_CS could have been obtained directly as follows.

Let U_C = U(:,[1 2]); it is the first two columns of U: the basis set for the column space.

Since y_CS is in the column space of A, the system A*w = y_CS ------(2) has a solution.

This solution is unique if the columns of A are independent. In our case it is true.

Pre-multiply both sides of (2) by A': A'*A*w = A'*y_CS, so w = inv(A'*A)*A'*y_CS = (0.5, 0.5)'.

The equation of the fitted line is y = 0.5*x + 0.5.

--------------------------------------

Now let us solve using the pseudo inverse:

w = pinv(A)*y = (0.5, 0.5)'. We get the same result, because A'*y = A'*(y_CS + e) = A'*y_CS (the error e lies in the left null space, so A'*e = 0).

------------------------------------------------

What exactly is the pseudo inverse?

In our example, let A = U*S*V' (the SVD above). Therefore

pinv(A) = V*Splus*U'

where Splus is obtained by transposing S and inverting its nonzero singular values. It is the same as inv(A'*A)*A' when the columns of A are independent. V*Splus*U' is the pseudo inverse of A.
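As a quick numerical check, here is a minimal sketch (not part of the original listing; it reuses the A defined above) that builds the pseudo inverse from the SVD factors and compares it with pinv(A) and with inv(A'*A)*A':

% Minimal sketch: pseudo inverse from the SVD (uses A as defined above)
A = [1 1; 2 1; 2 1; 3 1];
[U,S,V] = svd(A);
Splus = S'; % transpose S ...
Splus(Splus~=0) = 1./Splus(Splus~=0); % ... and invert the nonzero singular values
Apinv = V*Splus*U'; % pinv(A) = V*Splus*U'
norm(Apinv - pinv(A)) % should be ~0
norm(Apinv - inv(A'*A)*A') % same as inv(A'*A)*A', since the columns of A are independent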

A=[1 1; 2 1; 2 1; 3 1];
y=[1 1 2 2]';
% y is not in the column space of A
% y has two parts: y_CS and y_LNS
% Let us cut off the y_LNS portion
% Take the SVD of A. Rank(A) = 2.
% The last two columns of U form the left-null-space basis
% Let us retrieve it and project onto it
[U,S,V]=svd(A)

U = 4×4
-0.2846 0.8179 -0.3000 0.4000

-0.4804 0.1385 -0.2657 -0.8243
-0.4804 0.1385 0.8657 0.0243
-0.6762 -0.5410 -0.3000 0.4000
S = 4×2
4.6508 0
0 0.6082
0 0
0 0
V = 2×2
-0.9106 -0.4132
-0.4132 0.9106

U_LNB=U(:,[3 4]);
y_PLN=U_LNB*U_LNB'*y % project on to Left null space

y_PLN = 4×1
-0.0000
-0.5000
0.5000
-0.0000

y_CS=y-y_PLN

y_CS = 4×1
1.0000
1.5000
1.5000
2.0000

% A*w = y_CS; A'*A*w = A'*y_CS
w1=inv(A'*A)*A'*y_CS

w1 = 2×1
0.5000
0.5000

% Let us solve using Pseudo inverse


w2=pinv(A)*y

w2 = 2×1
0.5000
0.5000

Some Notes on Basics of Projection

Take the orthonormal basis vectors b1 = (1/sqrt(2))*(1, 1)' and b2 = (1/sqrt(2))*(-1, 1)', and express y = (1, 3)' as a linear combination of those unit vectors.

Figure 3. Project and linearly combine (applicable when bases are orthonormal)

Let us put these vectors as the column vectors of a matrix A = [b1 b2].

This matrix A is such that A'*A = I (its columns are orthonormal).

Figure 4: Expressing (1,3) as linear combination of two orthogonal basis vectors

y=[1;3];
a=1/sqrt(2);
A=[a -a; a a];
c=A'*y

c = 2×1
2.8284
1.4142

p1=A(:,1)*c(1) % Projected vector in the direction of b1

p1 = 2×1
2.0000
2.0000

p2=A(:,2)*c(2) % Projected vector in the direction of b2

p2 = 2×1
-1.0000
1.0000

yc=p1+p2 % sum of projected vectors

yc = 2×1
1.0000
3.0000

Vertical Projection on to a plane spanned by two orthogonal vectors

Let the plane be spanned by the orthogonal (and unit-norm) column vectors of the following matrix A:

A = 0.5*[1 1; 1 -1; 1 -1; 1 1]

Let us assume we want to project y = (1, 2, 3, 8)' onto the plane.

To get the coefficients (the projected length of the vector along each orthogonal axis), simply dot-product the vector with the two columns. We will get two coefficients, c = A'*y.

The projected vector is: yp = A*c = A*A'*y.

Computational verification

A=(0.5)*[1 1 1 1; 1 -1 -1 1]';
y=[ 1 2 3 8]';
yp=A*A'*y

yp = 4×1

4.5000
2.5000
2.5000
4.5000

% remaining part of vector y


ye=y-yp

ye = 4×1
-3.5000
-0.5000
0.5000
3.5000

% this vector is in left null space of A . Verify


ye'*A

ans = 1×2
0 0

% Questions that check your imaginative skills.

1. Create a 4x2 matrix A with rank 2. Using this matrix, create a 4x4 matrix B with rank 2 by appending A with 2 more dependent columns. Write the Matlab code in 2 steps. All elements should be integers. Find the rank of B to verify.

% A=randi([-5 5],4,2);
% B=[A A*randi([-3 3],2,2)] % append A with 2 columns

B = 4×4
-3 -1 4 -3
2 -5 -14 -15
2 -3 -10 -9
3 5 4 15

% rank(B)

ans = 2

2. Create a 4x2 matrix A with rank 2. Using this matrix, create a 4x4 matrix B with rank 4 by appending two columns which are orthogonal to the first two columns. Write Matlab code with 2 steps. Verify the rank; it must be 4.

% A=randi([-5 5],4,2);
% B=[A null(A')] % append A with 2 columns
% rank(B)

3. Create a 4x4 matrix A with rank 2. Create a 4-tuple vector y and project it onto all the vector spaces associated with A using the SVD.

n=4;
y=[1 2 3 4]'

y = 4×1

1
2
3
4

A=randi([-5 5], n,2)*randi([-5 5], 2,n);


[U, S, V]=svd(A);
r=rank(A); % must be 2
Bc=U(:,1:r); % first 2 columns of U
BL=U(:,r+1:n); % last 2 columns of U
Br=V(:,1:r); % first 2 columns of V
Brn=V(:,r+1:n); % last 2 columns of V

Pc=Bc*(Bc'*y); % Projected Vector on to column space


Pl=BL*(BL'*y); % Projected Vector on to Left null space
Pr=Br*(Br'*y); % Projected Vector on to row space
Prn=Brn*(Brn'*y); % Projected Vector on to right null space
% Verification
Pc+Pl % when we add, it must be y

ans = 4×1
1.0000
2.0000
3.0000
4.0000

Pr+Prn % when we add, it must be y

ans = 4×1
1.0000
2.0000
3.0000
4.0000

Non Linear Regression

Given (x, y) points we need to find a polynomial which 'closely' approximates them.

It should catch the trend but should not over-learn it. The purpose of fitting is to find y(x) for a given x in the possible range of x (interpolation).

clf;
x=[0 0.5236 1.0472 1.5708 2.0944 2.6180 3.1416 3.6652 4.1888 4.7124]';
y=[3.8592 2.4988 1.8678 1.4091 1.6355 1.3251 1.9290 1.4403 1.9978 2.3071]';
% Generate data
% rng(12345)
% x=linspace(0,2*pi-pi/2, 10);
% y=2+cos(x)+-1+2*rand(1,length(x));
plot(x,y,'*')
x=x(:); % Make it a column vector if it is not
y=y(:); % Make it a column vector if it is not
% Fit a cubic Polynomial
X=[x.^3 x.^2 x.^1 x.^0]

X = 10×4
0 0 0 1.0000
0.1435 0.2742 0.5236 1.0000
1.1484 1.0966 1.0472 1.0000
3.8758 2.4674 1.5708 1.0000
9.1871 4.3865 2.0944 1.0000
17.9436 6.8539 2.6180 1.0000
31.0065 9.8697 3.1416 1.0000
49.2372 13.4337 3.6652 1.0000
73.4969 17.5460 4.1888 1.0000
104.6469 22.2067 4.7124 1.0000

alpha=pinv(X)*y;
ydash=X*alpha;
hold on
plot(x,ydash)

Regression that overfits the data
Let us use a high-degree polynomial for fitting the data.

Here we use a 10th-degree polynomial to fit the data given in the previous regression problem.

clf;
x=[0 0.5236 1.0472 1.5708 2.0944 2.6180 3.1416 3.6652 4.1888 4.7124]';
y=[3.8592 2.4988 1.8678 1.4091 1.6355 1.3251 1.9290 1.4403 1.9978 2.3071]';
plot(x,y,'*')
hold on
x=x(:); % Make it a column vector if it is not
y=y(:); % Make it a column vector if it is not
% Fit a 10th-degree polynomial
X=[x.^10 x.^9 x.^8 x.^7 x.^6 x.^5 x.^4 x.^3 x.^2 x.^1 x.^0];
alpha=pinv(X)*y;
% Actual x is in the range 0 to 4.75
% Let us take 100 points in that range and find function values
xmax=max(x);
x=linspace(0,xmax,100);
% finding function values for x = 0 to xmax
fx=zeros(1,100); %
k=0;
for i=10:-1:0
k=k+1;
fx=fx+alpha(k)*x.^i;
end
plot(x,fx)

% Note that the curve passes through all the points.
% The error sum of squares is 0, but it is not useful for the prediction task,
% especially for larger values of x (near x = 4 to 5).

Non-Linear Regression for Classification

Kernel Regression for XOR data

% Kernel Regression for XOR data


clc; clear all
A=[0 0; 0 1;1 0; 1 1];
% 0 0 belongs to class 0
% 0 1 class 1
% 1 0 class 1
% 1 1 class 0
Y=[0; 1; 1; 0];
K=(A*A'+1).^3; % polynomial Kernel for data
% K*alpha ~=Y
alpha=pinv(K)*Y;
% alpha values
% 0.4524
% 0.3095
% 0.3095
% -0.1667
disp("Predicted label for all data")

Predicted label for all data

label=round((A*A'+1).^3*alpha) % same as K*alpha

label = 4×1
0
1
1
0

disp("actual label for all data")

actual label for all data

Y = 4×1
0
1
1

0

% How do I predict for a specific data point?


x1=[0 0]; % It belongs to Class 0
xpred_label=round((x1*A'+1).^3*alpha)

xpred_label = 0

x2=[0 1]; % It belongs to Class 1


xpred_label=round((x2*A'+1).^3*alpha)

xpred_label = 1

Random Kitchen Sink Algorithm


Method

1. Map the x data points to a higher dimension using a random matrix with entries from N(0,1). Map the class values to a one-hot representation; let it be Y. The label of each data point is assumed to be in a column, so for n data points there are n columns.
2. Store the random matrix used for mapping the data points, for use at inference time.
3. Take the cos and sin of the mapped data and append them to create phi(x); let it be X. The data are assumed to be in columns: for n data points, n columns.
4. The model is WX = Y, where W are the regression weights.
5. Apply regression: W = Y*pinv(X). (A minimal sketch of these steps on the XOR data follows this list.)
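As a quick illustration, here is a minimal sketch of steps 1-5 on the XOR data used earlier (the mapped dimension D = 20 and the variable names Xdata, Yoh, Phi, W are illustrative choices, not part of the original example):

% Random Kitchen Sink sketch on the XOR data (data arranged column-wise)
Xdata = [0 0 1 1; 0 1 0 1]; % 2 x 4, one data point per column
Yoh = [1 0 0 1; 0 1 1 0]; % one-hot labels: class 0 -> [1;0], class 1 -> [0;1]
D = 20; % mapped dimension (illustrative)
R = sqrt(D/2)*randn(D,2); % steps 1-2: random map; store R for inference
Phi = [cos(R*Xdata); sin(R*Xdata)]; % step 3: phi(x), size 2D x 4
W = Yoh*pinv(Phi); % steps 4-5: regression weights solving W*Phi = Yoh
round(W*Phi) % check: reproduces the one-hot labels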

clc;
clear all;
close all;
% rng(12345);
%creating training data and testing
N = 10000;
nturns = 4;
angle = linspace(-pi*nturns,pi*nturns,N); % Row vector
radius = 0:(1/(length(angle)-1)):1;% Row vector

x1 = sin(angle).*radius; % Row vector


y1 = cos(angle).*radius;% Row vector
scatter(x1,y1); hold on
data1 = [x1;y1];

x2 = -x1; y2 = -y1;
scatter(x2,y2,'p'); hold off

data2=[x2;y2];

%creating training data and testing data


num = 1000; % number of data points used for training from each class (give this as an even number)
train_loc=randsample(N,num);
test_loc = setdiff((1:N),train_loc);

train = [data1(:,train_loc) data2(:,train_loc)]; % data is arranged column-wise
Y_train = [repmat([1;0],1,num) repmat([0;1],1,num)]; % class labels for train data

test = [data1(:,test_loc) data2(:,test_loc)];


Y_test = [repmat([1;0],1,N-num) repmat([0;1],1,N-num)]; % class labels for test data

% Random kitchen sink


dim = 100;
randn('seed',1256);
R = sqrt(dim/2)*randn(dim,2);
RKS = R*train;
RKS = [cos(RKS); sin(RKS)];

%gurls
%lamda = 0;
%W = Y_train*RKS'*inv(RKS*RKS'+lamda*eye(2*dim));
W = Y_train*RKS'*pinv(RKS*RKS');

%checking with the test data


RKS_test = R*test;
RKS_test = [cos(RKS_test); sin(RKS_test)];

Y_new = round(abs(W*RKS_test));
result = sum(sum(Y_new==Y_test)==2);
display('percentage accuracy')

percentage accuracy

display(result/(N-num)*100/2)

95.1778

Gaussian Process Regression

Weights B are drawn from N(0,1); weights A are to be obtained through regression.

Example

% Gaussian Process Regression


clc; clear all
X=[0 0; 0 1;1 0; 1 1];
% 0 0 belongs to class 0
% 0 1 class 1
% 1 0 class 1
% 1 1 class 0
Y=[0; 1; 1; 0];
% Let us map to 10-d space and apply cos as an elementwise operation
% in the middle layer
B=randn(2,10);
k=cos(X*B); % map the data and apply cos
K=k*k'; % Create Kernel
%Kw ~=y % Final model
w=pinv(K)*Y; % neural weight in last layer
class_lable_predicted =round(K*w) % check whether learned or not

class_lable_predicted = 4×1
0

1
1
0
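To predict for a new data vector with this model, map it through the same B and form its kernel row against the stored training features. A minimal sketch (the names xnew, k_new, ypred are illustrative; B, k, and w are assumed to still be in the workspace from the listing above):

% Predict the class of a new data vector with the cos-feature kernel
xnew = [1 0]; % new input (here one of the training points), row vector
k_new = cos(xnew*B); % map it with the stored random weights B
ypred = round(k_new*k'*w) % kernel row times learned weights; should give 1, matching the training label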

Neural Tangent Kernel (NTK)

The NTK algorithm is based on linearizing the network output around its initial weights w0:

f(x; w) ≈ f(x; w0) + grad_w f(x; w0)' * (w - w0)

so the weight update w - w0 is obtained by a linear regression of the residual y - f(x; w0) on the gradient features grad_w f(x; w0). For the two-layer network used below, f(x; w) = cos(x*B + pi/4)*A; the gradient with respect to A is cos(x*B + pi/4), and the gradient with respect to row j of B is -sin(x*B + pi/4).*A' scaled by the input component x(j). These are exactly the entries collected in each row of k1 in the code.

Computation for XOR data

% Neural Tangent Kernel
clc; clear all
X=[0 0; 0 1;1 0; 1 1];
Y=[0; 1; 1; 0];
n=30; % neurons in middle layer: width of network
B=randn(2,n);
A=randn(n,1);
IP_m=X*B; % compute the input to the middle layer of the NN
k=cos(IP_m+pi/4); % data output at the middle neurons
% map the data and apply cos.
% The mapped data is a row vector
output0=k*A;
k1=[];
for i=1:4
k0=[k(i,:) -sin(IP_m(i,:)+pi/4).*(A')*X(i,1) -sin(IP_m(i,:)+pi/4).*A'*X(i,2)];
k1=[k1;k0];
end
w1=pinv(k1)*(Y-output0);
w0=[A' B(1,:) B(2,:)]';
w=w1+w0;
A1=w(1:n);
B1=[w(n+1:2*n)'; w(2*n+1:3*n)'];
IP_m=X*B1 ;
k=cos(IP_m+pi/4);
youtput=round(k*A1)

youtput = 4×1
0
1
1
0

% Part of Breast cancer data set


clear all;clc
A=[ 1 17.99 10.38 122.8 1001 0.1184 0.2776 0.3001
0.1471 0.2419 0.07871 1.095 0.9053 8.589 153.4 0.006399

0.04904 0.05373 0.01587 0.03003 0.006193 25.38 17.33
184.6 2019 0.1622 0.6656 0.7119 0.2654 0.4601 0.1189 ;
1 20.57 17.77 132.9 1326 0.08474 0.07864 0.0869
0.07017 0.1812 0.05667 0.5435 0.7339 3.398 74.08
0.005225 0.01308 0.0186 0.0134 0.01389 0.003532 24.99
23.41 158.8 1956 0.1238 0.1866 0.2416 0.186 0.275
0.08902 ;
1 19.69 21.25 130 1203 0.1096 0.1599 0.1974 0.1279
0.2069 0.05999 0.7456 0.7869 4.585 94.03 0.00615 0.04006
0.03832 0.02058 0.0225 0.004571 23.57 25.53 152.5 1709
0.1444 0.4245 0.4504 0.243 0.3613 0.08758 ;
1 11.42 20.38 77.58 386.1 0.1425 0.2839 0.2414 0.1052
0.2597 0.09744 0.4956 1.156 3.445 27.23 0.00911 0.07458
0.05661 0.01867 0.05963 0.009208 14.91 26.5 98.87 567.7
0.2098 0.8663 0.6869 0.2575 0.6638 0.173 ;
1 20.29 14.34 135.1 1297 0.1003 0.1328 0.198 0.1043
0.1809 0.05883 0.7572 0.7813 5.438 94.44 0.01149 0.02461
0.05688 0.01885 0.01756 0.005115 22.54 16.67 152.2 1575
0.1374 0.205 0.4 0.1625 0.2364 0.07678 ;
1 12.45 15.7 82.57 477.1 0.1278 0.17 0.1578 0.08089
0.2087 0.07613 0.3345 0.8902 2.217 27.19 0.00751 0.03345
0.03672 0.01137 0.02165 0.005082 15.47 23.75 103.4 741.6
0.1791 0.5249 0.5355 0.1741 0.3985 0.1244 ;
1 18.25 19.98 119.6 1040 0.09463 0.109 0.1127 0.074
0.1794 0.05742 0.4467 0.7732 3.18 53.91 0.004314 0.01382
0.02254 0.01039 0.01369 0.002179 22.88 27.66 153.2 1606
0.1442 0.2576 0.3784 0.1932 0.3063 0.08368 ;
1 13.71 20.83 90.2 577.9 0.1189 0.1645 0.09366
0.05985 0.2196 0.07451 0.5835 1.377 3.856 50.96 0.008805
0.03029 0.02488 0.01448 0.01486 0.005412 17.06 28.14
110.6 897 0.1654 0.3682 0.2678 0.1556 0.3196 0.1151 ;
1 13 21.82 87.5 519.8 0.1273 0.1932 0.1859 0.09353
0.235 0.07389 0.3063 1.002 2.406 24.32 0.005731 0.03502
0.03553 0.01226 0.02143 0.003749 15.49 30.73 106.2 739.3
0.1703 0.5401 0.539 0.206 0.4378 0.1072 ;
1 12.46 24.04 83.97 475.9 0.1186 0.2396 0.2273
0.08543 0.203 0.08243 0.2976 1.599 2.039 23.94 0.007149
0.07217 0.07743 0.01432 0.01789 0.01008 15.09 40.68
97.65 711.4 0.1853 1.058 1.105 0.221 0.4366 0.2075 ;
1 16.02 23.24 102.7 797.8 0.08206 0.06669 0.03299
0.03323 0.1528 0.05697 0.3795 1.187 2.466 40.51 0.004029
0.009269 0.01101 0.007591 0.0146 0.003042 19.19 33.88
123.8 1150 0.1181 0.1551 0.1459 0.09975 0.2948 0.08452 ;
1 15.78 17.89 103.6 781 0.0971 0.1292 0.09954 0.06606
0.1842 0.06082 0.5058 0.9849 3.564 54.16 0.005771
0.04061 0.02791 0.01282 0.02008 0.004144 20.42 27.28
136.5 1299 0.1396 0.5609 0.3965 0.181 0.3792 0.1048 ;
1 19.17 24.8 132.4 1123 0.0974 0.2458 0.2065 0.1118
0.2397 0.078 0.9555 3.568 11.07 116.2 0.003139 0.08297
0.0889 0.0409 0.04484 0.01284 20.96 29.94 151.7 1332
0.1037 0.3903 0.3639 0.1767 0.3176 0.1023 ;
1 15.85 23.95 103.7 782.7 0.08401 0.1002 0.09938
0.05364 0.1847 0.05338 0.4033 1.078 2.903 36.58 0.009769

0.03126 0.05051 0.01992 0.02981 0.003002 16.84 27.66 112
876.5 0.1131 0.1924 0.2322 0.1119 0.2809 0.06287 ;
1 13.73 22.61 93.6 578.3 0.1131 0.2293 0.2128 0.08025
0.2069 0.07682 0.2121 1.169 2.061 19.21 0.006429 0.05936
0.05501 0.01628 0.01961 0.008093 15.03 32.01 108.8 697.7
0.1651 0.7725 0.6943 0.2208 0.3596 0.1431 ;
1 14.54 27.54 96.73 658.8 0.1139 0.1595 0.1639
0.07364 0.2303 0.07077 0.37 1.033 2.879 32.55 0.005607
0.0424 0.04741 0.0109 0.01857 0.005466 17.46 37.13 124.1
943.2 0.1678 0.6577 0.7026 0.1712 0.4218 0.1341 ;
1 14.68 20.13 94.74 684.5 0.09867 0.072 0.07395
0.05259 0.1586 0.05922 0.4727 1.24 3.195 45.4 0.005718
0.01162 0.01998 0.01109 0.0141 0.002085 19.07 30.88
123.4 1138 0.1464 0.1871 0.2914 0.1609 0.3029 0.08216 ;
1 16.13 20.68 108.1 798.8 0.117 0.2022 0.1722 0.1028
0.2164 0.07356 0.5692 1.073 3.854 54.18 0.007026 0.02501
0.03188 0.01297 0.01689 0.004142 20.96 31.48 136.8 1315
0.1789 0.4233 0.4784 0.2073 0.3706 0.1142 ;
1 19.81 22.15 130 1260 0.09831 0.1027 0.1479 0.09498
0.1582 0.05395 0.7582 1.017 5.865 112.4 0.006494 0.01893
0.03391 0.01521 0.01356 0.001997 27.32 30.88 186.8 2398
0.1512 0.315 0.5372 0.2388 0.2768 0.07615 ;
0 13.54 14.36 87.46 566.3 0.09779 0.08129 0.06664
0.04781 0.1885 0.05766 0.2699 0.7886 2.058 23.56
0.008462 0.0146 0.02387 0.01315 0.0198 0.0023 15.11
19.26 99.7 711.2 0.144 0.1773 0.239 0.1288 0.2977
0.07259 ;
0 13.08 15.71 85.63 520 0.1075 0.127 0.04568 0.0311
0.1967 0.06811 0.1852 0.7477 1.383 14.67 0.004097
0.01898 0.01698 0.00649 0.01678 0.002425 14.5 20.49
96.09 630.5 0.1312 0.2776 0.189 0.07283 0.3184 0.08183 ;
0 9.504 12.44 60.34 273.9 0.1024 0.06492 0.02956
0.02076 0.1815 0.06905 0.2773 0.9768 1.909 15.7 0.009606
0.01432 0.01985 0.01421 0.02027 0.002968 10.23 15.66
65.13 314.9 0.1324 0.1148 0.08867 0.06227 0.245 0.07773 ;
1 15.34 14.26 102.5 704.4 0.1073 0.2135 0.2077
0.09756 0.2521 0.07032 0.4388 0.7096 3.384 44.91
0.006789 0.05328 0.06446 0.02252 0.03672 0.004394 18.07
19.08 125.1 980.9 0.139 0.5954 0.6305 0.2393 0.4667
0.09946 ;
1 21.16 23.04 137.2 1404 0.09428 0.1022 0.1097
0.08632 0.1769 0.05278 0.6917 1.127 4.303 93.99 0.004728
0.01259 0.01715 0.01038 0.01083 0.001987 29.17 35.59 188
2615 0.1401 0.26 0.3155 0.2009 0.2822 0.07526 ;
1 16.65 21.38 110 904.6 0.1121 0.1457 0.1525 0.0917
0.1995 0.0633 0.8068 0.9017 5.455 102.6 0.006048 0.01882
0.02741 0.0113 0.01468 0.002801 26.46 31.56 177 2215
0.1805 0.3578 0.4695 0.2095 0.3613 0.09564 ;
1 17.14 16.4 116 912.7 0.1186 0.2276 0.2229 0.1401
0.304 0.07413 1.046 0.976 7.276 111.4 0.008029 0.03799
0.03732 0.02397 0.02308 0.007444 22.25 21.4 152.4 1461
0.1545 0.3949 0.3853 0.255 0.4066 0.1059 ;
1 14.58 21.53 97.41 644.8 0.1054 0.1868 0.1425
0.08783 0.2252 0.06924 0.2545 0.9832 2.11 21.05 0.004452

0.03055 0.02681 0.01352 0.01454 0.003711 17.62 33.21
122.4 896.9 0.1525 0.6643 0.5539 0.2701 0.4264 0.1275 ;
1 18.61 20.25 122.1 1094 0.0944 0.1066 0.149 0.07731
0.1697 0.05699 0.8529 1.849 5.632 93.54 0.01075 0.02722
0.05081 0.01911 0.02293 0.004217 21.31 27.26 139.9 1403
0.1338 0.2117 0.3446 0.149 0.2341 0.07421 ;
1 15.3 25.27 102.4 732.4 0.1082 0.1697 0.1683 0.08751
0.1926 0.0654 0.439 1.012 3.498 43.5 0.005233 0.03057
0.03576 0.01083 0.01768 0.002967 20.27 36.71 149.3 1269
0.1641 0.611 0.6335 0.2024 0.4027 0.09876 ;
1 17.57 15.05 115 955.1 0.09847 0.1157 0.09875
0.07953 0.1739 0.06149 0.6003 0.8225 4.655 61.1 0.005627
0.03033 0.03407 0.01354 0.01925 0.003742 20.01 19.52
134.9 1227 0.1255 0.2812 0.2489 0.1456 0.2756 0.07919 ;
1 18.63 25.11 124.8 1088 0.1064 0.1887 0.2319 0.1244
0.2183 0.06197 0.8307 1.466 5.574 105 0.006248 0.03374
0.05196 0.01158 0.02007 0.00456 23.15 34.01 160.5 1670
0.1491 0.4257 0.6133 0.1848 0.3444 0.09782 ;
1 11.84 18.7 77.93 440.6 0.1109 0.1516 0.1218 0.05182
0.2301 0.07799 0.4825 1.03 3.475 41 0.005551 0.03414
0.04205 0.01044 0.02273 0.005667 16.82 28.12 119.4 888.7
0.1637 0.5775 0.6956 0.1546 0.4761 0.1402 ;
1 17.02 23.98 112.8 899.3 0.1197 0.1496 0.2417 0.1203
0.2248 0.06382 0.6009 1.398 3.999 67.78 0.008268 0.03082
0.05042 0.01112 0.02102 0.003854 20.88 32.09 136.1 1344
0.1634 0.3559 0.5588 0.1847 0.353 0.08482 ;
1 19.27 26.47 127.9 1162 0.09401 0.1719 0.1657
0.07593 0.1853 0.06261 0.5558 0.6062 3.528 68.17
0.005015 0.03318 0.03497 0.009643 0.01543 0.003896 24.15
30.9 161.4 1813 0.1509 0.659 0.6091 0.1785 0.3672
0.1123 ;
1 16.13 17.88 107 807.2 0.104 0.1559 0.1354 0.07752
0.1998 0.06515 0.334 0.6857 2.183 35.03 0.004185 0.02868
0.02664 0.009067 0.01703 0.003817 20.21 27.26 132.7 1261
0.1446 0.5804 0.5274 0.1864 0.427 0.1233 ;
1 16.74 21.59 110.1 869.5 0.0961 0.1336 0.1348
0.06018 0.1896 0.05656 0.4615 0.9197 3.008 45.19
0.005776 0.02499 0.03695 0.01195 0.02789 0.002665 20.01
29.02 133.5 1229 0.1563 0.3835 0.5409 0.1813 0.4863
0.08633 ;
1 14.25 21.72 93.63 633 0.09823 0.1098 0.1319 0.05598
0.1885 0.06125 0.286 1.019 2.657 24.91 0.005878 0.02995
0.04815 0.01161 0.02028 0.004022 15.89 30.36 116.2 799.6
0.1446 0.4238 0.5186 0.1447 0.3591 0.1014 ;
0 13.03 18.42 82.61 523.8 0.08983 0.03766 0.02562
0.02923 0.1467 0.05863 0.1839 2.342 1.17 14.16 0.004352
0.004899 0.01343 0.01164 0.02671 0.001777 13.3 22.81
84.46 545.9 0.09701 0.04619 0.04833 0.05013 0.1987
0.06169 ;
1 14.99 25.2 95.54 698.8 0.09387 0.05131 0.02398
0.02899 0.1565 0.05504 1.214 2.188 8.077 106 0.006883
0.01094 0.01818 0.01917 0.007882 0.001754 14.99 25.2
95.54 698.8 0.09387 0.05131 0.02398 0.02899 0.1565
0.05504 ;

1 13.48 20.82 88.4 559.2 0.1016 0.1255 0.1063 0.05439
0.172 0.06419 0.213 0.5914 1.545 18.52 0.005367 0.02239
0.03049 0.01262 0.01377 0.003187 15.53 26.02 107.3 740.4
0.161 0.4225 0.503 0.2258 0.2807 0.1071 ;
1 13.44 21.58 86.18 563 0.08162 0.06031 0.0311
0.02031 0.1784 0.05587 0.2385 0.8265 1.572 20.53 0.00328
0.01102 0.0139 0.006881 0.0138 0.001286 15.93 30.25
102.5 787.9 0.1094 0.2043 0.2085 0.1112 0.2994 0.07146 ;
1 10.95 21.35 71.9 371.1 0.1227 0.1218 0.1044 0.05669
0.1895 0.0687 0.2366 1.428 1.822 16.97 0.008064 0.01764
0.02595 0.01037 0.01357 0.00304 12.84 35.34 87.22 514
0.1909 0.2698 0.4023 0.1424 0.2964 0.09606 ;
1 19.07 24.81 128.3 1104 0.09081 0.219 0.2107 0.09961
0.231 0.06343 0.9811 1.666 8.83 104.9 0.006548 0.1006
0.09723 0.02638 0.05333 0.007646 24.09 33.17 177.4 1651
0.1247 0.7444 0.7242 0.2493 0.467 0.1038 ;
1 13.28 20.28 87.32 545.2 0.1041 0.1436 0.09847
0.06158 0.1974 0.06782 0.3704 0.8249 2.427 31.33
0.005072 0.02147 0.02185 0.00956 0.01719 0.003317 17.38
28 113.1 907.2 0.153 0.3724 0.3664 0.1492 0.3739
0.1027 ;
1 13.17 21.81 85.42 531.5 0.09714 0.1047 0.08259
0.05252 0.1746 0.06177 0.1938 0.6123 1.334 14.49 0.00335
0.01384 0.01452 0.006853 0.01113 0.00172 16.23 29.89
105.5 740.7 0.1503 0.3904 0.3728 0.1607 0.3693 0.09618 ;
1 18.65 17.6 123.7 1076 0.1099 0.1686 0.1974 0.1009
0.1907 0.06049 0.6289 0.6633 4.293 71.56 0.006294
0.03994 0.05554 0.01695 0.02428 0.003535 22.82 21.32
150.6 1567 0.1679 0.509 0.7345 0.2378 0.3799 0.09185 ;
0 8.196 16.84 51.71 201.9 0.086 0.05943 0.01588
0.005917 0.1769 0.06503 0.1563 0.9567 1.094 8.205
0.008968 0.01646 0.01588 0.005917 0.02574 0.002582 8.964
21.96 57.26 242.2 0.1297 0.1357 0.0688 0.02564 0.3105
0.07409 ;
1 13.17 18.66 85.98 534.6 0.1158 0.1231 0.1226
0.0734 0.2128 0.06777 0.2871 0.8937 1.897 24.25 0.006532
0.02336 0.02905 0.01215 0.01743 0.003643 15.67 27.95
102.8 759.4 0.1786 0.4166 0.5006 0.2088 0.39 0.1179 ;
0 12.05 14.63 78.04 449.3 0.1031 0.09092 0.06592
0.02749 0.1675 0.06043 0.2636 0.7294 1.848 19.87
0.005488 0.01427 0.02322 0.00566 0.01428 0.002422 13.76
20.7 89.88 582.6 0.1494 0.2156 0.305 0.06548 0.2747
0.08301 ;
0 13.49 22.3 86.91 561 0.08752 0.07698 0.04751
0.03384 0.1809 0.05718 0.2338 1.353 1.735 20.2 0.004455
0.01382 0.02095 0.01184 0.01641 0.001956 15.15 31.82 99
698.8 0.1162 0.1711 0.2282 0.1282 0.2871 0.06917 ;
0 11.76 21.6 74.72 427.9 0.08637 0.04966 0.01657
0.01115 0.1495 0.05888 0.4062 1.21 2.635 28.47 0.005857
0.009758 0.01168 0.007445 0.02406 0.001769 12.98 25.72
82.98 516.5 0.1085 0.08615 0.05523 0.03715 0.2433
0.06563 ;
0 13.64 16.34 87.21 571.8 0.07685 0.06059 0.01857
0.01723 0.1353 0.05953 0.1872 0.9234 1.449 14.55

0.004477 0.01177 0.01079 0.007956 0.01325 0.002551 14.67
23.19 96.08 656.7 0.1089 0.1582 0.105 0.08586 0.2346
0.08025 ;
0 11.94 18.24 75.71 437.6 0.08261 0.04751 0.01972
0.01349 0.1868 0.0611 0.2273 0.6329 1.52 17.47 0.00721
0.00838 0.01311 0.008 0.01996 0.002635 13.1 21.33 83.67
527.2 0.1144 0.08906 0.09203 0.06296 0.2785 0.07408 ;
1 18.22 18.7 120.3 1033 0.1148 0.1485 0.1772 0.106
0.2092 0.0631 0.8337 1.593 4.877 98.81 0.003899 0.02961
0.02817 0.009222 0.02674 0.005126 20.6 24.13 135.1 1321
0.128 0.2297 0.2623 0.1325 0.3021 0.07987 ;
1 15.1 22.02 97.26 712.8 0.09056 0.07081 0.05253
0.03334 0.1616 0.05684 0.3105 0.8339 2.097 29.91
0.004675 0.0103 0.01603 0.009222 0.01095 0.001629 18.1
31.69 117.7 1030 0.1389 0.2057 0.2712 0.153 0.2675
0.07873 ;
0 11.52 18.75 73.34 409 0.09524 0.05473 0.03036
0.02278 0.192 0.05907 0.3249 0.9591 2.183 23.47 0.008328
0.008722 0.01349 0.00867 0.03218 0.002386 12.84 22.47
81.81 506.2 0.1249 0.0872 0.09076 0.06316 0.3306
0.07036 ;
1 19.21 18.57 125.5 1152 0.1053 0.1267 0.1323 0.08994
0.1917 0.05961 0.7275 1.193 4.837 102.5 0.006458 0.02306
0.02945 0.01538 0.01852 0.002608 26.14 28.14 170.1 2145
0.1624 0.3511 0.3879 0.2091 0.3537 0.08294 ;
1 14.71 21.59 95.55 656.9 0.1137 0.1365 0.1293
0.08123 0.2027 0.06758 0.4226 1.15 2.735 40.09 0.003659
0.02855 0.02572 0.01272 0.01817 0.004108 17.87 30.7
115.7 985.5 0.1368 0.429 0.3587 0.1834 0.3698 0.1094 ;
0 13.05 19.31 82.61 527.2 0.0806 0.03789 0.000692
0.004167 0.1819 0.05501 0.404 1.214 2.595 32.96 0.007491
0.008593 0.000692 0.004167 0.0219 0.00299 14.23 22.25
90.24 624.1 0.1021 0.06191 0.001845 0.01111 0.2439
0.06289 ;
0 8.618 11.79 54.34 224.5 0.09752 0.05272 0.02061
0.007799 0.1683 0.07187 0.1559 0.5796 1.046 8.322
0.01011 0.01055 0.01981 0.005742 0.0209 0.002788 9.507
15.4 59.9 274.9 0.1733 0.1239 0.1168 0.04419 0.322
0.09026 ;
0 10.17 14.88 64.55 311.9 0.1134 0.08061 0.01084
0.0129 0.2743 0.0696 0.5158 1.441 3.312 34.62 0.007514
0.01099 0.007665 0.008193 0.04183 0.005953 11.02 17.45
69.86 368.6 0.1275 0.09866 0.02168 0.02579 0.3557
0.0802 ;
0 8.598 20.98 54.66 221.8 0.1243 0.08963 0.03
0.009259 0.1828 0.06757 0.3582 2.067 2.493 18.39 0.01193
0.03162 0.03 0.009259 0.03357 0.003048 9.565 27.04 62.06
273.9 0.1639 0.1698 0.09001 0.02778 0.2972 0.07712 ;
1 14.25 22.15 96.42 645.7 0.1049 0.2008 0.2135
0.08653 0.1949 0.07292 0.7036 1.268 5.373 60.78 0.009407
0.07056 0.06899 0.01848 0.017 0.006113 17.67 29.51 119.1
959.5 0.164 0.6247 0.6922 0.1785 0.2844 0.1132 ;
0 9.173 13.86 59.2 260.9 0.07721 0.08751 0.05988
0.0218 0.2341 0.06963 0.4098 2.265 2.608 23.52 0.008738

0.03938 0.04312 0.0156 0.04192 0.005822 10.01 19.23
65.59 310.1 0.09836 0.1678 0.1397 0.05087 0.3282 0.0849 ;
1 12.68 23.84 82.69 499 0.1122 0.1262 0.1128 0.06873
0.1905 0.0659 0.4255 1.178 2.927 36.46 0.007781 0.02648
0.02973 0.0129 0.01635 0.003601 17.09 33.47 111.8 888.3
0.1851 0.4061 0.4024 0.1716 0.3383 0.1031 ;
1 14.78 23.94 97.4 668.3 0.1172 0.1479 0.1267 0.09029
0.1953 0.06654 0.3577 1.281 2.45 35.24 0.006703 0.0231
0.02315 0.01184 0.019 0.003224 17.31 33.39 114.6 925.1
0.1648 0.3416 0.3024 0.1614 0.3321 0.08911 ;
0 9.465 21.01 60.11 269.4 0.1044 0.07773 0.02172
0.01504 0.1717 0.06899 0.2351 2.011 1.66 14.2 0.01052
0.01755 0.01714 0.009333 0.02279 0.004237 10.41 31.56
67.03 330.7 0.1548 0.1664 0.09412 0.06517 0.2878
0.09211 ;
0 11.31 19.04 71.8 394.1 0.08139 0.04701 0.03709
0.0223 0.1516 0.05667 0.2727 0.9429 1.831 18.15 0.009282
0.009216 0.02063 0.008965 0.02183 0.002146 12.33 23.84
78 466.7 0.129 0.09148 0.1444 0.06961 0.24 0.06641 ;
0 9.029 17.33 58.79 250.5 0.1066 0.1413 0.313 0.04375
0.2111 0.08046 0.3274 1.194 1.885 17.67 0.009549 0.08606
0.3038 0.03322 0.04197 0.009559 10.31 22.65 65.5 324.7
0.1482 0.4365 1.252 0.175 0.4228 0.1175 ;
0 12.78 16.49 81.37 502.5 0.09831 0.05234 0.03653
0.02864 0.159 0.05653 0.2368 0.8732 1.471 18.33 0.007962
0.005612 0.01585 0.008662 0.02254 0.001906 13.46 19.76
85.67 554.9 0.1296 0.07061 0.1039 0.05882 0.2383 0.0641 ;
1 18.94 21.31 123.6 1130 0.09009 0.1029 0.108 0.07951
0.1582 0.05461 0.7888 0.7975 5.486 96.05 0.004444
0.01652 0.02269 0.0137 0.01386 0.001698 24.86 26.58
165.9 1866 0.1193 0.2336 0.2687 0.1789 0.2551 0.06589 ;
0 8.888 14.64 58.79 244 0.09783 0.1531 0.08606
0.02872 0.1902 0.0898 0.5262 0.8522 3.168 25.44 0.01721
0.09368 0.05671 0.01766 0.02541 0.02193 9.733 15.67
62.56 284.4 0.1207 0.2436 0.1434 0.04786 0.2254 0.1084 ;
1 17.2 24.52 114.2 929.4 0.1071 0.183 0.1692 0.07944
0.1927 0.06487 0.5907 1.041 3.705 69.47 0.00582 0.05616
0.04252 0.01127 0.01527 0.006299 23.32 33.82 151.6 1681
0.1585 0.7394 0.6566 0.1899 0.3313 0.1339 ;
1 13.8 15.79 90.43 584.1 0.1007 0.128 0.07789 0.05069
0.1662 0.06566 0.2787 0.6205 1.957 23.35 0.004717
0.02065 0.01759 0.009206 0.0122 0.00313 16.57 20.86
110.3 812.4 0.1411 0.3542 0.2779 0.1383 0.2589 0.103 ;
0 12.31 16.52 79.19 470.9 0.09172 0.06829 0.03372
0.02272 0.172 0.05914 0.2505 1.025 1.74 19.68 0.004854
0.01819 0.01826 0.007965 0.01386 0.002304 14.11 23.21
89.71 611.1 0.1176 0.1843 0.1703 0.0866 0.2618 0.07609 ;
1 16.07 19.65 104.1 817.7 0.09168 0.08424 0.09769
0.06638 0.1798 0.05391 0.7474 1.016 5.029 79.25 0.01082
0.02203 0.035 0.01809 0.0155 0.001948 19.77 24.56 128.8
1223 0.15 0.2045 0.2829 0.152 0.265 0.06387 ;
0 13.53 10.94 87.91 559.2 0.1291 0.1047 0.06877
0.06556 0.2403 0.06641 0.4101 1.014 2.652 32.65 0.0134

0.02839 0.01162 0.008239 0.02572 0.006164 14.08 12.49
91.36 605.5 0.1451 0.1379 0.08539 0.07407 0.271 0.07191 ;
1 18.05 16.15 120.2 1006 0.1065 0.2146 0.1684 0.108
0.2152 0.06673 0.9806 0.5505 6.311 134.8 0.00794 0.05839
0.04658 0.0207 0.02591 0.007054 22.39 18.91 150.1 1610
0.1478 0.5634 0.3786 0.2102 0.3751 0.1108 ;
1 20.18 23.97 143.7 1245 0.1286 0.3454 0.3754 0.1604
0.2906 0.08142 0.9317 1.885 8.649 116.4 0.01038 0.06835
0.1091 0.02593 0.07895 0.005987 23.37 31.72 170.3 1623
0.1639 0.6164 0.7681 0.2508 0.544 0.09964 ;
0 12.86 18 83.19 506.3 0.09934 0.09546 0.03889
0.02315 0.1718 0.05997 0.2655 1.095 1.778 20.35 0.005293
0.01661 0.02071 0.008179 0.01748 0.002848 14.24 24.82
91.88 622.1 0.1289 0.2141 0.1731 0.07926 0.2779 0.07918 ;
0 11.45 20.97 73.81 401.5 0.1102 0.09362 0.04591
0.02233 0.1842 0.07005 0.3251 2.174 2.077 24.62 0.01037
0.01706 0.02586 0.007506 0.01816 0.003976 13.11 32.16
84.53 525.1 0.1557 0.1676 0.1755 0.06127 0.2762 0.08851 ;
0 13.34 15.86 86.49 520 0.1078 0.1535 0.1169 0.06987
0.1942 0.06902 0.286 1.016 1.535 12.96 0.006794 0.03575
0.0398 0.01383 0.02134 0.004603 15.53 23.19 96.66 614.9
0.1536 0.4791 0.4858 0.1708 0.3527 0.1016 ;
1 25.22 24.91 171.5 1878 0.1063 0.2665 0.3339 0.1845
0.1829 0.06782 0.8973 1.474 7.382 120 0.008166 0.05693
0.0573 0.0203 0.01065 0.005893 30 33.62 211.7 2562
0.1573 0.6076 0.6476 0.2867 0.2355 0.1051 ;
1 19.1 26.29 129.1 1132 0.1215 0.1791 0.1937 0.1469
0.1634 0.07224 0.519 2.91 5.801 67.1 0.007545 0.0605
0.02134 0.01843 0.03056 0.01039 20.33 32.72 141.3 1298
0.1392 0.2817 0.2432 0.1841 0.2311 0.09203 ;
0 12 15.65 76.95 443.3 0.09723 0.07165 0.04151
0.01863 0.2079 0.05968 0.2271 1.255 1.441 16.16 0.005969
0.01812 0.02007 0.007027 0.01972 0.002607 13.67 24.9
87.78 567.9 0.1377 0.2003 0.2267 0.07632 0.3379 0.07924 ;
1 18.46 18.52 121.1 1075 0.09874 0.1053 0.1335
0.08795 0.2132 0.06022 0.6997 1.475 4.782 80.6 0.006471
0.01649 0.02806 0.0142 0.0237 0.003755 22.93 27.68 152.2
1603 0.1398 0.2089 0.3157 0.1642 0.3695 0.08579 ;
1 14.48 21.46 94.25 648.2 0.09444 0.09947 0.1204
0.04938 0.2075 0.05636 0.4204 2.22 3.301 38.87 0.009369
0.02983 0.05371 0.01761 0.02418 0.003249 16.21 29.25
108.4 808.9 0.1306 0.1976 0.3349 0.1225 0.302 0.06846 ;
1 19.02 24.59 122 1076 0.09029 0.1206 0.1468 0.08271
0.1953 0.05629 0.5495 0.6636 3.055 57.65 0.003872
0.01842 0.0371 0.012 0.01964 0.003337 24.56 30.41 152.9
1623 0.1249 0.3206 0.5755 0.1956 0.3956 0.09288 ;
0 12.36 21.8 79.78 466.1 0.08772 0.09445 0.06015
0.03745 0.193 0.06404 0.2978 1.502 2.203 20.95 0.007112
0.02493 0.02703 0.01293 0.01958 0.004463 13.83 30.5
91.46 574.7 0.1304 0.2463 0.2434 0.1205 0.2972 0.09261 ;
0 14.64 15.24 95.77 651.9 0.1132 0.1339 0.09966
0.07064 0.2116 0.06346 0.5115 0.7372 3.814 42.76
0.005508 0.04412 0.04436 0.01623 0.02427 0.004841 16.34

18.24 109.4 803.6 0.1277 0.3089 0.2604 0.1397 0.3151
0.08473 ;
0 14.62 24.02 94.57 662.7 0.08974 0.08606 0.03102
0.02957 0.1685 0.05866 0.3721 1.111 2.279 33.76 0.004868
0.01818 0.01121 0.008606 0.02085 0.002893 16.11 29.11
102.9 803.7 0.1115 0.1766 0.09189 0.06946 0.2522
0.07246 ;
1 15.37 22.76 100.2 728.2 0.092 0.1036 0.1122 0.07483
0.1717 0.06097 0.3129 0.8413 2.075 29.44 0.009882
0.02444 0.04531 0.01763 0.02471 0.002142 16.43 25.84
107.5 830.9 0.1257 0.1997 0.2846 0.1476 0.2556 0.06828 ;
0 13.27 14.76 84.74 551.7 0.07355 0.05055 0.03261
0.02648 0.1386 0.05318 0.4057 1.153 2.701 36.35 0.004481
0.01038 0.01358 0.01082 0.01069 0.001435 16.36 22.35
104.5 830.6 0.1006 0.1238 0.135 0.1001 0.2027 0.06206 ;
0 13.45 18.3 86.6 555.1 0.1022 0.08165 0.03974 0.0278
0.1638 0.0571 0.295 1.373 2.099 25.22 0.005884 0.01491
0.01872 0.009366 0.01884 0.001817 15.1 25.94 97.59 699.4
0.1339 0.1751 0.1381 0.07911 0.2678 0.06603 ;
1 15.06 19.83 100.3 705.6 0.1039 0.1553 0.17 0.08815
0.1855 0.06284 0.4768 0.9644 3.706 47.14 0.00925 0.03715
0.04867 0.01851 0.01498 0.00352 18.23 24.23 123.5 1025
0.1551 0.4203 0.5203 0.2115 0.2834 0.08234 ;
1 20.26 23.03 132.4 1264 0.09078 0.1313 0.1465
0.08683 0.2095 0.05649 0.7576 1.509 4.554 87.87 0.006016
0.03482 0.04232 0.01269 0.02657 0.004411 24.22 31.59
156.1 1750 0.119 0.3539 0.4098 0.1573 0.3689 0.08368 ;
0 12.18 17.84 77.79 451.1 0.1045 0.07057 0.0249
0.02941 0.19 0.06635 0.3661 1.511 2.41 24.44 0.005433
0.01179 0.01131 0.01519 0.0222 0.003408 12.83 20.92
82.14 495.2 0.114 0.09358 0.0498 0.05882 0.2227 0.07376 ;
0 9.787 19.94 62.11 294.5 0.1024 0.05301 0.006829
0.007937 0.135 0.0689 0.335 2.043 2.132 20.05 0.01113
0.01463 0.005308 0.00525 0.01801 0.005667 10.92 26.29
68.81 366.1 0.1316 0.09473 0.02049 0.02381 0.1934
0.08988 ;
0 11.6 12.84 74.34 412.6 0.08983 0.07525 0.04196
0.0335 0.162 0.06582 0.2315 0.5391 1.475 15.75 0.006153
0.0133 0.01693 0.006884 0.01651 0.002551 13.06 17.16
82.96 512.5 0.1431 0.1851 0.1922 0.08449 0.2772 0.08756 ;
1 14.42 19.77 94.48 642.5 0.09752 0.1141 0.09388
0.05839 0.1879 0.0639 0.2895 1.851 2.376 26.85 0.008005
0.02895 0.03321 0.01424 0.01462 0.004452 16.33 30.86
109.5 826.4 0.1431 0.3026 0.3194 0.1565 0.2718 0.09353 ;
1 13.61 24.98 88.05 582.7 0.09488 0.08511 0.08625
0.04489 0.1609 0.05871 0.4565 1.29 2.861 43.14 0.005872
0.01488 0.02647 0.009921 0.01465 0.002355 16.99 35.27
108.6 906.5 0.1265 0.1943 0.3169 0.1184 0.2651 0.07397 ;
];
x=A(:,2:end);
% x=x';
y=A(:, 1);
y=y(:);
% size(x)

% size(y)
[y_pred,w]=ntk(x,y,200);
accuracy = sum(y_pred == y) / numel(y) * 100;
fprintf('The Accuracy is:%f \n',accuracy)

The Accuracy is:100.000000

function [y_pred,w] = ntk(x, y, map_dim)

it=1;
max_it=50;

[m, n] = size(x);
B = randn(n, map_dim);
A = randn(1, map_dim);

while it<=max_it

% disp('the iteration is:')


% disp(it)

mid_output = x * B;
k=cos(mid_output+pi/4);
output_0 = k * A';

k1 = [];
for i = 1:m
t1 = k(i,:);
temp = [];
for j = 1:n
tk=-sin(mid_output(i,:)+pi/4);

t2 = tk .* A * x(i,j);
temp = [temp, t2];
end
k1 = [k1; [t1, temp]];
end

w1=pinv(k1)*(y-output_0);
t=[];

for i=1:n
temp2=B(i,:);
t=[t,temp2];
end

w0=[A,t]';
w=w0+w1;

[m1,n1]=size(w);
A1=w(1:map_dim);
B1 = [];

end_ind=0;

for i = 2:m1
if end_ind~=m1;
start_ind = (i-1) * map_dim + 1;
end_ind = i * map_dim;
temp3=w(start_ind:end_ind)';
B1=[B1;temp3];
end

end

mid_output=x*B1;
k=cos(mid_output+pi/4);
y_pred=round(k*A1);

A=A1';
B=B1;
it=it+1;
accuracy = sum(y_pred == y) / numel(y) * 100;
% disp('the accuracy is:')
% disp(accuracy)

if accuracy==100
break
end

end
end

Once the weights alpha are learned, the predicted value for a new data vector x (a row vector) is given by

y_pred = (x*A' + 1).^d * alpha

in the case where the kernel is built from a degree-d polynomial kernel; here A holds the training data points as rows.
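For concreteness, here is a minimal sketch of this prediction step, reusing the XOR kernel-regression example from earlier (degree d = 3 is the value used there; the point xnew is illustrative):

% Fit a degree-3 polynomial kernel regression on the XOR data, then predict
A = [0 0; 0 1; 1 0; 1 1]; % training data, one point per row
Y = [0; 1; 1; 0];
d = 3;
alpha = pinv((A*A' + 1).^d)*Y; % kernel regression weights
xnew = [0 1]; % new data vector (row)
ypred = round((xnew*A' + 1).^d*alpha) % predicted label: 1, as in the earlier example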

