Assignment 3

The document contains five problems on polynomial manipulation, interpolation, and regression. Problem 1 raises a polynomial to the 50th power by repeated multiplication and extracts the coefficient of a given term. Problem 2 builds Lagrange interpolating polynomials for sin(πx) for several values of N and examines the error. Problem 3 fits a saturation-growth model by linearised least-squares regression. Problem 4 compares linear and quadratic polynomial regression on the same data. Problem 5 performs multiple linear regression on data with two predictors.


Problem 1

clc; clear; close all;

% Coefficients of p(x) = -4x^4 + 3x^2 - 2x + 1, highest degree first
coeff = [-4 0 3 -2 1];

power = 50;
result = 1;
% Raise p(x) to the 50th power by repeated polynomial multiplication
for i = 1:power
    result = conv(result, coeff);
end
% result(end) is the constant term, so result(end-power) is the x^50 coefficient
coefficient = result(end - power);
disp(coefficient);
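
A quick sanity check of the indexing convention (a hypothetical small case, not part of the assignment): for (x + 1)^2 = x^2 + 2x + 1, conv gives [1 2 1], and the x^1 coefficient sits at result(end-1).

% Sanity check with a hand-checkable case: (x + 1)^2 = x^2 + 2x + 1.
small = conv([1 1], [1 1]);   % [1 2 1], highest degree first
disp(small(end - 1));         % coefficient of x^1 -> prints 2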

Problem 2a
clc; clear; close all;

N = [2 3 4 5 6];
figure;

for i = 1:length(N)
    n = N(i);
    % n equally spaced interpolation nodes on [-0.5, 0.5]
    x = linspace(-.5, .5, n);
    y = sin(pi*x);
    % Build the Lagrange interpolating polynomial F in coefficient form
    F = 0;
    for j = 1:n
        p = 1;
        for k = 1:n
            if j ~= k
                % Linear factor (x - x(k)) / (x(j) - x(k))
                temp = [1, -x(k)] / (x(j) - x(k));
                p = conv(p, temp);
            end
        end
        F = F + p*y(j);
    end
    x_1 = linspace(-.5, .5, 1000);
    y_test = polyval(F, x_1);
    y_real = sin(pi*x_1);
    E = y_real - y_test;
    % Plot the interpolant on the fine grid so the polynomial shape is visible
    plot(x_1, y_test, 'DisplayName', ['N = ' num2str(n)], 'LineWidth', 2);
    hold on;
end
plot(x_1, y_real, 'LineWidth', 2, 'DisplayName', 'sin(\pi x)');
title('Lagrange Polynomial Interpolation for sin(\pi x)');
xlabel('x');
ylabel('y');
legend('show');
grid on;
Problem 2b
clc; clear; close all;

N = [2 3 4 5 6];
figure;

for i = 1:length(N)
    n = N(i);
    x = linspace(-.5, .5, n);
    y = sin(pi*x);
    % Build the Lagrange interpolating polynomial F in coefficient form
    F = 0;
    for j = 1:n
        p = 1;
        for k = 1:n
            if j ~= k
                temp = [1, -x(k)] / (x(j) - x(k));
                p = conv(p, temp);
            end
        end
        F = F + p*y(j);
    end
    x_1 = linspace(-.5, .5, 1000);
    y_test = polyval(F, x_1);
    y_real = sin(pi*x_1);
    % Interpolation error on a fine grid
    E = y_real - y_test;
    subplot(2, 3, i);
    plot(x_1, E, 'DisplayName', ['N = ' num2str(n)], 'LineWidth', 2);
    title(['Interpolation error for N = ' num2str(n)]);
    xlabel('x');
    ylabel('error');
    legend('show');
    grid on;
end

Explanation
The error for N = 2 and N = 3 is noticeably higher than for the larger values of N. With only two or three nodes, the interpolating polynomial is at most quadratic, so it cannot follow sin(πx) closely over the interval. As the number of nodes increases, the degree of the interpolating polynomial grows and it matches the function at more points, so the approximation over [-0.5, 0.5] becomes more accurate and the error decreases.
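
To make this claim concrete, here is a minimal sketch (reusing the interpolation loop from Problem 2) that prints the maximum absolute error for each N; the printed values should shrink as N grows.

clc; clear; close all;

N = [2 3 4 5 6];
x_1 = linspace(-.5, .5, 1000);
y_real = sin(pi*x_1);
for i = 1:length(N)
    n = N(i);
    x = linspace(-.5, .5, n);
    y = sin(pi*x);
    % Same Lagrange construction as in Problem 2
    F = 0;
    for j = 1:n
        p = 1;
        for k = 1:n
            if j ~= k
                p = conv(p, [1, -x(k)] / (x(j) - x(k)));
            end
        end
        F = F + p*y(j);
    end
    % Maximum absolute interpolation error on the fine grid
    fprintf('N = %d: max |error| = %.2e\n', n, max(abs(y_real - polyval(F, x_1))));
end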

Problem 3
clc; clear; close all;
x = [1 3 4 7 8 10];
y = [2.1 4.6 5.4 6.1 6.4 6.6];

% Linearisation of the saturation-growth model y = m*x/(b + x):
% 1/y = 1/m + (b/m)*(1/x), i.e. y1 = a0 + a1*x1 with y1 = 1/y, x1 = 1/x
y1 = 1./y;
x1 = 1./x;
% Normal equations for the straight-line fit y1 = a0 + a1*x1
coeff = [length(x1), sum(x1); sum(x1), sum(x1.^2)];
Value = [sum(y1); sum(x1.*y1)];
solution = coeff \ Value;

% Plot the data points and the back-transformed fitted curve
plot(x, y, 'ok', 'LineWidth', 2);
hold on;
a0 = solution(1);
a1 = solution(2);
fx = a0 + a1*x1;
plot(x, 1./fx, 'LineWidth', 2);
xlabel('x');
ylabel('y');
title('Best fit', 'FontSize', 15);
legend('Points', 'Fitted curve');
% Recover the model parameters: m = 1/a0, b = a1/a0 = a1*m
m = 1/a0
b = a1*m
mdl_linear = fitlm(x1(:), y1(:));

% R-squared for the linearised regression
rsquared = mdl_linear.Rsquared.Ordinary
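
As a cross-check (not in the original assignment), the same intercept and slope can be recovered with polyfit on the linearised data; polyfit returns coefficients highest order first, so the output should match [a1, a0] above. A minimal sketch, continuing the Problem 3 script:

% Cross-check of the normal-equation solution with polyfit.
p = polyfit(1./x, 1./y, 1);   % [slope, intercept] = [a1, a0]
fprintf('a1 = %.6f, a0 = %.6f\n', p(1), p(2));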
Problem 4

clc; clear; close all;

x = [0 .05 .2 .5 1 2];
y = [15 14.998 14.983 14.895 14.58 13.32];
plot(x, y, '*');
hold on;
% Quadratic (degree-2) polynomial fit
coefficients = polyfit(x, y, 2);
mdl_poly = fitlm(x(:), y(:), 'poly2');
% Note: the data only cover 0 <= x <= 2, so most of this range is extrapolation
x_values = linspace(0, 10, 1000);
y_predicted_poly = polyval(coefficients, x_values);
plot(x_values, y_predicted_poly, 'r-', 'LineWidth', 2, 'DisplayName', 'Polynomial Fit');
ylim([0, 15]);
hold off;

% R-squared -> polynomial regression
rsquared_poly = mdl_poly.Rsquared.Ordinary;
disp(['R-squared Polynomial regression: ', num2str(rsquared_poly)]);

mdl_linear = fitlm(x(:), y(:));
figure;   % new figure so the polynomial plot is not overwritten
plot(mdl_linear, 'LineWidth', 2)

% R-squared -> linear regression
rsquared = mdl_linear.Rsquared.Ordinary;
disp(['R-squared Linear regression: ', num2str(rsquared)]);
if rsquared_poly > rsquared
    fprintf('Polynomial Regression is more accurate\n');
else
    fprintf('Linear Regression is more accurate\n');
end
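
The R-squared that fitlm reports can also be computed by hand from the polyfit coefficients, which makes the comparison explicit. A minimal sketch, continuing the Problem 4 script:

% Manual R-squared for the quadratic fit: 1 - SS_res/SS_tot
y_hat = polyval(coefficients, x);
ss_res = sum((y - y_hat).^2);      % residual sum of squares
ss_tot = sum((y - mean(y)).^2);    % total sum of squares
fprintf('Manual R-squared (quadratic): %.6f\n', 1 - ss_res/ss_tot);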
Problem 5
clc; clear; close all;
% Load the two predictors x1, x2 and the response y
insert = load("mlr_data_new.mat");
x1 = insert.x1;
x2 = insert.x2;
y = insert.y;
n = length(x1);
% Normal equations for the model y = a0 + a1*x1 + a2*x2
coeff = [n sum(x1) sum(x2); ...
    sum(x1) sum(x1.^2) sum(x2.*x1); ...
    sum(x2) sum(x1.*x2) sum(x2.^2)];
rhs = [sum(y); sum(y.*x1); sum(y.*x2)];
solution = coeff \ rhs;
a0 = solution(1);
a1 = solution(2);
a2 = solution(3);
disp(a0);
disp(a1);
disp(a2);
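
As a cross-check (assuming x1, x2, and y load as numeric vectors of equal length, which the script above already relies on), fitlm with two predictors should reproduce the same coefficients:

% Cross-check: the Estimate column should match a0, a1, a2 above.
mdl = fitlm([x1(:), x2(:)], y(:));
disp(mdl.Coefficients.Estimate);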
