
Lin Reg

The document describes a MATLAB function that performs basic linear regression on generated noisy linear data. It explains the theory behind linear regression, including the normal equation used to find the best-fitting line, and provides a step-by-step implementation to plot the original data points alongside the fitted regression line. The function aims to minimize the sum of squared errors between predicted and actual y-values.

Uploaded by copeyic220

function basic_linear_regression()
% basic_linear_regression() performs a basic linear regression on sample data.
%
% What it does:
% This function generates some noisy linear data, then uses the normal
% equation to find the best-fitting line (slope and intercept) that
% minimizes the sum of squared errors between the predicted and actual
% y-values. Finally, it plots the original data points and the fitted
% regression line.
%
% Theory:
% Linear regression aims to model the relationship between a dependent
% variable (y) and one or more independent variables (x) by fitting a
% linear equation to the observed data. In the case of simple linear
% regression with one independent variable, the model is:
%
%   y = m*x + c + epsilon
%
% where:
%   y       is the dependent variable.
%   x       is the independent variable.
%   m       is the slope of the line.
%   c       is the y-intercept.
%   epsilon is the error term (representing noise).
%
% The goal is to find the values of 'm' and 'c' that best fit the data.
% One common method to achieve this is by minimizing the sum of squared
% errors (SSE). The normal equation provides a closed-form solution for
% 'm' and 'c':
%
%   [c; m] = (X' * X)^(-1) * X' * y
%
% where:
%   y      is a column vector of the observed y-values.
%   X      is a design matrix whose first column is all ones (for the
%          intercept) and whose second column contains the observed x-values.
%   X'     is the transpose of X.
%   ^(-1)  denotes the matrix inverse.

% Generate some sample noisy linear data
num_points = 50;
x = linspace(0, 10, num_points)'; % Independent variable (column vector)
true_slope = 2;
true_intercept = 1;
noise = 2 * randn(num_points, 1); % Add some random noise
y = true_slope * x + true_intercept + noise; % Dependent variable

% Construct the design matrix X
X = [ones(num_points, 1), x]; % Add a column of ones for the intercept

% Solve the normal equation to find the regression coefficients
coefficients = (X' * X) \ (X' * y); % Equivalent to inv(X' * X) * X' * y, but numerically more stable
intercept = coefficients(1);
slope = coefficients(2);

% Generate the predicted y-values using the fitted line
y_predicted = slope * x + intercept;

% Plot the original data and the regression line
figure;
scatter(x, y, 'b.'); % Plot the data points as blue dots
hold on;
plot(x, y_predicted, 'r-'); % Plot the regression line in red
hold off;
xlabel('x');
ylabel('y');
title('Basic Linear Regression');
legend('Original Data', 'Fitted Line');
grid on;

end
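For readers following along without MATLAB, the same fit can be mirrored in plain Python. This is an illustrative sketch, not part of the original function: `fit_line` is a hypothetical helper that solves the 2x2 normal-equation system (X'X)[c; m] = X'y in closed form, checked first on noise-free data (where the line is recovered exactly) and then on noisy data like the MATLAB example generates (plotting omitted).

```python
import random

def fit_line(xs, ys):
    """Least-squares fit of y = m*x + c via the 2x2 normal equations.

    Solving (X'X)[c; m] = X'y by hand for simple regression gives the
    closed-form expressions below (det is the determinant of X'X).
    """
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    det = n * sxx - sx * sx
    c = (sxx * sy - sx * sxy) / det  # intercept
    m = (n * sxy - sx * sy) / det    # slope
    return c, m

# Sanity check on noise-free data y = 2x + 1: the line is recovered exactly.
c0, m0 = fit_line([0, 1, 2, 3], [1, 3, 5, 7])

# Noisy data mirroring the MATLAB example (true slope 2, intercept 1,
# Gaussian noise with standard deviation 2); seed fixed for reproducibility.
random.seed(0)
num_points = 50
xs = [10 * i / (num_points - 1) for i in range(num_points)]  # like linspace(0, 10, 50)
ys = [2 * x + 1 + random.gauss(0, 2) for x in xs]
c, m = fit_line(xs, ys)

# The normal equation forces the residuals to be orthogonal to the columns
# of the design matrix, so both residual sums below vanish up to rounding.
residuals = [y - (m * x + c) for x, y in zip(xs, ys)]
```

With 50 points and noise of standard deviation 2, the estimates land close to the true slope and intercept, just as in the MATLAB run.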
