Lab 6 - Linear Regression and Multiple Linear Regression
In order to provide a basic understanding of linear regression, we start with its most
basic version, simple linear regression.
Simple linear regression is an approach for predicting a response using a single feature.
It is assumed that the two variables are linearly related. Hence, we try to find a linear
function that predicts the response value (y) as accurately as possible as a function of the
feature, or independent variable (x).
Let us consider a dataset where we have a value of response y for every feature x:
import numpy as np
import matplotlib.pyplot as plt

def estimate_coef(x, y):
    # ordinary least squares estimates of the intercept b_0 and slope b_1
    n = np.size(x)
    m_x, m_y = np.mean(x), np.mean(y)
    b_1 = (np.sum(x * y) - n * m_x * m_y) / (np.sum(x * x) - n * m_x * m_x)
    b_0 = m_y - b_1 * m_x
    return (b_0, b_1)

def main():
    # observations / data
    x = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])
    y = np.array([1, 3, 2, 5, 7, 8, 8, 9, 10, 12])
    # estimating coefficients
    b = estimate_coef(x, y)
    print("Estimated coefficients:\nb_0 = {}\nb_1 = {}".format(b[0], b[1]))
    # plotting the observations and the fitted regression line
    plt.scatter(x, y)
    plt.plot(x, b[0] + b[1] * x, color="g")
    # putting labels
    plt.xlabel('x')
    plt.ylabel('y')
    plt.show()

if __name__ == "__main__":
    main()
3) Click Run>Run Module and observe the following output and model graph
Output:
Estimated coefficients:
b_0 = 1.2363636363636363
b_1 = 1.1696969696969697
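The two coefficients define the fitted line y = b_0 + b_1*x ≈ 1.2364 + 1.1697*x; for example, at x = 10 the model would predict y ≈ 1.2364 + 1.1697 × 10 ≈ 12.93.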
The graph obtained looks like this:
4) Change x and y to different values, run the LRM.py file again, and note down the output in
your observation.
5) Use the diabetes dataset from UCI and the Pima Indians Diabetes dataset to perform linear
regression modelling, and note down the steps and outputs in your observation (a starting sketch is given below).
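For step 5, a minimal starting sketch is given below. It assumes scikit-learn's bundled diabetes dataset as a stand-in for the UCI download (the Pima Indians CSV would instead be read with numpy or pandas), and fits a simple linear regression on a single feature; the choice of column 2 (BMI) and the split parameters are only illustrative.

import numpy as np
from sklearn import datasets, linear_model
from sklearn.model_selection import train_test_split

# load the diabetes dataset shipped with scikit-learn and keep a single feature (column 2, BMI)
diabetes = datasets.load_diabetes()
X = diabetes.data[:, np.newaxis, 2]
y = diabetes.target

# split, fit and report the estimated coefficients, as in the example above
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)
reg = linear_model.LinearRegression()
reg.fit(X_train, y_train)
print('b_0 =', reg.intercept_, '\nb_1 =', reg.coef_[0])
print('Variance score:', reg.score(X_test, y_test))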
Multiple linear regression
Code: Python implementation of multiple linear regression techniques on
the Boston house pricing dataset using Scikit-learn.
import matplotlib.pyplot as plt
import numpy as np
from sklearn import datasets, linear_model, metrics
from sklearn.model_selection import train_test_split

# load the Boston house pricing dataset (note: load_boston was removed in scikit-learn 1.2)
boston = datasets.load_boston()
# splitting the data into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(boston.data, boston.target, test_size=0.4, random_state=1)
# create and train the linear regression object
reg = linear_model.LinearRegression()
reg.fit(X_train, y_train)
# regression coefficients
print('Coefficients: ', reg.coef_)
# variance score: 1 means perfect prediction
print('Variance score: {}'.format(reg.score(X_test, y_test)))
# plotting residual errors in training and test data
plt.scatter(reg.predict(X_train), reg.predict(X_train) - y_train, color="green", label='Train data')
plt.scatter(reg.predict(X_test), reg.predict(X_test) - y_test, color="blue", label='Test data')
## plotting legend
plt.legend(loc='upper right')
## plot title
plt.title("Residual errors")
plt.show()
Output:
Coefficients:
[ -8.80740828e-02 6.72507352e-02 5.10280463e-02 2.18879172e+00
-1.72283734e+01 3.62985243e+00 2.13933641e-03 -1.36531300e+00
2.88788067e-01 -1.22618657e-02 -8.36014969e-01 9.53058061e-03
-5.05036163e-01]
Variance score: 0.720898784611
We define the explained variance score as:
explained_variance_score = 1 - Var{y - y'} / Var{y}
where y' is the predicted output, y is the actual output, and Var is the variance; the best possible score is 1.0.
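The same score can be computed directly with scikit-learn's metrics module (already imported above); a minimal sketch reusing reg, X_test and y_test from the code above:

# predicted values for the test set
y_pred = reg.predict(X_test)
# explained variance score as computed by scikit-learn; matches the definition above
print('Explained variance score:', metrics.explained_variance_score(y_test, y_pred))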
Assumptions:
Given below are the basic assumptions that a linear regression model makes
regarding a dataset on which it is applied:
Linear relationship: The relationship between the response and the feature variables
should be linear. The linearity assumption can be tested using scatter plots.
As shown below, the 1st figure represents linearly related variables, whereas the
variables in the 2nd and 3rd figures are most likely non-linear; so, the 1st figure
will give better predictions using linear regression.
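A minimal sketch of such a scatter-plot check, run in the same session as the multiple linear regression code above (the choice of column 5, the average number of rooms, is only illustrative):

import matplotlib.pyplot as plt

# plot one feature against the response;
# an approximately straight-line pattern supports the linearity assumption
plt.scatter(boston.data[:, 5], boston.target)
plt.xlabel('feature value (column 5: average number of rooms)')
plt.ylabel('house price')
plt.show()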
Little or no multi-collinearity: It is assumed that there is little or no
multicollinearity in the data. Multicollinearity occurs when the features (or
independent variables) are not independent of each other.
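One simple way to screen for multicollinearity is to inspect the pairwise correlations between the features; a minimal sketch reusing the boston data loaded above:

import numpy as np

# correlation matrix of the features; off-diagonal entries close to +/-1 indicate multicollinearity
corr = np.corrcoef(boston.data, rowvar=False)
off_diagonal = corr[~np.eye(corr.shape[0], dtype=bool)]
print('Strongest pairwise feature correlation:', np.abs(off_diagonal).max())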
Little or no auto-correlation: Another assumption is that there is little or no
autocorrelation in the data. Autocorrelation occurs when the residual errors
are not independent of each other.
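A common numerical check for autocorrelation of the residuals is the Durbin-Watson statistic (not mentioned above, but widely used for this assumption); a minimal sketch using statsmodels and reusing the fitted reg and the training data from the code above:

from statsmodels.stats.stattools import durbin_watson

# residuals of the fitted model on the training data
residuals = y_train - reg.predict(X_train)
# Durbin-Watson statistic: values near 2 suggest little autocorrelation, values near 0 or 4 suggest strong autocorrelation
print('Durbin-Watson statistic:', durbin_watson(residuals))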
Homoscedasticity: Homoscedasticity describes a situation in which the
error term (that is, the “noise” or random disturbance in the relationship
between the independent variables and the dependent variable) is the same
across all values of the independent variables. As shown below, figure 1 has
homoscedasticity while figure 2 has heteroscedasticity.
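A quick visual check for homoscedasticity is to plot the residuals against the fitted values: a roughly constant spread supports homoscedasticity, while a funnel shape suggests heteroscedasticity. A minimal sketch reusing the fitted reg and the test data from the code above:

import matplotlib.pyplot as plt

# residuals vs. fitted values for the test set
plt.scatter(reg.predict(X_test), y_test - reg.predict(X_test))
plt.axhline(y=0, linewidth=2)
plt.xlabel('fitted value')
plt.ylabel('residual')
plt.show()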
Applications:
Trend lines: A trend line represents the variation in quantitative data with
the passage of time (like GDP, oil prices, etc.). These trends usually follow a
linear relationship. Hence, linear regression can be applied to predict future
values. However, this method suffers from a lack of scientific validity in
cases where other potential changes can affect the data.
Finance: The capital asset pricing model (CAPM) uses linear regression to analyze
and quantify the systematic risks of an investment.