
CSC462 - Artificial Intelligence

Lab Report 09 - Linear Regression

Name: MUSKA IJAZ
Registration Number: FA19-BEE-163
Class/Section: BCE-7B
Instructor's Name: Sir ALI RAZA SHAHID

Lab Assessment
Pre-Lab: /1
In Lab: /5
Critical Analysis: /4
Total: /10


In-Lab Activities:
Activity 1:

Code:

import numpy as np
import matplotlib.pyplot as plt

def estimate_coef(x, y):
    # number of observations/points
    n = np.size(x)

    # mean of x and y vectors
    m_x = np.mean(x)
    m_y = np.mean(y)

    # calculating cross-deviation and deviation about x
    SS_xy = np.sum(y*x) - n*m_y*m_x
    SS_xx = np.sum(x*x) - n*m_x*m_x

    # calculating regression coefficients
    b_1 = SS_xy / SS_xx
    b_0 = m_y - b_1*m_x

    return (b_0, b_1)

def plot_regression_line(x, y, b):
    # plotting the actual points as a scatter plot
    plt.scatter(x, y, color="m", marker="o", s=30)

    # predicted response vector
    y_pred = b[0] + b[1]*x

    # plotting the regression line
    plt.plot(x, y_pred, color="b")

    # putting labels
    plt.xlabel('x')
    plt.ylabel('y')
    plt.title('Regression line')

    # function to show plot
    plt.show()

def main():
    # observations / data
    x = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])
    y = np.array([1, 3, 2, 5, 7, 8, 8, 9, 10, 12])

    # estimating coefficients
    b = estimate_coef(x, y)
    print("Estimated coefficients:\nb_0 = {}\nb_1 = {}".format(b[0], b[1]))

    # plotting regression line
    plot_regression_line(x, y, b)

if __name__ == "__main__":
    main()
Graph: (scatter plot of the sample points with the fitted regression line)

Output: (estimated coefficients; for the data above these are approximately b_0 = 1.236 and b_1 = 1.170)
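As a quick cross-check, which is not part of the original lab sheet, NumPy's polyfit can fit the same first-degree polynomial and should reproduce the coefficients computed by estimate_coef:

import numpy as np

x = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])
y = np.array([1, 3, 2, 5, 7, 8, 8, 9, 10, 12])

# np.polyfit returns the coefficients highest power first: [slope, intercept]
slope, intercept = np.polyfit(x, y, 1)
print("b_0 =", intercept, "b_1 =", slope)  # approximately 1.236 and 1.170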

Activity 2:
Code:
import numpy as np
import matplotlib.pyplot as plt
from sklearn import datasets, linear_model, metrics
from sklearn.model_selection import train_test_split

# load the boston dataset
# (note: load_boston was removed in scikit-learn 1.2, so this requires an older version)
boston = datasets.load_boston(return_X_y=False)

# defining feature matrix (X) and response vector (y)
X = boston.data
y = boston.target

# splitting X and y into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.4, random_state=1)

# create linear regression object
reg = linear_model.LinearRegression()

# train the model using the training sets
reg.fit(X_train, y_train)

# regression coefficients
print('Coefficients: ', reg.coef_)

# variance score (R^2): 1 means perfect prediction
print('Variance score: {}'.format(reg.score(X_test, y_test)))

# plot for residual error

## setting plot style
plt.style.use('fivethirtyeight')

## plotting residual errors in training data
plt.scatter(reg.predict(X_train), reg.predict(X_train) - y_train,
            color="green", s=10, label='Train data')

## plotting residual errors in test data
plt.scatter(reg.predict(X_test), reg.predict(X_test) - y_test,
            color="blue", s=10, label='Test data')

## plotting line for zero residual error
plt.hlines(y=0, xmin=0, xmax=50, linewidth=2)

## plotting legend
plt.legend(loc='upper right')

## plot title
plt.title("Residual errors")

## method call for showing the plot
plt.show()

Graph: (scatter plot of residual errors on the training and test data, with a horizontal line at zero residual error)

Output: (regression coefficients for the 13 Boston housing features and the variance score, i.e. R^2, on the test set)
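Since load_boston is no longer shipped with recent scikit-learn releases, the same experiment can be reproduced on a dataset the library still provides. The sketch below is not part of the original lab; it assumes scikit-learn's fetch_california_housing loader as a drop-in replacement and keeps the rest of the workflow unchanged:

import matplotlib.pyplot as plt
from sklearn.datasets import fetch_california_housing
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# load the California housing dataset (downloaded on first use)
housing = fetch_california_housing()
X, y = housing.data, housing.target

# same 60/40 train/test split as in the lab
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.4, random_state=1)

# fit ordinary least squares and report the variance score (R^2)
reg = LinearRegression()
reg.fit(X_train, y_train)
print('Coefficients: ', reg.coef_)
print('Variance score: {}'.format(reg.score(X_test, y_test)))

# residual plot, as in the lab
plt.scatter(reg.predict(X_train), reg.predict(X_train) - y_train, color="green", s=10, label='Train data')
plt.scatter(reg.predict(X_test), reg.predict(X_test) - y_test, color="blue", s=10, label='Test data')
plt.hlines(y=0, xmin=y.min(), xmax=y.max(), linewidth=2)
plt.legend(loc='upper right')
plt.title("Residual errors")
plt.show()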

Critical Analysis:
In this lab on linear regression, we learned that regression analysis consists of a
set of machine learning methods that allow us to predict a continuous outcome
variable (y) based on the value of one or more predictor variables (x). It assumes
a linear relationship between the outcome and the predictor variables. Linear
regression is one of the simplest and most popular machine learning algorithms.
It is a statistical method used for predictive analysis, and it makes predictions
for continuous/real-valued variables such as sales, salary, age, and product price.
Moreover, when the model is fitted iteratively rather than with the closed-form
formulas of Activity 1, the steps involved are the following (a code sketch of these
steps is given after the list):
1. Initialize the parameters.
2. Predict the value of the dependent variable for a given independent variable.
3. Calculate the error in prediction for all data points.
4. Calculate the partial derivatives of the cost with respect to a0 and a1.
5. Calculate the cost for each data point and sum them.
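The following sketch is not part of the original lab code; it implements the five steps above with plain gradient descent on the same sample data used in Activity 1. The learning rate and iteration count are illustrative choices, and a0/a1 correspond to the intercept and slope.

import numpy as np

# sample data from Activity 1
x = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9], dtype=float)
y = np.array([1, 3, 2, 5, 7, 8, 8, 9, 10, 12], dtype=float)

# step 1: initialize the parameters (a0 = intercept, a1 = slope)
a0, a1 = 0.0, 0.0
learning_rate = 0.01          # illustrative choice
n = len(x)

for _ in range(5000):         # illustrative iteration count
    # step 2: predict the dependent variable for each independent variable
    y_pred = a0 + a1 * x

    # step 3: error in prediction for all data points
    error = y_pred - y

    # step 5: cost for each data point, summed (mean squared error)
    cost = np.sum(error ** 2) / n

    # step 4: partial derivatives of the cost w.r.t. a0 and a1
    d_a0 = (2 / n) * np.sum(error)
    d_a1 = (2 / n) * np.sum(error * x)

    # gradient descent update
    a0 -= learning_rate * d_a0
    a1 -= learning_rate * d_a1

print("a0 =", a0, "a1 =", a1)  # should approach the closed-form b_0 and b_1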
