
NAME : ISMAIL MUHAMMAD GORAIYA

ROLL NO : F2022132029
JOB NO:6
Title : LINEAR REGRESSION METHOD
Introduction:
Linear regression analysis is used to predict the value of a variable based on the value of another
variable. The variable you want to predict is called the dependent variable. The variable you are
using to predict the other variable's value is called the independent variable.

WHY USE LINEAR REGRESSION?


The goal of simple linear regression is to predict the value of a dependent variable from an
independent variable. The stronger the linear relationship between the independent variable and
the dependent variable, the more accurate the prediction.
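
For reference, the model fitted in this job is the standard simple linear regression line (a textbook formulation, consistent with the predict function used in the program below):

    \hat{y} = m x + b

where \hat{y} is the predicted value of the dependent variable, x is the independent variable, m is the slope, and b is the intercept.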
Procedure:
• Generate Sample Data: It creates some random data points that roughly follow a linear
pattern.
• Define the Linear Function: A function predict is created to calculate the predicted
values based on the input parameters.
• Compute the Cost Function: This function compute_cost measures how well the
predicted values match the actual values by using Mean Squared Error. It calculates the
average squared difference between the predicted and actual values.
• Gradient Descent: This is the core algorithm for optimizing the values of the slope and
intercept (m and b) so as to minimize the cost function. It adjusts them iteratively until the
predicted values move closer to the actual ones (the update equations are given after this list).
• Training the Model: It initializes the slope and intercept to zero and uses gradient
descent to update these values based on the training data for a certain number of
iterations using a specified learning rate.
• Calculate R^2 Value: R-squared (R^2) is a measure of how well the regression line fits
the data. It's calculated here to evaluate the goodness of fit of the linear regression model.
• Plotting the Results: It creates a scatter plot of the original data points and overlays a
line representing the linear regression fit based on the calculated slope and intercept.
• Print R^2 Value: Finally, it prints out the R^2 value to indicate how well the linear
regression model fits the data.
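
The quantities referred to in the steps above can be written compactly as follows; this is a standard formulation that matches the program below rather than text from the original hand-out:

    J(m, b) = \frac{1}{N} \sum_{i=1}^{N} \left( y_i - (m x_i + b) \right)^2

    \frac{\partial J}{\partial m} = -\frac{2}{N} \sum_{i=1}^{N} x_i \left( y_i - \hat{y}_i \right), \qquad
    \frac{\partial J}{\partial b} = -\frac{2}{N} \sum_{i=1}^{N} \left( y_i - \hat{y}_i \right)

    m \leftarrow m - \alpha \, \frac{\partial J}{\partial m}, \qquad
    b \leftarrow b - \alpha \, \frac{\partial J}{\partial b}

    R^2 = 1 - \frac{SS_{res}}{SS_{tot}}, \qquad
    SS_{res} = \sum_{i} (y_i - \hat{y}_i)^2, \qquad
    SS_{tot} = \sum_{i} (y_i - \bar{y})^2

where \hat{y}_i = m x_i + b is the prediction for the i-th point, \alpha is the learning rate, and \bar{y} is the mean of the observed y values.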

Programme:

import numpy as np
import matplotlib.pyplot as plt

#1 . Generate Sample Data


np.random.seed(0)
x = 2 * np.random.randn(100,1)
y = 4 + 2 * x + np.random.randn(100,1)
#2 . Define the Linear Function
def predict(m, b, x):
    return m * x + b

#3 . Compute The Cost Function (Mean Squared Error)
def compute_cost(m, b, X, Y):
    # use the function's own X argument, not the global x
    total_cost = np.sum((Y - predict(m, b, X)) ** 2) / len(Y)
    return total_cost

#4 . Gradient Descent
def gradient_descent(X, Y, m, b, learning_rate, iterations):
    N = len(Y)
    for _ in range(iterations):
        Y_pred = predict(m, b, X)
        dm = -(2/N) * np.sum(X * (Y - Y_pred))  # gradient of MSE w.r.t. slope
        db = -(2/N) * np.sum(Y - Y_pred)        # gradient of MSE w.r.t. intercept
        m = m - learning_rate * dm
        b = b - learning_rate * db
    return m, b

#5 . Training the Model


m, b = 0, 0 #initial values
iterations = 1000
learning_rate = 0.01

m, b = gradient_descent(x, y, m, b, learning_rate, iterations)

#6 . Calculate R^2 Value


y_pred = predict(m, b, x)
SS_res = np.sum((y - y_pred) ** 2)
SS_tot = np.sum((y - np.mean(y)) ** 2)
r_squared = 1 - (SS_res / SS_tot)

#7 . Plotting the Results


plt.scatter(x, y, color='blue')
plt.plot(x, y_pred, color='red')

plt.xlabel('X')
plt.ylabel('Y')
plt.title('Linear Regression Fit')
plt.show()

#8 . Print R^2 Value


print('R^2 Value:', r_squared)
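
As an optional sanity check (not part of the original job sheet), the slope and intercept found by gradient descent can be compared with NumPy's closed-form least-squares fit via np.polyfit; close agreement between the two indicates that gradient descent has converged:

# Optional cross-check (not in the original program):
# np.polyfit fits a degree-1 polynomial by least squares and returns
# the coefficients as [slope, intercept].
slope_ls, intercept_ls = np.polyfit(x.flatten(), y.flatten(), 1)
print('Gradient descent : m =', m, ' b =', b)
print('Least squares    : m =', slope_ls, ' b =', intercept_ls)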

Output: X values, Y values, and Y-pred values.
