EX. NO: 5
SIMPLE LINEAR REGRESSION
DATE:

AIM:
To write a Python program for Simple Linear Regression.

ALGORITHM:
Step 1: Start the program
Step 2: Import the numpy, matplotlib, and scikit-learn packages
Step 3: Define the sample data (independent variable X and dependent variable y)
Step 4: Split the data into training and testing sets
Step 5: Create a LinearRegression model and fit it to the training data; the regression coefficients are obtained from the cross-deviation and the deviation about x (a sketch of this computation follows the steps)
Step 6: Predict on the test data and evaluate the model using mean squared error and the R^2 score
Step 7: Plot the regression line and print the results
Step 8: Stop the process
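
The scikit-learn program below computes the regression coefficients internally. As a reference for Step 5, a minimal sketch using only numpy is given here; the variable names Sxy, Sxx, b0, and b1 are illustrative and not part of the program below. Note that this sketch fits the full dataset, while the program fits only the training split, so the numbers will differ slightly.

# Illustrative sketch: closed-form simple linear regression with numpy only
import numpy as np

x = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)
y = np.array([1, 2, 1.3, 3.75, 3.2, 4.4, 4.7, 5.8, 6.1, 7.3])

# Cross-deviation about x: Sxy = sum((x - mean(x)) * (y - mean(y)))
Sxy = np.sum((x - x.mean()) * (y - y.mean()))
# Deviation about x: Sxx = sum((x - mean(x))**2)
Sxx = np.sum((x - x.mean()) ** 2)

# Regression coefficients: slope b1 and intercept b0
b1 = Sxy / Sxx
b0 = y.mean() - b1 * x.mean()
print("Intercept (b0):", b0, " Coefficient (b1):", b1)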
PROGRAM
# Importing necessary libraries
import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

# Sample Data (X: Independent variable, y: Dependent variable)
# Example data: X could be years of experience, and y could be salary
X = np.array([[1], [2], [3], [4], [5], [6], [7], [8], [9], [10]])  # Independent variable
y = np.array([1, 2, 1.3, 3.75, 3.2, 4.4, 4.7, 5.8, 6.1, 7.3])  # Dependent variable

# Split the data into training and testing sets (80% for training, 20% for testing)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
# Creating the model
model = LinearRegression()
# Fitting the model to the training data
model.fit(X_train, y_train)
# Making predictions on the test data
y_pred = model.predict(X_test)
# Print out the coefficients
print("Intercept (b0):", model.intercept_)
print("Coefficient (b1):", model.coef_)
# Evaluate the model
mse = mean_squared_error(y_test, y_pred)
r2 = r2_score(y_test, y_pred)
print("Mean Squared Error:", mse)
print("R^2 Score:", r2)
# Visualizing the results
plt.scatter(X_test, y_test, color='blue', label='Actual data')
plt.plot(X_test, y_pred, color='red', linewidth=2, label='Regression line')
plt.title('Simple Linear Regression')
plt.xlabel('Independent Variable (X)')
plt.ylabel('Dependent Variable (y)')
plt.legend()
plt.show()
OUTPUT
Intercept (b0): -0.01594827586206904
Coefficient (b1): [0.71767241]
Mean Squared Error: 0.22741017018430443
R^2 Score: 0.9458869315444843
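
The two metrics printed above can also be reproduced from their definitions: the mean squared error is the average of the squared residuals, and the R^2 score is one minus the ratio of the residual sum of squares to the total sum of squares. A minimal sketch with hypothetical y_true and y_hat arrays is shown below.

# Illustrative sketch: computing MSE and R^2 from their definitions
import numpy as np

y_true = np.array([2.0, 1.3, 4.4])  # hypothetical actual values
y_hat = np.array([1.8, 1.5, 4.1])   # hypothetical predicted values

# Mean Squared Error: average of the squared residuals
mse = np.mean((y_true - y_hat) ** 2)

# R^2 score: 1 - (residual sum of squares / total sum of squares)
ss_res = np.sum((y_true - y_hat) ** 2)
ss_tot = np.sum((y_true - y_true.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

print("Mean Squared Error:", mse, " R^2 Score:", r2)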
RESULT:
Thus the computation for Simple Linear Regression was successfully completed.
