
9/24/24, 2:17 PM Linear Regression.ipynb - Colab

# Master the procedure for implementing linear regression

# Prepare the data: 10 points with a roughly linear relationship
import numpy as np
import matplotlib.pyplot as plt

# data, converted to arrays
x = np.array([3, 21, 22, 34, 54, 34, 55, 67, 89, 99])
y = np.array([1, 10, 14, 34, 44, 36, 22, 67, 79, 90])
plt.scatter(x, y)

<matplotlib.collections.PathCollection at 0x7ad4ebcef640>

# Define functions
# Model function: the linear regression model a*x + b
def model(a, b, x):
    return a * x + b

# Loss function: mean squared error (MSE)
def loss_function(a, b, x, y):
    num = len(x)
    predict = model(a, b, x)
    return (0.5 / num) * (np.square(predict - y)).sum()

# Optimization function: computes the partial derivatives of the loss
# with respect to a and b, then takes one gradient-descent step
# (Lr is the learning rate, defined globally below)
def optimize(a, b, x, y):
    num = len(x)
    predict = model(a, b, x)
    da = (1.0 / num) * ((predict - y) * x).sum()
    db = (1.0 / num) * ((predict - y).sum())
    a = a - Lr * da
    b = b - Lr * db
    return a, b

def iterate(a, b, x, y, frequency):
    for i in range(frequency):
        a, b = optimize(a, b, x, y)
    return a, b
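As a sanity check (this block is not in the notebook), the analytic gradients used in `optimize` can be compared against numerical central differences of the loss; since the loss is quadratic, the two should agree to many decimal places. The helper `gradients` is a hypothetical name that just restates the `da`/`db` formulas:

```python
import numpy as np

def model(a, b, x):
    return a * x + b

def loss_function(a, b, x, y):
    num = len(x)
    predict = model(a, b, x)
    return (0.5 / num) * (np.square(predict - y)).sum()

# analytic gradients: same formulas as in optimize(), without the update step
def gradients(a, b, x, y):
    num = len(x)
    predict = model(a, b, x)
    da = (1.0 / num) * ((predict - y) * x).sum()
    db = (1.0 / num) * ((predict - y).sum())
    return da, db

x = np.array([3, 21, 22, 34, 54, 34, 55, 67, 89, 99], dtype=float)
y = np.array([1, 10, 14, 34, 44, 36, 22, 67, 79, 90], dtype=float)

a0, b0, eps = 0.5, 0.5, 1e-6
da, db = gradients(a0, b0, x, y)
# central finite differences of the loss in each parameter
num_da = (loss_function(a0 + eps, b0, x, y) - loss_function(a0 - eps, b0, x, y)) / (2 * eps)
num_db = (loss_function(a0, b0 + eps, x, y) - loss_function(a0, b0 - eps, x, y)) / (2 * eps)
print(abs(da - num_da), abs(db - num_db))  # both differences should be near zero
```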

# Start the iteration

a = np.random.rand(1)
b = np.random.rand(1)
Lr = 1e-4  # learning rate (0.0001)
a, b = iterate(a, b, x, y, 1)
prediction = model(a, b, x)
loss = loss_function(a, b, x, y)
print(a, b, loss)
plt.scatter(x, y)
plt.plot(x, prediction)


[0.83093202] [0.97834249] 47.76135709891103


[<matplotlib.lines.Line2D at 0x7ad4dfd09840>]


# Perform the 2nd round: two more optimization steps

a, b = iterate(a, b, x, y, 2)
prediction = model(a, b, x)
loss = loss_function(a, b, x, y)
print(a, b, loss)
plt.scatter(x, y)
plt.plot(x, prediction)

[0.83699793] [0.97812593] 47.600622952514215


[<matplotlib.lines.Line2D at 0x7ad4dfd29510>]

# Perform the 3rd round: three more optimization steps

a, b = iterate(a, b, x, y, 3)
prediction = model(a, b, x)
loss = loss_function(a, b, x, y)
print(a, b, loss)
plt.scatter(x, y)
plt.plot(x, prediction)

[0.84471736] [0.82496964] 47.32157003401949


[<matplotlib.lines.Line2D at 0x7ad4dfad3370>]
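The runs above take only a handful of gradient steps, so the loss barely moves. A sketch (not in the notebook) of running many more steps and checking the result against NumPy's closed-form least-squares fit; the larger learning rate and the step count here are my own choices, not the notebook's:

```python
import numpy as np

def model(a, b, x):
    return a * x + b

# one gradient-descent step, with the learning rate passed in explicitly
def optimize(a, b, x, y, lr):
    num = len(x)
    predict = model(a, b, x)
    da = (1.0 / num) * ((predict - y) * x).sum()
    db = (1.0 / num) * ((predict - y).sum())
    return a - lr * da, b - lr * db

x = np.array([3, 21, 22, 34, 54, 34, 55, 67, 89, 99], dtype=float)
y = np.array([1, 10, 14, 34, 44, 36, 22, 67, 79, 90], dtype=float)

a, b = 0.0, 0.0
for _ in range(50_000):          # far more steps than the notebook's handful
    a, b = optimize(a, b, x, y, 5e-4)

slope, intercept = np.polyfit(x, y, 1)  # closed-form least-squares fit
print(a, b)              # gradient-descent estimate
print(slope, intercept)  # should be close after enough iterations
```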


Q1: Must the loss value return to zero when the raw data is modified?


A: No. Modifying the raw data does not guarantee that the loss returns to zero. The loss reaches zero only when some line passes through every data point exactly; for noisy data the minimum achievable loss is positive.
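One way to see this: with the MSE used here, the loss can reach zero only when a line fits every point exactly. A small sketch, assuming the same loss definition as the notebook:

```python
import numpy as np

def loss_function(a, b, x, y):
    num = len(x)
    predict = a * x + b
    return (0.5 / num) * (np.square(predict - y)).sum()

x = np.arange(10, dtype=float)

# exactly linear data: the true (a, b) drives the loss to zero
y_linear = 2.0 * x + 1.0
print(loss_function(2.0, 1.0, x, y_linear))  # 0.0

# noisy data: no line passes through every point, so the loss stays positive
rng = np.random.default_rng(0)
y_noisy = 2.0 * x + 1.0 + rng.normal(scale=3.0, size=x.size)
print(loss_function(2.0, 1.0, x, y_noisy))  # > 0
```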


