Lecture 02: Linear Model
Lecturer : Hongpu Liu Lecture 2-1 PyTorch Tutorial @ SLAM Research Group
Machine Learning

Prediction task: given how many hours a student studied, predict the points scored. The first three rows are labeled examples; the last row is the question to answer.

x (hours)   y (points)
1           2           Training Set
2           4           (supervised learning)
3           6
4           ?           Test Set
Model Design

Linear Model:   ŷ = x * ω + b

x (hours)   y (points)
1           2
2           4
3           6
4           ?
Model Design

To simplify the model, drop the bias term b:

    ŷ = x * ω
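The simplified model can be sketched in a few lines of Python. Passing ω as an explicit argument (unlike the global-variable style used later in the slides) is just for illustration, and the candidate weight 2.0 is a guess, not yet a learned value:

```python
def forward(x, w):
    """Linear model without bias: y_hat = x * w."""
    return x * w

# Evaluate a candidate weight on the training data.
w = 2.0  # illustrative guess
for x, y in zip([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]):
    print(x, y, forward(x, w))
```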
Linear Regression

[Figure: scatter of the training points with the true line ŷ = x * ω; x-axis: Hours (0–5), y-axis: Points (0–14).]

x (hours)   y (points)
1           2
2           4
3           6
Compute Loss (ω = 3)

x (hours)   y (points)   ŷ = x * ω   loss = (ŷ − y)²
1           2            3            1
2           4            6            4
3           6            9            9
                                      mean = 14/3
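The table above (and the ones that follow for other weights) can be reproduced with a short sketch; the variable names here are illustrative:

```python
x_data = [1.0, 2.0, 3.0]
y_data = [2.0, 4.0, 6.0]
w = 3.0  # the weight evaluated in the table above

# Per-sample squared error, then the mean over the training set.
losses = [(x * w - y) ** 2 for x, y in zip(x_data, y_data)]
mse = sum(losses) / len(losses)
print(losses, mse)  # per-sample losses 1, 4, 9; MSE = 14/3
```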
Compute Loss (ω = 4)

x (hours)   y (points)   ŷ = x * ω   loss = (ŷ − y)²
1           2            4            4
2           4            8            16
3           6            12           36
                                      mean = 56/3
Compute Loss (ω = 0)

x (hours)   y (points)   ŷ = x * ω   loss = (ŷ − y)²
1           2            0            4
2           4            0            16
3           6            0            36
                                      mean = 56/3
Compute Loss (ω = 1)

x (hours)   y (points)   ŷ = x * ω   loss = (ŷ − y)²
1           2            1            1
2           4            2            4
3           6            3            9
                                      mean = 14/3
Compute Loss (ω = 2)

x (hours)   y (points)   ŷ = x * ω   loss = (ŷ − y)²
1           2            2            0
2           4            4            0
3           6            6            0
                                      mean = 0
Loss Function & Cost Function

Loss (error on a single sample):     loss = (ŷ − y)² = (x * ω − y)²
Cost (mean over the training set):   MSE = (1/N) Σₙ (ŷₙ − yₙ)²
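The distinction can be made concrete: the loss is evaluated on one sample, while the cost (here, MSE) averages the loss over the whole training set. A minimal sketch, with ω passed explicitly for clarity:

```python
def loss(x, y, w):
    """Per-sample squared error: (y_hat - y)^2 with y_hat = x * w."""
    y_pred = x * w
    return (y_pred - y) ** 2

def cost(xs, ys, w):
    """Cost = mean of the per-sample losses over the training set (MSE)."""
    return sum(loss(x, y, w) for x, y in zip(xs, ys)) / len(xs)

x_data = [1.0, 2.0, 3.0]
y_data = [2.0, 4.0, 6.0]
print(cost(x_data, y_data, 2.0))  # 0.0 at the true weight
```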
Compute Cost

x (hours)   loss (ω=0)   loss (ω=1)   loss (ω=2)   loss (ω=3)   loss (ω=4)
1           4            1            0            1            4
2           16           4            0            4            16
3           36           9            0            9            36
MSE         56/3         14/3         0            14/3         56/3
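For this one-parameter model the best ω need not be found by search: setting d(MSE)/dω = 0 for MSE(ω) = (1/N) Σₙ (xₙω − yₙ)² gives the standard least-squares optimum ω* = Σₙ xₙyₙ / Σₙ xₙ². A quick check on the table's data:

```python
x_data = [1.0, 2.0, 3.0]
y_data = [2.0, 4.0, 6.0]

# Closed-form least-squares optimum for y_hat = x * w (no bias term).
w_star = sum(x * y for x, y in zip(x_data, y_data)) / sum(x * x for x in x_data)
print(w_star)  # 28 / 14 = 2.0, matching the zero-cost column above
```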
Linear Regression
How to draw the graph

import numpy as np
import matplotlib.pyplot as plt

x_data = [1.0, 2.0, 3.0]
y_data = [2.0, 4.0, 6.0]

def forward(x):
    return x * w

def loss(x, y):
    y_pred = forward(x)
    return (y_pred - y) * (y_pred - y)

w_list = []
mse_list = []
for w in np.arange(0.0, 4.1, 0.1):
    print('w=', w)
    l_sum = 0
    for x_val, y_val in zip(x_data, y_data):
        y_pred_val = forward(x_val)
        loss_val = loss(x_val, y_val)
        l_sum += loss_val
        print('\t', x_val, y_val, y_pred_val, loss_val)
    print('MSE=', l_sum / 3)
    w_list.append(w)
    mse_list.append(l_sum / 3)

plt.plot(w_list, mse_list)
plt.ylabel('Loss')
plt.xlabel('w')
plt.show()
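Besides drawing the curve, the sweep also lets you read off the best weight directly with argmin. This snippet repeats the sweep so it is self-contained; the names are illustrative:

```python
import numpy as np

x_data = [1.0, 2.0, 3.0]
y_data = [2.0, 4.0, 6.0]

# Sweep candidate weights and record the MSE for each.
w_list, mse_list = [], []
for w in np.arange(0.0, 4.1, 0.1):
    mse = sum((x * w - y) ** 2 for x, y in zip(x_data, y_data)) / len(x_data)
    w_list.append(w)
    mse_list.append(mse)

# The best weight is the one with the smallest recorded cost.
best = w_list[int(np.argmin(mse_list))]
print(best)  # close to 2.0, up to the 0.1 grid resolution
```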
Define the model and the loss function:

    loss = (ŷ − y)² = (x * ω − y)²

def forward(x):
    return x * w

def loss(x, y):
    y_pred = forward(x)
    return (y_pred - y) * (y_pred - y)
The list w_list stores the weights ω, and mse_list stores the cost (MSE) value for each ω.
Compute the cost at each ω in [0.0, 0.1, 0.2, …, 4.0] by sweeping with np.arange(0.0, 4.1, 0.1).
[Figure: the resulting plot of MSE versus ω; the curve reaches its minimum, MSE = 0, at ω = 2.]
Exercise