Warm-up: numpy
A third-order polynomial, trained to predict \(y=\sin(x)\) from \(-\pi\) to \(\pi\) by minimizing the squared Euclidean distance.
This implementation uses numpy to manually compute the forward pass, loss, and backward pass.
A numpy array is a generic n-dimensional array; it does not know anything about deep learning or gradients or computational graphs, and is just a way to perform generic numeric computations.
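Because numpy knows nothing about gradients, the backward pass has to apply the chain rule by hand. Writing the prediction as \(\hat{y} = a + bx + cx^2 + dx^3\) and the loss as \(L = \sum_i (\hat{y}_i - y_i)^2\), the gradients accumulated by the code are (spelled out here for reference; the original text leaves them implicit in the code):
\[
\frac{\partial L}{\partial a} = \sum_i 2(\hat{y}_i - y_i), \quad
\frac{\partial L}{\partial b} = \sum_i 2(\hat{y}_i - y_i)\,x_i, \quad
\frac{\partial L}{\partial c} = \sum_i 2(\hat{y}_i - y_i)\,x_i^2, \quad
\frac{\partial L}{\partial d} = \sum_i 2(\hat{y}_i - y_i)\,x_i^3.
\]
Gradient descent then updates each coefficient by subtracting the learning rate times its gradient.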
Running the script below prints the loss every 100 iterations, followed by the learned polynomial:
99 2972.8365442382424
199 2027.6008473509378
299 1385.3669146638174
399 948.5196470120814
499 651.0421860359312
599 448.24075324206933
699 309.8255351224149
799 215.24652218377466
899 150.54642477560583
999 106.23514681033417
1099 75.852778991015
1199 54.99708521169971
1299 40.66465230735783
1399 30.804061640724562
1499 24.01252378964626
1599 19.329687751156655
1699 16.097341150423617
1799 13.863837434678592
1899 12.318917228711637
1999 11.249207003926937
Result: y = -0.0446162580431602 + 0.8318557176242146 x + 0.007697046930183567 x^2 + -0.08979069664697652 x^3
import numpy as np
import math
# Create input and output data
x = np.linspace(-math.pi, math.pi, 2000)
y = np.sin(x)
# Randomly initialize weights
a = np.random.randn()
b = np.random.randn()
c = np.random.randn()
d = np.random.randn()
learning_rate = 1e-6
for t in range(2000):
    # Forward pass: compute predicted y
    # y = a + b x + c x^2 + d x^3
    y_pred = a + b * x + c * x ** 2 + d * x ** 3

    # Compute and print loss
    loss = np.square(y_pred - y).sum()
    if t % 100 == 99:
        print(t, loss)

    # Backprop to compute gradients of a, b, c, d with respect to loss
    grad_y_pred = 2.0 * (y_pred - y)
    grad_a = grad_y_pred.sum()
    grad_b = (grad_y_pred * x).sum()
    grad_c = (grad_y_pred * x ** 2).sum()
    grad_d = (grad_y_pred * x ** 3).sum()

    # Update weights
    a -= learning_rate * grad_a
    b -= learning_rate * grad_b
    c -= learning_rate * grad_c
    d -= learning_rate * grad_d
print(f'Result: y = {a} + {b} x + {c} x^2 + {d} x^3')
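As an optional sanity check (not part of the original tutorial), the learned coefficients can be compared against numpy's closed-form least-squares fit, which minimizes the same squared error directly. A minimal sketch, run as a separate snippet:

import numpy as np
import math

# Sanity check (not part of the tutorial script): compare gradient descent
# against numpy's closed-form least-squares polynomial fit.
x = np.linspace(-math.pi, math.pi, 2000)
y = np.sin(x)

# np.polyfit returns coefficients from the highest degree down: [d, c, b, a]
d_ls, c_ls, b_ls, a_ls = np.polyfit(x, y, 3)
print(f'Least squares: y = {a_ls} + {b_ls} x + {c_ls} x^2 + {d_ls} x^3')

With enough iterations and a small enough learning rate, the gradient-descent coefficients printed above should approach these least-squares values.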