Warm-up: numpy
Created On: Dec 03, 2020 | Last Updated: Dec 03, 2020 | Last Verified: Nov 05, 2024
A third order polynomial, trained to predict \(y=\sin(x)\) from \(-\pi\) to \(\pi\) by minimizing squared Euclidean distance.
This implementation uses numpy to manually compute the forward pass, loss, and backward pass.
A numpy array is a generic n-dimensional array; it does not know anything about deep learning or gradients or computational graphs, and is just a way to perform generic numeric computations.
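Note that a, b, c, and d below are plain Python scalars while x is a numpy array of 2000 sample points, so every expression in the training loop is vectorized: the forward pass produces all 2000 predictions at once through broadcasting. A tiny standalone illustration (the names here are only for this example):

import numpy as np

pts = np.array([0.0, 0.5, 1.0])   # a few sample points
c0, c1 = 1.0, 2.0                 # scalar coefficients
print(c0 + c1 * pts)              # broadcasting -> [1. 2. 3.]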
Script output (the loss every 100 iterations, followed by the fitted polynomial):
99 6054.871377121474
199 4020.3967849715546
299 2671.1167838104184
399 1776.08631809217
499 1182.252531183949
599 788.1682671619614
699 526.5810854108254
799 352.8996507635834
899 237.5525597300654
999 160.92530109178185
1099 110.00502306756633
1199 76.15672124456957
1299 53.649088349442785
1399 38.67714634191229
1499 28.714131880245464
1599 22.08163559676781
1699 17.66444030502923
1799 14.721309288008003
1899 12.75941116335816
1999 11.450958594262236
Result: y = 0.021470759248094426 + 0.8109150680906025 x + -0.003704063245272326 x^2 + -0.0868120716036007 x^3
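Concretely, writing the model as \(\hat{y} = a + b x + c x^2 + d x^3\) and the loss as the squared Euclidean distance over the sample points, the backward pass in the code below is just the chain rule applied by hand:

\[
L = \sum_i (\hat{y}_i - y_i)^2,
\qquad
\frac{\partial L}{\partial \hat{y}_i} = 2\,(\hat{y}_i - y_i),
\]

\[
\frac{\partial L}{\partial a} = \sum_i \frac{\partial L}{\partial \hat{y}_i},
\quad
\frac{\partial L}{\partial b} = \sum_i \frac{\partial L}{\partial \hat{y}_i}\, x_i,
\quad
\frac{\partial L}{\partial c} = \sum_i \frac{\partial L}{\partial \hat{y}_i}\, x_i^2,
\quad
\frac{\partial L}{\partial d} = \sum_i \frac{\partial L}{\partial \hat{y}_i}\, x_i^3.
\]

Each coefficient is then updated by plain gradient descent, e.g. \(a \leftarrow a - \eta\, \partial L / \partial a\), where \(\eta\) is the learning rate (1e-6 here).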
import numpy as np
import math

# Create input data and the target output
x = np.linspace(-math.pi, math.pi, 2000)
y = np.sin(x)

# Randomly initialize weights
a = np.random.randn()
b = np.random.randn()
c = np.random.randn()
d = np.random.randn()

learning_rate = 1e-6
for t in range(2000):
    # Forward pass: compute predicted y
    # y = a + b x + c x^2 + d x^3
    y_pred = a + b * x + c * x ** 2 + d * x ** 3

    # Compute and print loss
    loss = np.square(y_pred - y).sum()
    if t % 100 == 99:
        print(t, loss)

    # Backprop to compute gradients of a, b, c, d with respect to loss
    grad_y_pred = 2.0 * (y_pred - y)
    grad_a = grad_y_pred.sum()
    grad_b = (grad_y_pred * x).sum()
    grad_c = (grad_y_pred * x ** 2).sum()
    grad_d = (grad_y_pred * x ** 3).sum()

    # Update weights
    a -= learning_rate * grad_a
    b -= learning_rate * grad_b
    c -= learning_rate * grad_c
    d -= learning_rate * grad_d

print(f'Result: y = {a} + {b} x + {c} x^2 + {d} x^3')
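As an optional sanity check (not part of the original script), the gradient-descent coefficients can be compared against numpy's own least-squares polynomial fit; the two should roughly agree, and agree more closely the longer the loop above runs. This sketch assumes x, y, a, b, c, and d from the script above are still in scope:

# Sketch: compare the learned coefficients with a direct least-squares fit.
# numpy returns the coefficients in ascending order of degree: [a, b, c, d].
coeffs = np.polynomial.polynomial.polyfit(x, y, deg=3)
print('least-squares fit   :', coeffs)
print('gradient descent fit:', [a, b, c, d])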
Total running time of the script: (0 minutes 0.235 seconds)