Commit f4ed359

Micrograd example (pyscript#116)
* added micrograd_ai.html and micrograd_ai.py to examples
* fix typo
1 parent ac7ee85 commit f4ed359

File tree

2 files changed: 334 additions, 0 deletions

pyscriptjs/examples/micrograd_ai.html

Lines changed: 192 additions & 0 deletions
@@ -0,0 +1,192 @@
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="utf-8" />
    <link rel="icon" type="image/x-icon" href="./favicon.png">

    <title>micrograd</title>

    <link rel="stylesheet" href="../build/pyscript.css" />
    <script defer src="../build/pyscript.js"></script>

    <py-env>
        - micrograd
        - numpy
        - matplotlib
    </py-env>

    <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.1.3/dist/css/bootstrap.min.css" rel="stylesheet" crossorigin="anonymous">
</head>
<body style="padding: 20px">
    <h1>Micrograd - A tiny Autograd engine (with a bite! :))</h1><br>
    <div>
        <p>
            <a href="https://github.com/karpathy/micrograd">Micrograd</a> is a tiny Autograd engine created
            by <a href="https://twitter.com/karpathy">Andrej Karpathy</a>. This app recreates the
            <a href="https://github.com/karpathy/micrograd/blob/master/demo.ipynb">demo</a>
            he prepared for the package, using PyScript to train a basic model, written in Python, natively in
            the browser. <br>
        </p>
    </div>
    <div>
        <p>
            You may run each Python REPL cell interactively by pressing (Shift + Enter) or (Ctrl + Enter).
            You can also modify the code directly as you wish. If you want to run all the code at once
            rather than each cell individually, click the 'Run All' button. Training the model
            takes 1 to 2 minutes if you 'Run All' at once. 'Run All' is your only option if
            you are running this on a mobile device, where you cannot press (Shift + Enter). After the
            model is trained, a plot image is displayed depicting the model's ability to
            classify the data. <br>
        </p>
        <p>
            Currently the <code>&gt;</code> symbol is imported incorrectly as <code>&amp;gt;</code> into the REPLs.
            In this app the <code>&gt;</code> symbol has therefore been replaced with <code>().__gt__()</code> so you can run the code
            without issue, e.g. instead of <code>a &gt; b</code> you will see <code>(a).__gt__(b)</code>. <br>
        </p>
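        <!-- Why the workaround is equivalent (a sketch, not part of the demo):
             Python evaluates a > b by calling a.__gt__(b), so for the plain
             ints and floats compared below the two spellings return the same bool:
                 (0.4).__gt__(0)   # True, same as 0.4 > 0
                 (-1).__gt__(0)    # False, same as -1 > 0
        -->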
        <p>
            <py-script>import js; js.document.getElementById('python-status').innerHTML = 'Python is now ready. You may proceed.'</py-script>
            <div id="python-status">Python is currently starting. Please wait...</div>
        </p>
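        <!-- A note on the readiness indicator (sketch): a py-script tag only
             executes once the Python runtime has finished loading, so the line
             above flips the #python-status text at the moment the REPLs become
             usable. -->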
        <p>
            <button id="run-all-button" class="btn btn-primary" type="submit" pys-onClick="run_all_micrograd_demo">Run All</button><br>
            <py-script src="/micrograd_ai.py"></py-script>
            <div id="micrograd-run-all-print-div"></div><br>
            <div id="micrograd-run-all-fig1-div"></div>
            <div id="micrograd-run-all-fig2-div"></div><br>
        </p>
    </div>
    <py-repl auto-generate="false">
import random
import numpy as np
import matplotlib.pyplot as plt
    </py-repl><br>
    <py-repl auto-generate="false">
from micrograd.engine import Value
from micrograd.nn import Neuron, Layer, MLP
    </py-repl><br>
    <py-repl auto-generate="true">
np.random.seed(1337)
random.seed(1337)
    </py-repl><br>
    <py-repl auto-generate="true">
# An adaptation of sklearn's make_moons function https://scikit-learn.org/stable/modules/generated/sklearn.datasets.make_moons.html
def make_moons(n_samples=100, noise=None):
    n_samples_out, n_samples_in = n_samples, n_samples

    outer_circ_x = np.cos(np.linspace(0, np.pi, n_samples_out))
    outer_circ_y = np.sin(np.linspace(0, np.pi, n_samples_out))
    inner_circ_x = 1 - np.cos(np.linspace(0, np.pi, n_samples_in))
    inner_circ_y = 1 - np.sin(np.linspace(0, np.pi, n_samples_in)) - 0.5

    X = np.vstack([np.append(outer_circ_x, inner_circ_x), np.append(outer_circ_y, inner_circ_y)]).T
    y = np.hstack([np.zeros(n_samples_out, dtype=np.intp), np.ones(n_samples_in, dtype=np.intp)])
    if noise is not None: X += np.random.normal(loc=0.0, scale=noise, size=X.shape)
    return X, y
X, y = make_moons(n_samples=100, noise=0.1)
    </py-repl><br>
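    <!-- What make_moons builds (sketch): the outer arc traces the upper half of
         the unit circle (cos/sin over [0, pi]); the inner arc is that half-circle
         mirrored, shifted right by 1 and down by 0.5, giving two interleaving
         half-moons labeled y=0 (outer) and y=1 (inner). -->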
    <py-repl auto-generate="true">
y = y*2 - 1 # make y be -1 or 1
# visualize in 2D
plt.figure(figsize=(5,5))
plt.scatter(X[:,0], X[:,1], c=y, s=20, cmap='jet')
plt
    </py-repl><br>
    <py-repl auto-generate="true">
model = MLP(2, [16, 16, 1]) # MLP with two 16-neuron hidden layers and one output neuron
print(model)
print("number of parameters", len(model.parameters()))
    </py-repl><br>
    <div>
        Line 24 of the cell below has been changed from: <br>
        <code>accuracy = [(yi &gt; 0) == (scorei.data &gt; 0) for yi, scorei in zip(yb, scores)]</code><br>
        to: <br>
        <code>accuracy = [((yi).__gt__(0)) == ((scorei.data).__gt__(0)) for yi, scorei in zip(yb, scores)]</code><br>
    </div>
    <py-repl auto-generate="true">
# loss function
def loss(batch_size=None):

    # inline DataLoader :)
    if batch_size is None:
        Xb, yb = X, y
    else:
        ri = np.random.permutation(X.shape[0])[:batch_size]
        Xb, yb = X[ri], y[ri]
    inputs = [list(map(Value, xrow)) for xrow in Xb]

    # forward the model to get scores
    scores = list(map(model, inputs))

    # svm "max-margin" loss
    losses = [(1 + -yi*scorei).relu() for yi, scorei in zip(yb, scores)]
    data_loss = sum(losses) * (1.0 / len(losses))
    # L2 regularization
    alpha = 1e-4
    reg_loss = alpha * sum((p*p for p in model.parameters()))
    total_loss = data_loss + reg_loss

    # also get accuracy
    accuracy = [((yi).__gt__(0)) == ((scorei.data).__gt__(0)) for yi, scorei in zip(yb, scores)]
    return total_loss, sum(accuracy) / len(accuracy)

total_loss, acc = loss()
print(total_loss, acc)
    </py-repl><br>
    <py-repl auto-generate="true">
# optimization
for k in range(20): # was 100; accuracy improves further (to 100%) with more epochs

    # forward
    total_loss, acc = loss()

    # backward
    model.zero_grad()
    total_loss.backward()

    # update (sgd)
    learning_rate = 1.0 - 0.9*k/100
    for p in model.parameters():
        p.data -= learning_rate * p.grad

    if k % 1 == 0:
        print(f"step {k} loss {total_loss.data}, accuracy {acc*100}%")
    </py-repl><br>
    <div>
        <p>
            Please wait for the training loop above to complete. It will not print any stats until it
            has completely finished; this typically takes 1 to 2 minutes. <br><br>

            Line 9 of the cell below has been changed from: <br>
            <code>Z = np.array([s.data &gt; 0 for s in scores])</code><br>
            to: <br>
            <code>Z = np.array([(s.data).__gt__(0) for s in scores])</code><br>
        </p>
    </div>
    <py-repl auto-generate="true">
h = 0.25
x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1
y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1
xx, yy = np.meshgrid(np.arange(x_min, x_max, h),
                     np.arange(y_min, y_max, h))
Xmesh = np.c_[xx.ravel(), yy.ravel()]
inputs = [list(map(Value, xrow)) for xrow in Xmesh]
scores = list(map(model, inputs))
Z = np.array([(s.data).__gt__(0) for s in scores])
Z = Z.reshape(xx.shape)

fig = plt.figure()
plt.contourf(xx, yy, Z, cmap=plt.cm.Spectral, alpha=0.8)
plt.scatter(X[:, 0], X[:, 1], c=y, s=40, cmap=plt.cm.Spectral)
plt.xlim(xx.min(), xx.max())
plt.ylim(yy.min(), yy.max())
plt
    </py-repl><br>
    <py-repl auto-generate="true">
1+1
    </py-repl><br>
</body>
</html>
<!-- Adapted by Mat Miller -->

pyscriptjs/examples/micrograd_ai.py

Lines changed: 142 additions & 0 deletions
@@ -0,0 +1,142 @@
# Credit: https://github.com/karpathy/micrograd/blob/master/demo.ipynb
#cell
import random
import numpy as np
import matplotlib.pyplot as plt
import datetime

#cell
from micrograd.engine import Value
from micrograd.nn import Neuron, Layer, MLP

print_statements = []

def run_all_micrograd_demo(*args, **kwargs):
    result = micrograd_demo()
    pyscript.write('micrograd-run-all-fig2-div', result)

def print_div(o):
    o = str(o)
    print_statements.append(o + ' \n<br>')
    pyscript.write('micrograd-run-all-print-div', ''.join(print_statements))
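# Note on print_div (sketch): pyscript.write replaces the target element's
# content on every call, so each new line is appended to print_statements and
# the whole accumulated log is re-rendered into the div, giving the effect of
# incremental printing in the page.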

# All the demo code is wrapped in this function so that it can optionally be
# executed (called) from PyScript when the button is pressed.
def micrograd_demo(*args, **kwargs):
    """
    Runs the micrograd demo.

    *args and **kwargs do nothing; they only capture any parameters passed
    from PyScript when this function is called on a button click.
    """

    #cell
    start = datetime.datetime.now()
    print_div('Starting...')

    #cell
    np.random.seed(1337)
    random.seed(1337)

    #cell
    # An adaptation of sklearn's make_moons function https://scikit-learn.org/stable/modules/generated/sklearn.datasets.make_moons.html
    def make_moons(n_samples=100, noise=None):
        n_samples_out, n_samples_in = n_samples, n_samples

        outer_circ_x = np.cos(np.linspace(0, np.pi, n_samples_out))
        outer_circ_y = np.sin(np.linspace(0, np.pi, n_samples_out))
        inner_circ_x = 1 - np.cos(np.linspace(0, np.pi, n_samples_in))
        inner_circ_y = 1 - np.sin(np.linspace(0, np.pi, n_samples_in)) - 0.5

        X = np.vstack([np.append(outer_circ_x, inner_circ_x), np.append(outer_circ_y, inner_circ_y)]).T
        y = np.hstack([np.zeros(n_samples_out, dtype=np.intp), np.ones(n_samples_in, dtype=np.intp)])
        if noise is not None: X += np.random.normal(loc=0.0, scale=noise, size=X.shape)
        return X, y
    X, y = make_moons(n_samples=100, noise=0.1)

    #cell
    y = y*2 - 1 # make y be -1 or 1
    # visualize in 2D
    plt.figure(figsize=(5,5))
    plt.scatter(X[:,0], X[:,1], c=y, s=20, cmap='jet')
    plt
    pyscript.write('micrograd-run-all-fig1-div', plt)

    #cell
    model = MLP(2, [16, 16, 1]) # MLP with two 16-neuron hidden layers and one output neuron
    print_div(model)
    print_div(("number of parameters", len(model.parameters())))

    #cell
    # loss function
    def loss(batch_size=None):
        # inline DataLoader :)
        if batch_size is None:
            Xb, yb = X, y
        else:
            ri = np.random.permutation(X.shape[0])[:batch_size]
            Xb, yb = X[ri], y[ri]
        inputs = [list(map(Value, xrow)) for xrow in Xb]

        # forward the model to get scores
        scores = list(map(model, inputs))

        # svm "max-margin" loss
        losses = [(1 + -yi*scorei).relu() for yi, scorei in zip(yb, scores)]
        data_loss = sum(losses) * (1.0 / len(losses))
        # L2 regularization
        alpha = 1e-4
        reg_loss = alpha * sum((p*p for p in model.parameters()))
        total_loss = data_loss + reg_loss

        # also get accuracy
        accuracy = [((yi).__gt__(0)) == ((scorei.data).__gt__(0)) for yi, scorei in zip(yb, scores)]
        return total_loss, sum(accuracy) / len(accuracy)

    total_loss, acc = loss()
    print_div((total_loss, acc))
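    # Worked example of the max-margin loss above (sketch): for a target
    # yi = 1 and a raw score scorei = 0.4, the term (1 + -yi*scorei).relu()
    # is relu(1 - 0.4) = 0.6; once the score exceeds 1 with the right sign,
    # the example contributes zero loss.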

    #cell
    # optimization
    for k in range(20): # was 100; accuracy improves further (to 100%) with more epochs

        # forward
        total_loss, acc = loss()

        # backward
        model.zero_grad()
        total_loss.backward()

        # update (sgd): learning rate decays linearly from 1.0 (it would reach
        # 0.109 at k=99 in the original 100-step run)
        learning_rate = 1.0 - 0.9*k/100
        for p in model.parameters():
            p.data -= learning_rate * p.grad

        if k % 1 == 0:
            # print(f"step {k} loss {total_loss.data}, accuracy {acc*100}%")
            print_div(f"step {k} loss {total_loss.data}, accuracy {acc*100}%")

    #cell
    # evaluate the model on a grid of points to visualize the decision boundary
    h = 0.25
    x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1
    y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1
    xx, yy = np.meshgrid(np.arange(x_min, x_max, h),
                         np.arange(y_min, y_max, h))
    Xmesh = np.c_[xx.ravel(), yy.ravel()]
    inputs = [list(map(Value, xrow)) for xrow in Xmesh]
    scores = list(map(model, inputs))
    Z = np.array([(s.data).__gt__(0) for s in scores])
    Z = Z.reshape(xx.shape)

    fig = plt.figure()
    plt.contourf(xx, yy, Z, cmap=plt.cm.Spectral, alpha=0.8)
    plt.scatter(X[:, 0], X[:, 1], c=y, s=40, cmap=plt.cm.Spectral)
    plt.xlim(xx.min(), xx.max())
    plt.ylim(yy.min(), yy.max())

    finish = datetime.datetime.now()
    # print(f"It took {(finish-start).seconds} seconds to run this code.")
    print_div(f"It took {(finish-start).seconds} seconds to run this code.")

    plt
    return plt
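
# Usage (sketch): the HTML page binds run_all_micrograd_demo to the 'Run All'
# button via pys-onClick, so pressing it runs the whole pipeline above and
# writes the returned plot into the micrograd-run-all-fig2-div element.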
