Commit 159cf5a: Fix review comments
1 parent f452ab3

1 file changed, +27 -27 lines


samples/core/guide/autograph.ipynb (+27 -27)
@@ -13,8 +13,8 @@
       "toc_visible": true
     },
     "kernelspec": {
-      "name": "python3",
-      "display_name": "Python 3"
+      "name": "python2",
+      "display_name": "Python 2"
     }
   },
   "cells": [
@@ -191,7 +191,11 @@
       "source": [
         "## Automatically convert Python control flow\n",
         "\n",
-        "AutoGraph will convert much of the Python language into the equivalent TensorFlow graph building code. It converts a function like:"
+        "AutoGraph will convert much of the Python language into the equivalent TensorFlow graph building code. \n",
+        "\n",
+        "Note: In real applications batching is essential for performance. The best code to convert to AutoGraph is code where the control flow is decided at the _batch_ level. If making decisions at the individual _example_ level, you must index and batch the examples to maintain performance while applying the control flow logic. \n",
+        "\n",
+        "AutoGraph converts a function like:"
       ]
     },
     {
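The batching note added in this hunk can be illustrated without TensorFlow. A minimal plain-Python sketch (the function names are illustrative, not from the notebook) contrasting a batch-level decision with example-level decisions that are regrouped so one vectorizable branch applies to each group:

```python
def scale_if_mostly_positive(batch):
    # Batch-level decision: one conditional governs the whole batch, so the
    # converted graph needs only a single conditional.
    if sum(1 for x in batch if x > 0) > len(batch) / 2:
        return [2 * x for x in batch]
    return batch

def scale_positives(batch):
    # Example-level decisions: to keep performance, partition ("index and
    # batch") the examples and apply one uniform branch per group, rather
    # than branching inside a per-example loop.
    pos = [2 * x for x in batch if x > 0]
    nonpos = [x for x in batch if x <= 0]
    return pos + nonpos
```

The first form converts to a single graph conditional; the second keeps each branch dense over its own sub-batch.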
@@ -321,9 +325,8 @@
         "\n",
         "tf_sum_even = autograph.to_graph(sum_even)\n",
         "\n",
-        "with tf.Graph().as_default(): \n",
-        "  with tf.Session():\n",
-        "    print('Graph result: %d\\n\\n' % tf_sum_even(tf.constant([10,12,15,20])).eval())"
+        "with tf.Graph().as_default(), tf.Session() as sess:\n",
+        "  print('Graph result: %d\\n\\n' % sess.run(tf_sum_even(tf.constant([10,12,15,20]))))"
       ],
       "execution_count": 0,
       "outputs": []
@@ -393,7 +396,7 @@
       "source": [
         "## Examples\n",
         "\n",
-        "Let's demonstrate some useful Python language features."
+        "Let's demonstrate some useful Python language features.\n"
       ]
     },
     {
@@ -419,14 +422,13 @@
         "@autograph.convert()\n",
         "def inverse(x):\n",
         "  assert x != 0.0, 'Do not pass zero!'\n",
-        "  return 1.0/x\n",
+        "  return 1.0 / x\n",
         "\n",
-        "with tf.Graph().as_default(): \n",
-        "  with tf.Session():\n",
-        "    try:\n",
-        "      print(inverse(tf.constant(0.0)).eval())\n",
-        "    except tf.errors.InvalidArgumentError as e:\n",
-        "      print('Got error message:\\n %s' % e.message)"
+        "with tf.Graph().as_default(), tf.Session() as sess:\n",
+        "  try:\n",
+        "    print(sess.run(inverse(tf.constant(0.0))))\n",
+        "  except tf.errors.InvalidArgumentError as e:\n",
+        "    print('Got error message:\\n %s' % e.message)"
       ],
       "execution_count": 0,
       "outputs": []
@@ -459,9 +461,8 @@
         "    i += 1\n",
         "  return n\n",
         "  \n",
-        "with tf.Graph().as_default():\n",
-        "  with tf.Session():\n",
-        "    count(tf.constant(5)).eval()"
+        "with tf.Graph().as_default(), tf.Session() as sess:\n",
+        "  sess.run(count(tf.constant(5)))"
       ],
       "execution_count": 0,
       "outputs": []
@@ -499,9 +500,8 @@
         "  return autograph.stack(z) \n",
         "\n",
         "\n",
-        "with tf.Graph().as_default(): \n",
-        "  with tf.Session():\n",
-        "    print(arange(tf.constant(10)).eval())"
+        "with tf.Graph().as_default(), tf.Session() as sess:\n",
+        "  sess.run(arange(tf.constant(10)))"
       ],
       "execution_count": 0,
       "outputs": []
@@ -655,14 +655,14 @@
       "source": [
         "## Interoperation with `tf.Keras`\n",
         "\n",
-        "Now that you've seen the basics, let's build some real model components with autograph.\n",
+        "Now that you've seen the basics, let's build some model components with autograph.\n",
         "\n",
-        "It's relatively simple to integrate `autograph` with `tf.keras`. But remember that batchng is essential for performance. So the best candidate code for conversion to autograph is code where the control flow is decided at the _batch_ level. If decisions are made at the individual _example_ level you will still need to index and batch your examples to maintain performance while appling the control flow logic. \n",
+        "It's relatively simple to integrate `autograph` with `tf.keras`. \n",
         "\n",
         "\n",
         "### Stateless functions\n",
         "\n",
-        "For stateless functions like `collatz`, below, the easiest way to include them in a keras model is to wrap them up as a layer uisng `tf.keras.layers.Lambda`."
+        "For stateless functions, like `collatz` shown below, the easiest way to include them in a keras model is to wrap them up as a layer using `tf.keras.layers.Lambda`."
       ]
     },
     {
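The `collatz` body referenced in this hunk is not part of the diff. Assuming it implements the classic rule (halve even numbers, map odd x to 3x + 1, count steps until reaching 1), a plain-Python version would be:

```python
def collatz(x):
    # Assumed implementation of the `collatz` function named in the hunk
    # above: count the steps the Collatz sequence takes to reach 1.
    n = 0
    while x != 1:
        n += 1
        if x % 2 == 0:
            x = x // 2
        else:
            x = 3 * x + 1
    return n

print(collatz(6))  # 6 -> 3 -> 10 -> 5 -> 16 -> 8 -> 4 -> 2 -> 1, i.e. 8 steps
```

As a stateless function, this is exactly the kind of code the hunk suggests wrapping in `tf.keras.layers.Lambda` after conversion.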
@@ -711,7 +711,7 @@
         "\n",
         "<!--TODO(markdaoust) link to full examples or these referenced models.-->\n",
         "\n",
-        "The easiest way to use autograph is keras layers and models is to `@autograph.convert()` the `call` method. See the [keras guide](https://tensorflow.org/guide/keras#build_advanced_models) for details on how to build on these classes. \n",
+        "The easiest way to use AutoGraph with Keras layers and models is to `@autograph.convert()` the `call` method. See the [TensorFlow Keras guide](https://tensorflow.org/guide/keras#build_advanced_models) for details on how to build on these classes. \n",
         "\n",
         "Here is a simple example of the [stocastic network depth](https://arxiv.org/abs/1603.09382) technique :"
       ]
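Stochastic network depth randomly skips residual blocks during training and scales them at inference. A minimal pure-Python sketch of that branching decision (illustrative names and structure; the notebook's actual implementation uses a Keras `call` method):

```python
import random

def stochastic_depth_block(x, residual_fn, survival_prob=0.9, training=True):
    # Illustrative sketch, not notebook code. At inference, keep the
    # residual branch but scale it by its survival probability; during
    # training, randomly drop the branch with probability 1 - survival_prob.
    if not training:
        return x + survival_prob * residual_fn(x)
    if random.random() < survival_prob:
        return x + residual_fn(x)
    return x
```

It is this data-independent-vs-random branch that `@autograph.convert()` on `call` turns into graph conditionals.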
@@ -866,9 +866,9 @@
       "source": [
         "## Advanced example: An in-graph training loop\n",
         "\n",
-        "Since writing control flow in AutoGraph is easy, running a training loop in a TensorFlow graph should also be easy. \n",
+        "The previous section showed that AutoGraph can be used inside Keras layers and models. Keras models can also be used in AutoGraph code.\n",
         "\n",
-        "Important: While this example wraps a `tf.keras.Model` using AutoGraph, `tf.contrib.autograph` is compatible with `tf.keras` and can be used in [Keras custom layers and models](https://tensorflow.org/guide/keras#build_advanced_models). \n",
+        "Since writing control flow in AutoGraph is easy, running a training loop in a TensorFlow graph should also be easy. \n",
         "\n",
         "This example shows how to train a simple Keras model on MNIST with the entire training process—loading batches, calculating gradients, updating parameters, calculating validation accuracy, and repeating until convergence—is performed in-graph."
       ]
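The "repeat until convergence" structure described in this hunk is ordinary Python control flow from AutoGraph's point of view. A tiny dependency-free sketch of such a loop (assumed learning rate and tolerance; minimizing a 1-D quadratic stands in for the MNIST training step):

```python
def train_until_convergence(grad_fn, w0, lr=0.1, tol=1e-6, max_steps=1000):
    # Plain-Python version of the convergence loop the notebook runs
    # in-graph: take update steps until the parameter stops moving.
    w = w0
    for _ in range(max_steps):
        w_new = w - lr * grad_fn(w)
        if abs(w_new - w) < tol:
            return w_new
        w = w_new
    return w

# Minimize (w - 3)^2; its gradient is 2 * (w - 3).
w_star = train_until_convergence(lambda w: 2 * (w - 3.0), w0=0.0)
```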
@@ -1083,4 +1083,4 @@
       "outputs": []
     }
   ]
-}
+}
