Commit 3704876

remove more spaces

1 parent 59667bd

File tree: 1 file changed (+3 −3 lines)


doc/gettingstarted.txt

Lines changed: 3 additions & 3 deletions
@@ -331,7 +331,7 @@ The NLL of our classifier is a differentiable surrogate for the zero-one loss,
 and we use the gradient of this function over our training data as a
 supervised learning signal for deep learning of a classifier.
 
-This can be computed using the following line of code :
+This can be computed using the following line of code:
 
 .. code-block:: python
 
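For context, the tutorial text in this hunk is about taking the gradient of the NLL with respect to the model parameters. A minimal NumPy sketch of that computation for a softmax classifier (hypothetical helper names; the tutorial itself does this symbolically in Theano with `T.grad`):

```python
import numpy as np

def softmax(z):
    # Subtract the row-wise max so the exponentials cannot overflow.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def nll_and_grad(W, x, y):
    """Mean negative log-likelihood over a batch and its gradient w.r.t. W.

    Hypothetical sketch, not the tutorial's code: W is (features, classes),
    x is (batch, features), y holds integer class labels.
    """
    p = softmax(x @ W)                 # class probabilities, (batch, classes)
    n = x.shape[0]
    nll = -np.log(p[np.arange(n), y]).mean()
    p[np.arange(n), y] -= 1.0          # dL/dlogits for softmax + NLL
    g_W = x.T @ p / n                  # analytic gradient w.r.t. W
    return nll, g_W
```

The returned `g_W` is exactly the supervised learning signal the paragraph refers to: the direction in which moving `W` decreases the NLL on the training batch.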
@@ -357,7 +357,7 @@ algorithm in which we repeatedly make small steps downward on an error
 surface defined by a loss function of some parameters.
 For the purpose of ordinary gradient descent we consider that the training
 data is rolled into the loss function. Then the pseudocode of this
-algorithm can be described as :
+algorithm can be described as:
 
 .. code-block:: python
 
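The pseudocode this hunk refers to is ordinary (full-batch) gradient descent: with the training data rolled into the loss, every step uses the gradient of the whole loss. A minimal runnable sketch under that assumption (hypothetical names, plain Python, not the tutorial's actual pseudocode):

```python
def gradient_descent(grad, params, learning_rate=0.1, n_steps=100):
    """Ordinary gradient descent: repeatedly step against the full gradient.

    `grad(params)` is assumed to return the gradient of the loss, with the
    training data already rolled into it, as the tutorial text describes.
    """
    for _ in range(n_steps):
        params = params - learning_rate * grad(params)
    return params

# Usage: minimize (p - 3)^2, whose gradient is 2 * (p - 3).
p = gradient_descent(lambda p: 2 * (p - 3.0), 0.0)
```

Because the loss is convex here, the iterate converges to the minimizer `p = 3` for any sufficiently small learning rate.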
@@ -425,7 +425,7 @@ but this choice is almost arbitrary (though harmless).
425425
to the batch size used.
426426

427427
All code-blocks above show pseudocode of how the algorithm looks like. Implementing such
428-
algorithm in Theano can be done as follows :
428+
algorithm in Theano can be done as follows:
429429

430430
.. code-block:: python
431431

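The algorithm this hunk introduces is minibatch stochastic gradient descent: each update uses the gradient on one small batch rather than the full training set. A NumPy stand-in for that loop (hypothetical names and signatures; the tutorial's actual implementation is in Theano, using shared variables and a compiled `theano.function`):

```python
import numpy as np

def minibatch_sgd(grad, params, data, batch_size=2, learning_rate=0.1,
                  n_epochs=10):
    """Minibatch SGD sketch: sweep the data in batches, stepping per batch.

    `grad(params, batch)` is assumed to return the gradient of the loss
    evaluated on that batch only.
    """
    n = len(data)
    for _ in range(n_epochs):
        for start in range(0, n, batch_size):
            batch = data[start:start + batch_size]
            params = params - learning_rate * grad(params, batch)
    return params

# Usage: fit a scalar p to minimize mean((p - x)^2) over the data,
# whose per-batch gradient is 2 * mean(p - batch).
data = np.array([1.0, 2.0, 3.0, 4.0])
p = minibatch_sgd(lambda p, b: 2 * np.mean(p - b), 0.0, data)
```

With a fixed learning rate the iterate does not settle exactly on the data mean (2.5) but oscillates in a small neighbourhood of it, one of the trade-offs of stochastic updates that the tutorial's discussion of batch size touches on.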