
Commit a376f41

Change references to SigmoidalLayer to HiddenLayer
The SigmoidalLayer and TanhLayer classes seem to have been merged into the HiddenLayer class, which has an activation function attribute. Several places in the tutorial still refer to SigmoidalLayer, so I updated them to HiddenLayer and took care to specify the T.nnet.sigmoid activation function where appropriate.
1 parent e0ce8d0 commit a376f41
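For context, a minimal sketch of the merged class the commit message describes. This is an assumption-labeled reconstruction following the tutorial's usual conventions (a numpy RandomState rng, fan-in/fan-out scaled uniform initialization), not the repository's exact code:

import numpy
import theano
import theano.tensor as T

class HiddenLayer(object):
    """Fully-connected layer whose nonlinearity is a constructor argument.

    Sketch only: activation=T.tanh reproduces the old TanhLayer and
    activation=T.nnet.sigmoid reproduces the old SigmoidalLayer.
    """
    def __init__(self, rng, input, n_in, n_out, activation=T.tanh):
        self.input = input
        # uniform initialization scaled by fan-in and fan-out, as in the
        # tutorial's MLP code
        bound = numpy.sqrt(6. / (n_in + n_out))
        W_values = numpy.asarray(
            rng.uniform(low=-bound, high=bound, size=(n_in, n_out)),
            dtype=theano.config.floatX)
        self.W = theano.shared(value=W_values, name='W')
        self.b = theano.shared(
            value=numpy.zeros((n_out,), dtype=theano.config.floatX),
            name='b')
        # the activation attribute is the only thing distinguishing a
        # "tanh layer" from a "sigmoid layer"
        self.output = activation(T.dot(input, self.W) + self.b)
        self.params = [self.W, self.b]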

4 files changed (+11 -10 lines changed)


code/mlp.py

Lines changed: 5 additions & 5 deletions
@@ -108,7 +108,7 @@ class MLP(object):
     A multilayer perceptron is a feedforward artificial neural network model
     that has one layer or more of hidden units and nonlinear activations.
     Intermediate layers usually have as activation function tanh or the
-    sigmoid function (defined here by a ``SigmoidalLayer`` class) while the
+    sigmoid function (defined here by a ``HiddenLayer`` class) while the
     top layer is a softmax layer (defined here by a ``LogisticRegression``
     class).
     """
@@ -136,10 +136,10 @@ def __init__(self, rng, input, n_in, n_hidden, n_out):

        """

-        # Since we are dealing with a one hidden layer MLP, this will
-        # translate into a TanhLayer connected to the LogisticRegression
-        # layer; this can be replaced by a SigmoidalLayer, or a layer
-        # implementing any other nonlinearity
+        # Since we are dealing with a one hidden layer MLP, this will translate
+        # into a HiddenLayer with a tanh activation function connected to the
+        # LogisticRegression layer; the activation function can be replaced by
+        # sigmoid or any other nonlinear function
        self.hiddenLayer = HiddenLayer(rng=rng, input=input,
                                       n_in=n_in, n_out=n_hidden,
                                       activation=T.tanh)
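As the rewritten comment notes, swapping the nonlinearity is now a one-argument change. A hypothetical variant of the call above, with the logistic sigmoid in place of tanh (rng, input, n_in and n_hidden as in the tutorial's MLP.__init__):

# same layer, but with the logistic sigmoid instead of tanh
self.hiddenLayer = HiddenLayer(rng=rng, input=input,
                               n_in=n_in, n_out=n_hidden,
                               activation=T.nnet.sigmoid)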

doc/DBN.txt

Lines changed: 1 addition & 1 deletion
@@ -189,7 +189,7 @@ the MLP, while ``self.rbm_layers`` will store the RBMs used to pretrain each
 layer of the MLP.

 Next step, we construct ``n_layers`` sigmoid layers (we use the
-``SigmoidalLayer`` class introduced in :ref:`mlp`, with the only modification
+``HiddenLayer`` class introduced in :ref:`mlp`, with the only modification
 that we replaced the non-linearity from ``tanh`` to the logistic function
 :math:`s(x) = \frac{1}{1+e^{-x}}`) and ``n_layers`` RBMs, where ``n_layers``
 is the depth of our model. We link the sigmoid layers such that they form an

doc/SdA.txt

Lines changed: 4 additions & 3 deletions
@@ -126,7 +126,7 @@ representations of intermediate layers of the MLP.
 ``self.dA_layers`` will store the denoising autoencoder associated with the layers of the MLP.

 Next step, we construct ``n_layers`` sigmoid layers (we use the
-``SigmoidalLayer`` class introduced in :ref:`mlp`, with the only
+``HiddenLayer`` class introduced in :ref:`mlp`, with the only
 modification that we replaced the non-linearity from ``tanh`` to the
 logistic function :math:`s(x) = \frac{1}{1+e^{-x}}`) and ``n_layers``
 denoising autoencoders, where ``n_layers`` is the depth of our model.
@@ -154,10 +154,11 @@ bias of the encoding part with its corresponding sigmoid layer.
            else:
                layer_input = self.sigmoid_layers[-1].output

-            sigmoid_layer = SigmoidalLayer(rng=rng,
+            sigmoid_layer = HiddenLayer(rng=rng,
                                        input=layer_input,
                                        n_in=input_size,
-                                        n_out=hidden_layers_sizes[i])
+                                        n_out=hidden_layers_sizes[i],
+                                        activation=T.nnet.sigmoid)
            # add the layer to our list of layers
            self.sigmoid_layers.append(sigmoid_layer)
doc/lenet.txt

Lines changed: 1 addition & 1 deletion
@@ -498,7 +498,7 @@ instantiate the network as follows.
                        image_shape=(batch_size, 20, 12, 12),
                        filter_shape=(50, 20, 5, 5), poolsize=(2, 2))

-    # the SigmoidalLayer being fully-connected, it operates on 2D matrices of
+    # the HiddenLayer being fully-connected, it operates on 2D matrices of
    # shape (batch_size, num_pixels) (i.e. matrix of rasterized images).
    # This will generate a matrix of shape (20, 32 * 4 * 4) = (20, 512)
    layer2_input = layer1.output.flatten(2)
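As an aside, flatten(2) keeps the first (batch) dimension and collapses the rest, which is what turns the 4D convolutional output into the 2D matrix the fully-connected HiddenLayer expects. A small self-contained check, with the shapes taken from the comment above:

import numpy
import theano
import theano.tensor as T

x = T.tensor4('x')                       # (batch, channels, height, width)
f = theano.function([x], x.flatten(2))   # keep dim 0, collapse dims 1..3

sample = numpy.zeros((20, 32, 4, 4), dtype=theano.config.floatX)
print(f(sample).shape)                   # (20, 512), i.e. (20, 32 * 4 * 4)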
