Commit 65d0d7b

Mofan Zhou committed: update theano TUT
1 parent bbeb57b

2 files changed: +4 −4 lines

theanoTUT/theano5_function.py (2 additions, 2 deletions)

@@ -13,12 +13,12 @@

 # activation function example
 x = T.dmatrix('x')
-s = 1 / (1 + T.exp(-x))
+s = 1 / (1 + T.exp(-x))    # logistic or soft step
 logistic = theano.function([x], s)
 print(logistic([[0, 1],[-1, -2]]))

 # multiply outputs for a function
-a , b = T.dmatrices('a', 'b')
+a, b = T.dmatrices('a', 'b')
 diff = a - b
 abs_diff = abs(diff)
 diff_squared = diff ** 2
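
For context, this hunk is the start of Theano's multi-output pattern: the three
expressions can be compiled into a single function that returns all of them. A
minimal runnable sketch of where the hunk is heading (the compiled function f
and the test inputs below are assumptions; the diff truncates before that point):

    import numpy as np
    import theano
    import theano.tensor as T

    # two symbolic double-precision matrices, as in the hunk above
    a, b = T.dmatrices('a', 'b')
    diff = a - b
    abs_diff = abs(diff)
    diff_squared = diff ** 2

    # theano.function accepts a list of outputs, so one compiled function
    # can return several results at once (f is a hypothetical name)
    f = theano.function([a, b], [diff, abs_diff, diff_squared])

    x1 = np.ones((2, 2))
    x2 = np.arange(4.0).reshape((2, 2))
    print(f(x1, x2))    # [difference, |difference|, difference ** 2]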

theanoTUT/theano7_activation_function.py (2 additions, 2 deletions)

@@ -7,9 +7,9 @@

 """
 The available activation functions in theano can be found at this link:
-https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cad=rja&uact=8&ved=0ahUKEwjjhYfkvJPOAhVDto8KHVciCqUQFgghMAA&url=http%3A%2F%2Fdeeplearning.net%2Fsoftware%2Ftheano%2Flibrary%2Ftensor%2Fnnet%2Fnnet.html&usg=AFQjCNE_BGK4eZpKd0vBdsn6mSjtAhBhsA&sig2=F57jVSqVcHq-M505idTbaw
+http://deeplearning.net/software/theano/library/tensor/nnet/nnet.html

-It includes but not limited to softplus, sigmoid, relu, softmax, elu, tanh...
+The activation functions include, but are not limited to, softplus, sigmoid, relu, softmax, elu, and tanh.

 For the hidden layer, we could use relu, tanh...
 For classification problems, we could use softplus or softmax for the output layer.
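
To make the docstring concrete, a small sketch applying a few of the listed
activations (this is an assumption, not part of the commit; it presumes a
Theano version where T.nnet.relu is available, roughly 0.8 and later):

    import numpy as np
    import theano
    import theano.tensor as T

    x = T.dmatrix('x')

    # activations from theano.tensor.nnet; tanh lives under theano.tensor
    activations = {
        'sigmoid': T.nnet.sigmoid(x),    # logistic / soft step
        'softplus': T.nnet.softplus(x),  # smooth approximation of relu
        'relu': T.nnet.relu(x),          # max(0, x)
        'softmax': T.nnet.softmax(x),    # rows normalized to probabilities
        'tanh': T.tanh(x),
    }

    data = np.array([[0.0, 1.0], [-1.0, -2.0]])
    for name, expr in activations.items():
        f = theano.function([x], expr)
        print(name, f(data))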
