* Computation Efficiency:
As ReLU is a simple threshold operation, both the forward and backward passes are faster to compute.
* Reduced Likelihood of Vanishing Gradient:
The gradient of ReLU is 1 for positive inputs and 0 for negative inputs, while the sigmoid activation saturates quickly: its gradient falls close to 0 for inputs even slightly above or below zero, leading to vanishing gradients (compare the two gradients in the sketch after this list).
* Sparsity:
Sparsity occurs when the input to ReLU is negative, since the output is then exactly 0. This means fewer neurons are firing (sparse activation) and the network is lighter.
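
A minimal numerical sketch (assuming NumPy) comparing the two gradients; the sample inputs in `x` are illustrative only:

```python
import numpy as np

x = np.linspace(-6, 6, 7)  # a few sample inputs

# ReLU gradient: exactly 1 for positive inputs, 0 elsewhere.
relu_grad = (x > 0).astype(float)

# Sigmoid gradient: sigma(x) * (1 - sigma(x)); it peaks at 0.25 at x = 0
# and shrinks toward 0 as |x| grows -- the vanishing-gradient effect.
sig = 1.0 / (1.0 + np.exp(-x))
sigmoid_grad = sig * (1.0 - sig)

print(relu_grad)     # [0. 0. 0. 0. 1. 1. 1.]
print(sigmoid_grad)  # values decay toward 0 away from x = 0
```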
#### 8) Given stride S and kernel sizes for each layer of a (1-dimensional) CNN, create a function to compute the [receptive field](https://www.quora.com/What-is-a-receptive-field-in-a-convolutional-neural-network) of a particular node in the network. This is just finding how many input nodes actually connect through to a neuron in a CNN. [[src](https://www.reddit.com/r/computervision/comments/7gku4z/technical_interview_questions_in_cv/)]
The receptive field is the portion of the input space that is used during an operation to generate a particular output.
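
A sketch of the requested function (Python), assuming the standard receptive-field recurrence; the layer lists are ordered from input to output, and the example values are made up:

```python
def receptive_field(kernel_sizes, strides):
    """Receptive field (in input nodes) of one output node of a 1-D CNN.

    Uses RF = 1 + sum_l (k_l - 1) * prod_{i<l} s_i, where the product
    is the cumulative stride ("jump") of the layers before layer l.
    """
    rf, jump = 1, 1
    for k, s in zip(kernel_sizes, strides):
        rf += (k - 1) * jump  # each layer widens the field by (k - 1) jumps
        jump *= s             # distance between adjacent features grows by s
    return rf

# Example: three layers, each with kernel size 3 and stride 2.
print(receptive_field([3, 3, 3], [2, 2, 2]))  # -> 15
```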