Artificial Neural Network Notes
Deep Learning
Artificial Neuron
Calculation (OR):
Calculation (XOR):
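The OR and XOR calculations above can be sketched with a single artificial neuron (a perceptron). The weights and threshold below are illustrative values chosen by hand, not taken from the notes:

```python
# A single artificial neuron: output = step(w1*x1 + w2*x2 + bias).
def neuron(x1, x2, w1, w2, bias):
    total = w1 * x1 + w2 * x2 + bias
    return 1 if total >= 0 else 0

# OR: weights (1, 1) with bias -0.5 reproduce the OR truth table.
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, "->", neuron(x1, x2, 1, 1, -0.5))

# XOR is not linearly separable, so no single choice of (w1, w2, bias)
# reproduces its truth table; a hidden layer is required.
```

Running the loop shows the OR outputs 0, 1, 1, 1; trying the same for XOR fails for every weight choice, which is the classic motivation for multi-layer networks.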
ERROR (LOSS)
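The error (loss) quantifies how far the network's predictions are from the targets. A minimal sketch, using mean squared error as one common, illustrative choice:

```python
# Mean squared error (MSE): average of squared differences
# between predictions and targets.
def mse(predictions, targets):
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(predictions)

# Near-correct predictions give a small loss.
print(mse([0.9, 0.2], [1.0, 0.0]))  # 0.025
```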
Gradient
The gradient is the vector of partial derivatives of a function with respect to
its variables. In neural networks, it measures how much a small change in each
weight affects the error, guiding the optimization process during training to
minimize the loss.
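This idea can be sketched numerically: nudge a weight slightly and measure how the loss changes. The toy one-weight model below is illustrative, not from the notes:

```python
# Toy loss: squared error of a one-weight model on input x=2, target y=6.
def loss(w):
    x, y = 2.0, 6.0
    return (w * x - y) ** 2

# Numerical gradient estimate via a central difference.
def numerical_gradient(f, w, eps=1e-5):
    return (f(w + eps) - f(w - eps)) / (2 * eps)

w = 1.0
grad = numerical_gradient(loss, w)  # analytic value: 2*(w*x - y)*x = -16
w_new = w - 0.01 * grad             # one gradient-descent step lowers the loss
print(loss(1.0), "->", loss(w_new))
```

Because the gradient points in the direction of steepest increase, stepping against it (the `w - 0.01 * grad` update) is exactly the descent step that training performs.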
Backpropagation
Backpropagation is a training algorithm for neural networks that computes the
gradient of the error with respect to each weight by propagating the error
backward from the output layer to the input layer, allowing the model to adjust
its weights and minimize the error through gradient descent.
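A minimal sketch of backpropagation on a tiny network with one hidden sigmoid unit (all values illustrative; the chain rule carries the error gradient backward through each layer):

```python
import math

# Tiny network: x -> h = sigmoid(w1*x) -> y_hat = w2*h, squared-error loss.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

x, y = 1.0, 0.0          # single training example
w1, w2 = 0.5, 0.5        # initial weights
lr = 0.1

for _ in range(100):
    # Forward pass
    h = sigmoid(w1 * x)
    y_hat = w2 * h
    # Backward pass: propagate dL/dy_hat back layer by layer
    d_yhat = 2 * (y_hat - y)        # dL/dy_hat for squared error
    d_w2 = d_yhat * h               # dL/dw2
    d_h = d_yhat * w2               # dL/dh
    d_w1 = d_h * h * (1 - h) * x    # sigmoid'(z) = h * (1 - h)
    # Gradient-descent update
    w2 -= lr * d_w2
    w1 -= lr * d_w1

print(w2 * sigmoid(w1 * x))  # prediction approaches the target 0.0
```

Each backward-pass line is one application of the chain rule; frameworks automate exactly this bookkeeping for networks with many layers.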
BIAS
Bias is an additional parameter in neural networks that is added to the
weighted sum of inputs before the activation is applied. It shifts the
activation function, allowing the model to fit patterns whose decision
boundary does not pass through the origin.
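The shifting effect can be seen directly in a one-input neuron (illustrative weights, not from the notes):

```python
# Without a bias, the decision boundary must pass through the origin;
# the bias term shifts it.
def neuron(x, w, b):
    return 1 if w * x + b >= 0 else 0

# With w=1, b=0 the neuron fires for every x >= 0.
# Adding b=-3 shifts the threshold so it fires only for x >= 3.
print(neuron(2, 1, 0))    # 1
print(neuron(2, 1, -3))   # 0
print(neuron(4, 1, -3))   # 1
```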
Sigmoid Function
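The sigmoid is the standard squashing activation, sigmoid(x) = 1 / (1 + e^(-x)), mapping any real number into (0, 1). A small sketch:

```python
import math

def sigmoid(x):
    # sigmoid(x) = 1 / (1 + e^(-x)); output always lies in (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(0))   # 0.5
print(sigmoid(10))  # close to 1
# Its derivative, sigmoid(x) * (1 - sigmoid(x)), is what backpropagation uses.
```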
Linear Function
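A linear activation, f(x) = x, passes the weighted sum through unchanged. A brief sketch of why this matters (illustrative numbers): stacking linear layers without a nonlinearity collapses into a single linear layer, which is why nonlinear activations like the sigmoid are needed.

```python
# Two linear layers with no nonlinearity in between...
w1, b1 = 2.0, 1.0
w2, b2 = 3.0, -1.0

def two_linear_layers(x):
    return w2 * (w1 * x + b1) + b2

# ...compute exactly the same function as one linear layer.
def one_linear_layer(x):
    return (w2 * w1) * x + (w2 * b1 + b2)

print(two_linear_layers(5.0) == one_linear_layer(5.0))  # True
```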