Back Propagation

Backpropagation is a supervised learning algorithm used to train artificial neural networks by adjusting the weights to minimize the error. Here’s a step-by-step explanation of the backpropagation algorithm:
1. Forward Propagation
Input to Output: Pass the input data through the neural network layer by layer to compute the predicted output (ŷ).
Activation Functions: Apply an activation function (e.g., sigmoid or ReLU) at each layer.
Error Calculation: Compute the loss (error) between the predicted output (ŷ) and the actual output (y) using a loss function like Mean Squared Error (MSE) or Cross-Entropy Loss.
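The forward pass above can be sketched in NumPy. The 2-3-1 layer sizes, sigmoid activation, and random weights are illustrative assumptions, not part of the original description:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical 2-3-1 network with randomly initialized weights
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 2)), np.zeros(3)   # hidden layer
W2, b2 = rng.normal(size=(1, 3)), np.zeros(1)   # output layer

x = np.array([0.5, -0.2])   # one input example (made-up values)
y = np.array([1.0])         # its target

# Input to Output: propagate layer by layer
z1 = W1 @ x + b1
a1 = sigmoid(z1)            # hidden activations
z2 = W2 @ a1 + b2
y_hat = sigmoid(z2)         # predicted output ŷ

# Error Calculation with Mean Squared Error
loss = 0.5 * np.sum((y_hat - y) ** 2)
```

Each layer applies a linear map followed by the activation; the loss compares the final activation with the target.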
2. Backward Propagation of Error
This step involves calculating gradients of the loss with respect to the weights using the chain
rule of calculus.
a. Compute the Error at the Output Layer
Calculate the error term at the output layer L from the derivative of the loss with respect to the output activation:
δ^(L) = (∂E/∂a^(L)) ⊙ σ'(z^(L))
where σ'(z^(L)) is the derivative of the activation function at the output layer and E is the loss.
b. Propagate the Error Backward
For each hidden layer l, compute the error gradient:
δ^(l) = ((W^(l+1))^T δ^(l+1)) ⊙ σ'(z^(l))
where:
W^(l+1): weights connecting layer l to the next layer.
δ^(l+1): error term of the next layer.
σ'(z^(l)): derivative of the activation function for the current layer.
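A minimal sketch of steps (a) and (b), reusing the same hypothetical 2-3-1 sigmoid network and MSE loss from step 1; for MSE, the derivative of the loss with respect to the output activation is (ŷ − y):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1 - s)

# Same hypothetical network and example as in step 1
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 2)), np.zeros(3)
W2, b2 = rng.normal(size=(1, 3)), np.zeros(1)
x, y = np.array([0.5, -0.2]), np.array([1.0])

# Forward pass (step 1)
z1 = W1 @ x + b1
a1 = sigmoid(z1)
z2 = W2 @ a1 + b2
y_hat = sigmoid(z2)

# (a) Error at the output layer: dE/da is (ŷ - y) for MSE
delta2 = (y_hat - y) * sigmoid_prime(z2)

# (b) Propagate the error backward through the weights (chain rule)
delta1 = (W2.T @ delta2) * sigmoid_prime(z1)
```

The transpose of the next layer's weights routes each output error back to the hidden units that contributed to it.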
3. Weight Updates
Use the computed gradients to update the weights and biases by gradient descent:
W ← W − η (∂E/∂W)
b ← b − η (∂E/∂b)
where:
η: learning rate (step size for updates).
∂E/∂W: gradient of the loss with respect to the weights, ∂E/∂W^(l) = δ^(l) (a^(l−1))^T.
∂E/∂b: gradient of the loss with respect to the biases, ∂E/∂b^(l) = δ^(l).
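The update rule can be sketched as follows; the learning rate, error term, and previous-layer activations are made-up illustrative values. A layer's weight gradient is the outer product of its error term with the previous layer's activations:

```python
import numpy as np

eta = 0.1                            # learning rate (hypothetical value)

# Suppose this layer's error term and incoming activations are known
delta = np.array([0.05])             # δ for a 1-unit layer
a_prev = np.array([0.6, 0.4, 0.9])   # activations of the previous layer

grad_W = np.outer(delta, a_prev)     # ∂E/∂W = δ · a_prev^T
grad_b = delta                       # ∂E/∂b = δ

W = np.ones((1, 3))                  # current weights (illustrative)
b = np.zeros(1)
W -= eta * grad_W                    # gradient-descent step
b -= eta * grad_b
```

Each weight moves a small step against its gradient, so the loss decreases for this example.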
4. Repeat for Multiple Iterations (Epochs)
Perform forward propagation, calculate the loss, and execute backpropagation to update weights for
all training examples.
Repeat this process over multiple iterations (epochs) until the loss converges or meets a predefined
threshold.
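Putting the four steps together, here is a sketch that trains the hypothetical sigmoid network on a toy AND dataset with per-example gradient descent; the architecture, learning rate, and epoch count are illustrative assumptions:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1 - s)

# Toy dataset: logical AND (illustrative choice)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [0], [0], [1]], dtype=float)

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(4, 2)), np.zeros(4)   # hidden layer (4 units)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)   # output layer
eta = 1.0                                       # hypothetical learning rate

for epoch in range(2000):                       # repeat until loss converges
    for x, y in zip(X, Y):
        # Step 1: forward propagation
        z1 = W1 @ x + b1; a1 = sigmoid(z1)
        z2 = W2 @ a1 + b2; y_hat = sigmoid(z2)
        # Step 2: backward propagation of error
        delta2 = (y_hat - y) * sigmoid_prime(z2)
        delta1 = (W2.T @ delta2) * sigmoid_prime(z1)
        # Step 3: weight updates
        W2 -= eta * np.outer(delta2, a1); b2 -= eta * delta2
        W1 -= eta * np.outer(delta1, x);  b1 -= eta * delta1
```

After training, the network's output should be near 1 only for the input (1, 1).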
