Added types of optimizers #559
Merged
PR Description
This PR covers various optimization algorithms used in machine learning, focusing on gradient-based methods. Each algorithm is explained with its mathematical formulation, intuition, advantages, disadvantages, and a corresponding Python implementation. The algorithms covered are Gradient Descent, Stochastic Gradient Descent (SGD), Mini-Batch Gradient Descent, Momentum, Nesterov Accelerated Gradient (NAG), AdaGrad, RMSprop, and Adam. Together they give a comprehensive overview of optimization techniques, their practical implementations, and considerations for choosing the appropriate algorithm for different scenarios and problem domains.
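For orientation, the sketch below shows the flavor of update rules the PR documents. It is not the PR's actual code: the helper functions, parameter defaults, and the toy quadratic loss are illustrative assumptions, and only two of the listed optimizers (Momentum and Adam) are shown.

```python
import numpy as np

def loss_grad(w):
    # Gradient of a toy quadratic loss f(w) = 0.5 * ||w||^2 (illustrative only)
    return w

def sgd_momentum(w, grad, velocity, lr=0.01, beta=0.9):
    # Classic momentum: accumulate a velocity, then step along it
    velocity = beta * velocity + grad
    return w - lr * velocity, velocity

def adam(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam: exponential averages of the gradient (m) and squared gradient (v),
    # with bias correction for early steps t = 1, 2, ...
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Toy comparison on a 2-D quadratic
w_mom = w_adam = np.array([3.0, -2.0])
vel = np.zeros(2)
m = v = np.zeros(2)
for t in range(1, 201):
    w_mom, vel = sgd_momentum(w_mom, loss_grad(w_mom), vel)
    w_adam, m, v = adam(w_adam, loss_grad(w_adam), m, v, t)
print("momentum:", w_mom, "adam:", w_adam)
```

Both runs converge toward the minimum at the origin; the point of the sketch is only to show how the two update rules differ in what state they carry between steps (a single velocity vs. two moment estimates).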
Related Issues
Checklist
I have pulled the latest changes from the main branch before making this PR.
Undertaking
I declare that:
I understand that any violation of this undertaking may have legal consequences that I will bear and could result in the withdrawal of any recognition associated with the work.