
Added types of optimizers #559


Merged · 3 commits merged into animator:main · May 27, 2024

Conversation

manishh12 (Contributor)

PR Description

This PR adds documentation covering the optimization algorithms commonly used in machine learning, focusing on gradient-based methods. Each algorithm is presented with its mathematical formulation, intuition, advantages, disadvantages, and a corresponding Python implementation. The algorithms covered are Gradient Descent, Stochastic Gradient Descent (SGD), Mini-Batch Gradient Descent, Momentum, Nesterov Accelerated Gradient (NAG), AdaGrad, RMSprop, and Adam. The material also discusses how to choose an appropriate algorithm for different scenarios and problem domains.
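To give a flavor of the kind of material the PR adds, here is a minimal, self-contained sketch (not copied from the PR; function names and hyperparameters are illustrative) of two of the update rules it covers, vanilla Gradient Descent and Momentum, applied to a toy quadratic objective:

```python
# Illustrative sketch of two optimizers covered by the PR (names and
# hyperparameters are chosen for this example, not taken from the PR).

def gradient_descent(grad, w0, lr=0.1, steps=200):
    """Vanilla gradient descent: w <- w - lr * grad(w)."""
    w = w0
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

def momentum(grad, w0, lr=0.1, beta=0.9, steps=200):
    """Momentum: accumulate a velocity v <- beta * v + grad(w),
    then step w <- w - lr * v."""
    w, v = w0, 0.0
    for _ in range(steps):
        v = beta * v + grad(w)
        w = w - lr * v
    return w

# Toy objective f(w) = (w - 3)^2 with gradient f'(w) = 2 * (w - 3);
# both optimizers should converge toward the minimum at w = 3.
grad = lambda w: 2.0 * (w - 3.0)
print(gradient_descent(grad, w0=0.0))
print(momentum(grad, w0=0.0))
```

On this convex one-dimensional objective both methods converge to the same minimum; the practical differences the PR discusses (momentum smoothing noisy gradients, adaptive per-parameter learning rates in AdaGrad/RMSprop/Adam) only show up on harder, higher-dimensional losses.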

Related Issues

Checklist

  • I have gone through the contributing guide
  • I have updated my branch and synced it with the project's main branch before making this PR

Undertaking

I declare that:

  1. The content I am submitting is original and has not been plagiarized.
  2. No portion of the work has been copied from any other source without proper attribution.
  3. The work has been checked for plagiarism, and I affirm its authenticity.

I understand that any violation of this undertaking may have legal consequences that I will bear and could result in the withdrawal of any recognition associated with the work.

  • I Agree

@animator (Owner)

The formulas are not rendering properly.
Learn how you can write formulas in markdown.
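For context, GitHub-flavored markdown renders math written as LaTeX between `$...$` (inline) or `$$...$$` (display) delimiters, so an update rule like gradient descent could be written as follows (the symbols here are a generic example, not taken from the PR):

```markdown
The gradient descent update rule is

$$
w_{t+1} = w_t - \eta \, \nabla f(w_t)
$$

where $\eta$ is the learning rate and $\nabla f(w_t)$ is the gradient of the loss at step $t$.
```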

@manishh12 (Contributor, Author)

> The formulas are not coming properly. Learn how you can write formulas in markdown.

@animator Done with the update. Kindly review it.

@manishh12 (Contributor, Author)

@animator Kindly review it.

animator merged commit c9b8146 into animator:main on May 27, 2024
Development

Successfully merging this pull request may close these issues:

  • Addition of Optimizers in ML

2 participants