
ENH Adding variable force_alpha to classes in naive_bayes.py #22269


Merged: 79 commits, Jul 22, 2022

Conversation

@Micky774 (Contributor) commented Jan 22, 2022

Reference Issues/PRs
Fixes #10772
Resolves #10775 (stalled)
Resolves #16747 (stalled)
Resolves #18805 (stalled)

What does this implement/fix? Explain your changes.
This PR takes over stalled PR #18805.

From the description of #16747 and #18805: "This PR adds a new variable alphaCorrection in classes in naive_bayes.py, which is set to True by default and if set to False, then for alpha=0 (or greater, but still smaller than _ALPHA_MIN) alpha is not being rounded up to _ALPHA_MIN."

This PR also updates version details in the documentation and starts a two-stage deprecation: it first adds a force_alpha=False keyword and begins a deprecation cycle to change its default to True. After the default change completes, a second deprecation cycle will begin to remove force_alpha.
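The clipping behavior that force_alpha toggles can be illustrated with a simplified, stand-alone sketch. This is not the actual scikit-learn implementation; only the value _ALPHA_MIN = 1e-10 matches the codebase, and the function name and warning text here are illustrative:

```python
import warnings

_ALPHA_MIN = 1e-10  # lower bound scikit-learn clips alpha to


def check_alpha(alpha, force_alpha=False):
    """Simplified sketch of the smoothing-parameter check this PR modifies.

    With force_alpha=False (the initial default), alpha values below
    _ALPHA_MIN are rounded up with a warning; with force_alpha=True,
    the user-supplied alpha is kept as-is, allowing an exact alpha=0.
    """
    if alpha < _ALPHA_MIN and not force_alpha:
        warnings.warn(
            f"alpha too small; setting alpha = {_ALPHA_MIN:.1e}. "
            "Pass force_alpha=True to keep alpha unchanged."
        )
        return _ALPHA_MIN
    return alpha
```

With force_alpha=True, an unsmoothed (alpha=0) model becomes possible, which is what the referenced issues asked for.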

Any other comments?
Follow-up PR to begin deprecation for removal of force_alpha

arka204 and others added 30 commits March 22, 2020 11:06
Merging changes from the main repository
…ernoulliNB-and-MultinomialNB-when-alpha-is-close-or-equal-0

# Conflicts:
#	doc/whats_new/v0.24.rst
#	sklearn/naive_bayes.py
Co-authored-by: Julien Jerphanion <git@jjerphan.xyz>
@thomasjpfan (Member) left a comment


For the force_alpha=True everywhere in tests, I prefer pytestmark = pytest.filterwarnings in this case.

In many cases force_alpha=True does not change anything because the default alpha is 1.0.
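The module-level pytestmark pattern suggested above could look like the sketch below. The warning-message fragment is an assumption for illustration, not the exact text scikit-learn emits:

```python
import pytest

# A module-level mark applies the warning filter to every test in the
# file, instead of passing force_alpha=True in each estimator call.
# The message fragment below is illustrative only.
pytestmark = pytest.mark.filterwarnings(
    "ignore:The default value of `force_alpha`:FutureWarning"
)
```

This keeps the test bodies unchanged while silencing the deprecation warning file-wide.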

Co-authored-by: Thomas J. Fan <thomasjpfan@gmail.com>
@jjerphan (Member) left a comment


LGTM. Thank you, @Micky774.

Micky774 and others added 2 commits July 20, 2022 08:26
@thomasjpfan thomasjpfan changed the title [MRG] Adding variable force_alpha to classes in naive_bayes.py ENH Adding variable force_alpha to classes in naive_bayes.py Jul 20, 2022
@thomasjpfan (Member) left a comment


Minor nits, otherwise LGTM

Co-authored-by: Thomas J. Fan <thomasjpfan@gmail.com>
@jjerphan jjerphan merged commit 65b300e into scikit-learn:main Jul 22, 2022
greglandrum added a commit to rdkit/laplaciannb that referenced this pull request Jan 6, 2023
The NaiveBayes classifiers now have a force_alpha attribute, which is explained in:
scikit-learn/scikit-learn#22269

This PR just gets the code working; it would be good to invest some time into reading the details of that sklearn PR and making the appropriate adjustments.
Development

Successfully merging this pull request may close these issues.

BernoulliNB and MultinomialNB documentation for alpha=0
5 participants