[MRG] Relaxed criterion specification so tree plot does not force calculate entropy for any DecisionTreeClassifier #13947


Merged
merged 14 commits into from
May 27, 2019

Conversation

fhoang7
Contributor

@fhoang7 fhoang7 commented May 25, 2019

Reference Issues/PRs

Fixes #13944

What does this implement/fix? Explain your changes.

Removed the hard-coded criterion in the tree-plotting function. Instead, the criterion is passed in as an additional input to the function, and subsequent function calls were modified accordingly. The export.py test file was updated so that all tests pass with this change.
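A minimal sketch of the behavior this fix restores: after training with the default `gini` criterion, the node labels produced by `plot_tree` should report `gini`, not `entropy`. The dataset and tree parameters below are illustrative, not taken from this PR.

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so no display is required
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, plot_tree

X, y = load_iris(return_X_y=True)
clf = DecisionTreeClassifier(criterion="gini", max_depth=2,
                             random_state=0).fit(X, y)

# plot_tree returns the matplotlib annotations for each node
annotations = plot_tree(clf)
labels = "\n".join(a.get_text() for a in annotations)

# With this fix, the labels reflect the fitted criterion
assert "gini" in labels
assert "entropy" not in labels
```

Before this change, the same check would fail because the exporter always wrote `entropy` into the node labels.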

Any other comments?

Need feedback on my approach and potential next steps. I would assume that the next thing to do would be to write a separate test to ensure that the proper decision tree criteria will be printed when entropy is the desired metric.

@fhoang7 fhoang7 changed the title [WIP] Relaxed criterion specification so plot_tree no longer writes entropy… [WIP] Relaxed criterion specification so tree plot does not force calculate entropy for any DecisionTreeClassifier May 25, 2019
@fhoang7 fhoang7 changed the title [WIP] Relaxed criterion specification so tree plot does not force calculate entropy for any DecisionTreeClassifier [MRG] Relaxed criterion specification so tree plot does not force calculate entropy for any DecisionTreeClassifier May 25, 2019
Member

@jnothman jnothman left a comment


Thanks!

I think we'd like to release this in 0.21.3, but it's not clear to me whether that release will be necessary, or whether we should just put this in 0.22 for now. I'd say, for now, we'll create the 0.21.3 section, because it's still too early to rule out needing another patch release.

Please merge master and add an entry to the 0.21.3 change log at doc/whats_new/v0.21.rst. Like the other entries there, please reference this pull request with :pr: and credit yourself (and other contributors if applicable) with :user:.
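For reference, a changelog entry along these lines would fit the whats_new conventions (a sketch only; exact wording and section placement are up to the maintainers):

```rst
- |Fix| Fixed a bug where :func:`tree.plot_tree` labelled every node with
  ``entropy`` regardless of the criterion used to fit the
  :class:`tree.DecisionTreeClassifier`. :pr:`13947` by :user:`fhoang7`.
```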

@fhoang7 fhoang7 mentioned this pull request May 26, 2019
@fhoang7
Contributor Author

fhoang7 commented May 26, 2019

@jnothman How do I merge master? I thought that someone with write access could do it.

@jnothman
Member

jnothman commented May 26, 2019 via email

@fhoang7
Contributor Author

fhoang7 commented May 26, 2019

Merged master and added to the change log as you suggested. Is there anything else I need to follow up on, @jnothman, or is it out of my hands now? Also, I wasn't sure whether I added to the change log correctly; I didn't see anybody else make pull requests to amend the change log.

@jnothman
Member

I was hoping you would amend the change log in this pull request (you can just merge your change log branch into this one). The reason for merging master was only that I'd changed the change log.

@fhoang7
Contributor Author

fhoang7 commented May 26, 2019

Done! Merged my other request into this one.

fhoang7 added 4 commits May 26, 2019 20:48
Initialized gain variable with parent node negative loss and removing from _split_gain function in order to improve performance of splitting.
@fhoang7
Contributor Author

fhoang7 commented May 27, 2019

Reverted 2 commits since I was developing for another issue on this branch. Moved those changes to a separate branch.

Member

@jnothman jnothman left a comment


Thanks @fhoang7

@jnothman jnothman merged commit fa383a4 into scikit-learn:master May 27, 2019
@amueller
Member

Thanks!

jnothman pushed a commit to jnothman/scikit-learn that referenced this pull request Jun 24, 2019
koenvandevelde pushed a commit to koenvandevelde/scikit-learn that referenced this pull request Jul 12, 2019
Successfully merging this pull request may close these issues.

plot_tree always writes entropy even if gini is used
5 participants