[MRG+1] - Voting classifier flatten transform (Continuation) #9188
Conversation
You've not tested the warning code, nor silenced any warnings currently produced in testing or examples.
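For the second point, a minimal sketch of how a test exercising the old default could silence the new warning, assuming scikit-learn's ignore_warnings helper from sklearn.utils.testing (the test name here is hypothetical):

    from sklearn.utils.testing import ignore_warnings

    # Keep the new DeprecationWarning out of the test output for tests that
    # deliberately rely on the old default behaviour:
    @ignore_warnings(category=DeprecationWarning)
    def test_transform_with_default_flatten():
        ...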
@@ -163,6 +175,10 @@ def fit(self, X, y, sample_weight=None):
        if n_isnone == len(self.estimators):
            raise ValueError('All estimators are None. At least one is '
                             'required to be a classifier!')

        if not self.flatten_transform and self.voting is 'soft':
This condition means the user who wants flatten_transform=False will always get the warning. We can avoid this by defaulting to some special marker, e.g. 'default'. We can tell the user in the warning message that setting it explicitly will silence the warning.
Otherwise it's looking good.
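For illustration, a minimal sketch of the sentinel-default idea being suggested here (class and method names are made up, simplified from the actual estimator):

    import warnings


    class SentinelDemo:
        def __init__(self, voting='soft', flatten_transform='default'):
            self.voting = voting
            # 'default' means "the user did not set the parameter explicitly".
            self.flatten_transform = flatten_transform

        def _warn_if_unset(self):
            if self.voting == 'soft' and self.flatten_transform == 'default':
                warnings.warn("'flatten_transform' default value will be "
                              "changed to True in 0.21. Setting it explicitly "
                              "will silence this warning.", DeprecationWarning)


    SentinelDemo()._warn_if_unset()                         # warns: default left untouched
    SentinelDemo(flatten_transform=False)._warn_if_unset()  # silent: set explicitly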
Force-pushed from bf29f5f to 0926b4c
Thanks @jnothman. Changes have been made.
doc/whats_new.rst
Outdated
- Added ``flatten_transform`` parameter to :class:`ensemble.VotingClassifier`
  to change output shape of `transform` method to 2 dimensional.
  :issue:`7794` by `Ibraim Ganiev <olologin>` and
Missing :user: here and in the next line.

                          " Setting it explicitly will silence this warning",
                          DeprecationWarning)
            warnings.warn("'flatten_transform' default value will be "
                          "changed to True in 0.21.", DeprecationWarning)
I just meant to add here: To silence this warning you may explicitly set flatten_transform=False.
                          DeprecationWarning)
            warnings.warn("'flatten_transform' default value will be "
                          "changed to True in 0.21.", DeprecationWarning)
            self.flatten_transform = False
We shouldn't be modifying parameters. Just handle the default case in transform. In fact it makes sense to only raise this warning in transform as it won't be relevant to the majority of users who only care about prediction
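A runnable sketch of what that could look like: transform reads the sentinel and decides the output itself, leaving self.flatten_transform untouched (the helper name and the standalone-function form are hypothetical, for illustration only):

    import warnings

    import numpy as np


    def _soft_transform_output(probas, flatten_transform):
        """Illustrative: pick the transform output shape for soft voting.

        `probas` has shape (n_classifiers, n_samples, n_classes).
        """
        if flatten_transform == 'default':
            warnings.warn("'flatten_transform' default value will be changed "
                          "to True in 0.21. To silence this warning you may "
                          "explicitly set flatten_transform=False.",
                          DeprecationWarning)
            return probas  # keep the old, unflattened behaviour
        if flatten_transform:
            return np.hstack(probas)  # (n_samples, n_classifiers * n_classes)
        return probas


    probas = np.random.rand(3, 4, 2)                         # 3 clfs, 4 samples, 2 classes
    print(_soft_transform_output(probas, 'default').shape)   # (3, 4, 2), plus a warning
    print(_soft_transform_output(probas, True).shape)        # (4, 6)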
I moved the warning into transform.
Yes, of course the full deprecation warning belongs in transform
The whole idea here is to maintain backwards compatibility, but to warn the users that they need to change their code to either maintain the old behaviour or adopt the new behaviour. Imagine using …
        if isinstance(self.flatten_transform,
                      str) and self.flatten_transform == 'default':
            warnings.warn("To silence this warning you may"
                          " explicitly set flatten_transform=False",
Please merge these warnings into one
        probas = self._collect_probas(X)
        if isinstance(self.flatten_transform,
                      str) and self.flatten_transform == 'default':
            warnings.warn("To silence this warning you may"
Surely the first sentence only makes sense after the second.
You've got it. LGTM!
Thanks for all your review.
Looks good apart from nitpicks.
doc/whats_new.rst
Outdated
@@ -284,6 +289,17 @@ Model evaluation and meta-estimators

- Added ``sample_weight`` parameter to :meth:`pipeline.Pipeline.score`.
  :issue:`7723` by :user:`Mikhail Korobov <kmike>`.

- ``check_estimator`` now attempts to ensure that methods transform, predict, etc.
This seems unrelated.
@@ -61,6 +62,12 @@ class VotingClassifier(_BaseComposition, ClassifierMixin, TransformerMixin):
        The number of jobs to run in parallel for ``fit``.
        If -1, then the number of jobs is set to the number of cores.

    flatten_transform : bool, optional (default='default')
shouldn't it be None by default by convention?
    flatten_transform : bool, optional (default='default')
        Affects shape of transform output only when voting='soft'
        If voting='soft' and flatten_transform=True, transform method returns
        matrix with shape (n_samples, n_classifiers * n_classes) instead of
maybe instead of "instead" say "if flatten_transform=False it returns"...
Maybe say what the current default behavior is and that it will change in the future. You can also use a versionadded
sphinx directive here.
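For example, the parameter description could read along these lines (wording and version numbers are illustrative; they assume the feature lands in 0.19 and the default flips in 0.21, and would need adjusting if it is backported differently):

    flatten_transform : bool, optional (default='default')
        Affects shape of transform output only when voting='soft'.
        If voting='soft' and flatten_transform=True, transform method returns
        matrix with shape (n_samples, n_classifiers * n_classes).
        If flatten_transform=False, it returns
        (n_classifiers, n_samples, n_classes).
        The current default behaves like False, but will change to True in
        0.21; set the parameter explicitly to avoid the DeprecationWarning.

        .. versionadded:: 0.19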
@@ -256,16 +269,30 @@ def transform(self, X):

        Returns
        -------
        If `voting='soft'`:
          array-like = [n_classifiers, n_samples, n_classes]
        If `voting='soft'` and `flatten_transform=False`:
And what if flatten_transform is True?
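For completeness, the Returns block could spell out all three cases, e.g. (illustrative wording, following the existing docstring style):

        Returns
        -------
        If `voting='soft'` and `flatten_transform=True`:
          array-like = (n_samples, n_classifiers * n_classes)
        If `voting='soft'` and `flatten_transform=False`:
          array-like = (n_classifiers, n_samples, n_classes)
        If `voting='hard'`:
          array-like = (n_samples, n_classifiers)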
                             voting='soft',
                             flatten_transform=False).fit(X, y)

    assert_array_equal(eclf1.transform(X).shape, (3, 4, 2))
Can you wrap this in an assert_warns_message? Right now the warning is raised, and assert_warns_message actually returns the value, so you can then compare the shapes.
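A sketch of what that might look like, assuming eclf1 is the estimator left at the default flatten_transform (so it warns on transform) and that the message matches the deprecation text used in the PR:

    from numpy.testing import assert_array_equal

    from sklearn.utils.testing import assert_warns_message

    # assert_warns_message returns the wrapped call's return value, so the
    # warning check and the shape check can be combined in one statement:
    res = assert_warns_message(DeprecationWarning,
                               "'flatten_transform' default value will be "
                               "changed to True in 0.21",
                               eclf1.transform, X)
    assert_array_equal(res.shape, (3, 4, 2))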
Force-pushed from 2e388bc to f1de47f
Do we want to backport this? Otherwise we need to move the whatsnew entry and change the deprecation version.
I'm okay with backporting, but I'm not about to do it
@jnothman ok. I'll do all backporting in one go just before we release, I think.
Reference Issue
Fixes #7230, continuation of #7794
What does this implement/fix? Explain your changes.
Improves tests, fixes the docstring, and fixes a PEP8 error.