[MRG+1] Housekeeping Deprecations for v0.19 #7927
Conversation
This again convinces me that #7618 would be a good idea (even as the default behavior). The SVC change alters a default parameter, and I have to change doctests all over the project. ?!
You're right. We should make more changes to public interfaces to save your maintenance effort :P
@jnothman you consider … ? Also I just really really dislike our current …
I was only joking here. I'm coming to agree with you that we should change the default.
Should be good now.
Can we decide to release 0.19 by early April 2017? We have ~4 months...
Yeah, we should release April 1st and announce that … But it sounds good as a target date. Whether we can make it obviously depends on a lot of factors (I'll be teaching, for example).
Also, `class_weight='auto'` should be removed from sklearn/linear_model/stochastic_gradient.py:498. Some docstrings in `.. versionadded::` / `.. versionchanged::` notes still reference the old deprecated value. Should we remove all trace of the old functionality? This affects `class_weight='auto'`, and the scaler's `scale_` being recommended instead of the deprecated `std_`.
Otherwise I think this is good.
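As a side note on the `scale_` recommendation above, a minimal sketch (toy data; `scale_` is the surviving attribute name on `StandardScaler`):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

X = np.array([[0.0, 10.0],
              [2.0, 14.0]])
scaler = StandardScaler().fit(X)

# Per-feature scaling factors (the standard deviations here);
# this used to be mirrored by the deprecated `std_` alias.
print(scaler.scale_)  # [1. 2.]
```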
@@ -1360,23 +1360,6 @@ Low-level methods
 Recently deprecated
 ===================

 To be removed in 0.19
ProjectedGradientNMF was missing here.
Should we list them again in the changelog? Or is that overkill?
Not sure what you're proposing, but I assume it's overkill :P
I mean "should 0.19 list everything that was removed". It might be useful but maybe not necessarily part of this PR ;) Do I have your +1?
Don't worry about it. Yes, you have my +1. I can't be bothered going through the details a second time!
Who's the dodgy one now ;)
I'm relying on the second reviewer being less dodgy??
Look, I think I did a pretty good job of catching some omissions here...
True ;)
@@ -425,20 +425,6 @@ def fit(self, X, y, store_covariance=None, tol=None):
         y : array, shape (n_samples,)
             Target values.
         """
-        if store_covariance:
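For context, the deleted block handled the deprecated fit-time parameters roughly like this sketch (a hypothetical reconstruction, not the exact removed code; the warning text is made up):

```python
import warnings

def fit(self, X, y, store_covariance=None, tol=None):
    # Deprecated fit-time parameters: warn and forward to the
    # estimator attributes that now live on __init__.
    if store_covariance:
        warnings.warn("'store_covariance' was moved to __init__ and is "
                      "deprecated as a fit parameter.", DeprecationWarning)
        self.store_covariance = store_covariance
    if tol is not None:
        warnings.warn("'tol' was moved to __init__ and is deprecated "
                      "as a fit parameter.", DeprecationWarning)
        self.tol = tol
    ...
```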
Investigating any leftovers from this, I've discovered that QDA has `store_covariances`, unlike LDA's `store_covariance` :((
open issue?
@@ -48,25 +47,16 @@ def compute_class_weight(class_weight, classes, y):
     if class_weight is None or len(class_weight) == 0:
         # uniform class weights
         weight = np.ones(classes.shape[0], dtype=np.float64, order='C')
-    elif class_weight in ['auto', 'balanced']:
+    elif class_weight == 'balanced':
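For reference, the surviving 'balanced' branch weights each class inversely to its frequency; a minimal check (toy labels):

```python
import numpy as np
from sklearn.utils.class_weight import compute_class_weight

y = np.array([0, 0, 0, 1])

# 'balanced' weights each class by n_samples / (n_classes * bincount):
# class 0 -> 4 / (2 * 3) ≈ 0.667, class 1 -> 4 / (2 * 1) = 2.0
print(compute_class_weight("balanced", classes=np.unique(y), y=y))
```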
There are still two uses of 'auto' in `compute_sample_weight`, and one more in its documentation.
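A quick way to see the per-sample counterpart mentioned above (a minimal sketch; 'balanced' is the value that remains once 'auto' is gone):

```python
import numpy as np
from sklearn.utils.class_weight import compute_sample_weight

y = np.array([0, 0, 0, 1])

# Each sample gets its class's balanced weight:
print(compute_sample_weight("balanced", y))  # ≈ [0.667 0.667 0.667 2.0]
```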
LGTM
@@ -425,20 +425,6 @@ def fit(self, X, y, store_covariance=None, tol=None):
         y : array, shape (n_samples,)
Just a nitpick, but don't you need to change l414 - l418 to say that `store_covariance` and `tol` have been removed, as you've done here?
We don't actually have a policy on that, do we? But yeah, would be better I guess.
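If we do add such a note, one possible phrasing using the numpydoc directive style the docstrings already use (a sketch, not what this PR commits):

```python
def fit(self, X, y):
    """Fit the model according to the given training data.

    .. versionchanged:: 0.19
        ``store_covariance`` and ``tol`` have been moved from ``fit``
        to the constructor (``__init__``).
    """
```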
        if tree in SPARSE_TREES:
            yield (check_sparse_input, tree, dataset, 2)
    for tree_type, dataset in product(SPARSE_TREES, ["boston", "reg_small"]):
        if tree_type in REG_TREES:
I don't understand why you changed that?
Because `tree` overwrites the `tree` module that was imported, and I (and flake8) don't like overwriting file-level module imports with local variable names.
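A minimal illustration of the shadowing issue (the estimator names here are just examples, not the actual test code):

```python
from sklearn import tree  # file-level module import

# flake8 (pyflakes F402) flags `for tree in ...:` because the loop
# variable would rebind the module name: after such a loop, `tree`
# is a plain string, not the sklearn.tree module.

# Using a distinct loop variable keeps the module import usable:
for tree_type in ["DecisionTreeClassifier", "DecisionTreeRegressor"]:
    est = getattr(tree, tree_type)()  # `tree` still names the module
    print(type(est).__name__)
```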
Good for me :)
Do we have to update what's new?
done
It's ok for me :)
Thanks for the reviews :)
Goodbye, 1200 lines of code 🎉
* remove stuff to be removed in 0.19
* more changes
* remove classes from 0.19 whatsnew
* remove _LearntSelectorMixin
* remove ProjectedGradientNMF, load_lfw_*
* minor fixes
* remove more copy from logistic regression path
* remove lda, qda from __init__.__all__
* remove pg solver in nmf from tests etc.
* remove class_weight="auto" from tests
* doctest change for decision_function_shape="ovr"
* remove transform from tree test, minor fixes to tree tests
* some fixes in the tests
* undo changes in functions which still allow 1d input...
* also allow 1d in scale
* more test fixes...
* last test fixes in forest and tree
* svm default value change doctest failures
* pep8
* remove more class_weight="auto" stuff
* minor cosmetics in docstrings for deprecated / removed behavior
* say that store_covariance has been moved to __init__ in discriminant_analysis
Early PR for CI.