[MRG+1] Housekeeping Deprecations for v0.19 #7927


Merged
merged 22 commits on Dec 9, 2016

Conversation

amueller
Member

Early PR for CI.

@amueller
Member Author

This again convinces me that #7618 would be a good idea (even as default behavior). SVC changes a default parameter and I have to change doctests all over the project. ?!
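(A hedged illustration, not code from this PR: SVC and decision_function_shape come from the commit messages further down; the point is just how one changed default propagates into doctests.)

```python
from sklearn.svm import SVC

# Any doctest that prints an estimator embeds every default value in its
# expected output, so changing a single default (e.g. SVC's
# decision_function_shape becoming 'ovr') invalidates such doctests
# project-wide.
clf = SVC()
print(clf)  # the repr lists all parameters, including the changed default
```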

@jnothman
Member

You're right. We should make more changes to public interfaces to save your maintenance effort :P

@amueller
Member Author

@jnothman you consider __repr__ an interface?

Also I just really really dislike our current __repr__. It's so useless and produces so much noise.

@jnothman
Member

I was only joking here. I'm coming to agree with you that we should change the default __repr__, though I still think its great value is in showing users what parameters they might want to set. Another thing we could do to improve our __repr__ might be ordering parameters by their position in the signature or documentation rather than alphabetically...
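A rough sketch of that last idea (an assumption about the intent only; `params_in_signature_order` is a hypothetical helper, not scikit-learn API):

```python
import inspect

from sklearn.svm import SVC

def params_in_signature_order(estimator):
    """Return (name, value) pairs ordered as in __init__, not alphabetically."""
    sig = inspect.signature(type(estimator).__init__)
    params = estimator.get_params(deep=False)
    return [(name, params[name]) for name in sig.parameters if name in params]

params_in_signature_order(SVC())  # [('C', 1.0), ('kernel', 'rbf'), ...]
```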

@amueller amueller changed the title [WIP] Housekeeping Deprecations 019 [MRG] Housekeeping Deprecations 019 Nov 29, 2016
@amueller
Member Author

Should be good now.

@raghavrv raghavrv changed the title [MRG] Housekeeping Deprecations 019 [MRG] Housekeeping Deprecations for v0.19 Dec 2, 2016
@raghavrv raghavrv added this to the 0.19 milestone Dec 2, 2016
@raghavrv
Member

raghavrv commented Dec 2, 2016

Can we decide to release 0.19 by early April 2017? We have ~4 months...

@amueller
Member Author

amueller commented Dec 2, 2016

Yeah we should release April 1st and announce that fit will no longer return self ;)

But that sounds good as a target date. Whether we can make it obviously depends on a lot of factors (I'll be teaching, for example).

@jnothman jnothman left a comment (Member)

Also, class_weight='auto' should be removed from sklearn/linear_model/stochastic_gradient.py:498.

Some docstrings in ..versionadded/..versionchanged directives still reference the old deprecated value. Should we remove all traces of the old functionality? This affects class_weight='auto', and the scaler docstrings where *scale_* is recommended instead of the deprecated *std_*.

Otherwise I think this is good.
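For context, a minimal sketch of the user-facing replacements this cleanup enforces (SGDClassifier and StandardScaler are just illustrative choices):

```python
from sklearn.linear_model import SGDClassifier
from sklearn.preprocessing import StandardScaler

# class_weight='auto' is gone; 'balanced' is the replacement preset.
clf = SGDClassifier(class_weight='balanced')

# StandardScaler keeps scale_; the deprecated std_ attribute is removed.
scaler = StandardScaler().fit([[0.0], [1.0], [2.0]])
print(scaler.scale_)
```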

@@ -1360,23 +1360,6 @@ Low-level methods
Recently deprecated
===================

To be removed in 0.19
Member

ProjectedGradientNMF was missing here.

Member Author

Should we list them again in the changelog? Or is that overkill?

Member

Not sure what you're proposing, but I assume it's overkill :P

Member Author

I mean "should 0.19 list everything that was removed". It might be useful but maybe not necessarily part of this PR ;) Do I have your +1?

Member

Don't worry about it. Yes, you have my +1. I can't be bothered going through the details a second time!

Member Author

Who's the dodgy one now ;)

Member

I'm relying on the second reviewer being less dodgy??

Member

Look, I think I did a pretty good job of catching some omissions here...

Member Author

True ;)

@@ -425,20 +425,6 @@ def fit(self, X, y, store_covariance=None, tol=None):
        y : array, shape (n_samples,)
            Target values.
        """
        if store_covariance:
Member

Investigating any leftovers from this, I've discovered that QDA has store_covariances unlike LDA's store_covariance :((

Member Author

Open an issue?

@@ -48,25 +47,16 @@ def compute_class_weight(class_weight, classes, y):
     if class_weight is None or len(class_weight) == 0:
         # uniform class weights
         weight = np.ones(classes.shape[0], dtype=np.float64, order='C')
-    elif class_weight in ['auto', 'balanced']:
+    elif class_weight == 'balanced':
Member

There are still two uses of 'auto' in compute_sample_weight, and one more in its documentation.
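For reference, a standalone sketch (not the library code) of what the surviving 'balanced' preset computes, mirroring the documented n_samples / (n_classes * np.bincount(y)) heuristic; the removed 'auto' preset used a different normalization:

```python
import numpy as np

def balanced_class_weight(y):
    """Recompute the 'balanced' heuristic by hand: n_samples / (n_classes * count)."""
    classes, counts = np.unique(y, return_counts=True)
    return dict(zip(classes, len(y) / (len(classes) * counts)))

balanced_class_weight([0, 0, 0, 1])  # {0: 0.666..., 1: 2.0}
```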

@amueller
Member Author

amueller commented Dec 7, 2016

@raghavrv wanna review? hm.. who else? @lesteve ? ;)

@jnothman
Member

jnothman commented Dec 7, 2016

LGTM

@jnothman jnothman changed the title [MRG] Housekeeping Deprecations for v0.19 [MRG+1] Housekeeping Deprecations for v0.19 Dec 7, 2016
@@ -425,20 +425,6 @@ def fit(self, X, y, store_covariance=None, tol=None):
        y : array, shape (n_samples,)
Contributor

Just a nitpick, but don't you need to change l414 - l418 to say that store_covariance and tol have been removed, as you've done here?

Member Author

We don't actually have a policy on that, do we? But yeah, it would be better I guess.
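For anyone reading along, a toy sketch of the pattern being retired here (names are illustrative, not the exact discriminant_analysis code): the constructor owns store_covariance, and the fit argument only warned during the deprecation window before this PR dropped it.

```python
import warnings

class ToyEstimator:
    def __init__(self, store_covariance=False):
        self.store_covariance = store_covariance

    def fit(self, X, y, store_covariance=None):
        # Deprecated alias kept for one release cycle, then removed in 0.19.
        if store_covariance is not None:
            warnings.warn("'store_covariance' was moved to __init__ and the "
                          "fit parameter is removed in 0.19",
                          DeprecationWarning)
            self.store_covariance = store_covariance
        # ... actual fitting would happen here ...
        return self
```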

        if tree in SPARSE_TREES:
            yield (check_sparse_input, tree, dataset, 2)
    for tree_type, dataset in product(SPARSE_TREES, ["boston", "reg_small"]):
        if tree_type in REG_TREES:
Contributor

I don't understand why you changed that?

Member Author

Because `tree` overwrites the tree module that was imported, and I (and flake8) don't like overwriting file-level module imports with local variable names.

Contributor

Good for me :)
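A condensed sketch of the shadowing issue being fixed (the tree lists and dataset names are stand-ins for the real test fixtures):

```python
from itertools import product

from sklearn import tree  # module-level import that should not be shadowed

SPARSE_TREES = ["DecisionTreeRegressor", "ExtraTreeRegressor"]
REG_TREES = {"DecisionTreeRegressor", "ExtraTreeRegressor"}

# Before: `for tree, dataset in product(...)` rebound the name `tree`,
# hiding the imported module (flake8 flags this). Renaming the loop
# variable keeps `tree` pointing at the module.
for tree_type, dataset in product(SPARSE_TREES, ["boston", "reg_small"]):
    if tree_type in REG_TREES:
        assert hasattr(tree, "DecisionTreeRegressor")  # module still visible
```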

@tguillemot
Contributor

Do we need to update what's new?
Anyway, LGTM.

@amueller
Member Author

amueller commented Dec 8, 2016

done

@tguillemot
Contributor

tguillemot commented Dec 9, 2016

It's ok for me :)
Thanks @amueller

@amueller
Member Author

amueller commented Dec 9, 2016

thanks for the reviews :)

@raghavrv
Member

raghavrv commented Dec 9, 2016

Goodbye, 1200 lines of code 🎉

sergeyf pushed a commit to sergeyf/scikit-learn that referenced this pull request Feb 28, 2017
* remove stuff to be removed 0.19

* more changes

* remove classes from 0.19 whatsnew

* remove _LearntSelectorMixin

* remove ProjectedGradientNMF, load_lwf_*

* minor fixes

* remove more copy from logistic regression path

* remove lda, qda from __init__.__all__

* remove pg solver in nmf from tests etc

* remove class_weight="auto" from tests

* doctest change for decision_function_shape="ovr"

* remove transfrom from tree test, minor fixes to tree tests

* some fixes in the tests

* undo changes in functions which still allow 1d input...

* also allow 1d in scale

* more test fixes...

* last test fixes in forest and tree

* svm default value change doctest failures

* pep8

* remove more class_weight="auto" stuff

* minor cosmetics in docstrings deprecated / removed behavior.

* say that store_covariance has been moved to __init__ in discriminant_analysis
Sundrique pushed a commit to Sundrique/scikit-learn that referenced this pull request Jun 14, 2017
NelleV pushed a commit to NelleV/scikit-learn that referenced this pull request Aug 11, 2017
paulha pushed a commit to paulha/scikit-learn that referenced this pull request Aug 19, 2017
maskani-moh pushed a commit to maskani-moh/scikit-learn that referenced this pull request Nov 15, 2017