Fix elasticnet cv sample weight #29442
Conversation
It seems like this single call to _preprocess_data suffices in all cases.
This tiny example was given in scikit-learn#22914. The test merely asserts that alpha_max is large enough to force the coefficient to 0.
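As an illustrative sketch (with hypothetical data values; the actual test in the PR may differ), the assertion boils down to:

```python
import numpy as np
from sklearn.linear_model import ElasticNet

# Hypothetical tiny weighted problem in the spirit of scikit-learn#22914.
X = np.array([[3.0], [2.0], [1.0]])
y = np.array([1.0, 2.0, 3.0])
sw = np.array([10.0, 1.0, 10.0])

# Weighted centering, mirroring _preprocess_data with fit_intercept=True.
Xc = X - np.average(X, axis=0, weights=sw)
yc = y - np.average(y, weights=sw)

# For a pure lasso (l1_ratio=1), the KKT conditions give coef == 0 whenever
# alpha >= max_j |Xc[:, j] @ (sw * yc)| / sum(sw).
alpha_max = np.max(np.abs(Xc.T @ (sw * yc))) / sw.sum()

reg = ElasticNet(alpha=alpha_max, l1_ratio=1.0)
reg.fit(X, y, sample_weight=sw)
assert np.allclose(reg.coef_, 0.0)
```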
As per the reviewer's suggestions: (1) clarify `eps=1`; (2) parameterize `fit_intercept`.
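A hypothetical sketch of that parameterization (the test body is elided):

```python
import pytest

# eps in _alpha_grid is the ratio alpha_min / alpha_max, so eps=1 collapses
# the grid to the single value alpha_max.
@pytest.mark.parametrize("fit_intercept", [True, False])
def test_enet_alpha_max_sample_weight(fit_intercept):
    ...  # test body elided
```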
(1) Give the name `n_samples` to the quantity `X.shape[0]`. (2) Clarify that `y_offset` and `X_scale` are not used, since they have already been applied to the data by `_preprocess_data`.
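A minimal sketch of that call (assuming the private helper's signature at the time of the PR; the surrounding names are illustrative):

```python
import numpy as np
from sklearn.linear_model._base import _preprocess_data

rng = np.random.RandomState(0)
X, y = rng.randn(10, 3), rng.randn(10)
sample_weight = rng.rand(10)

n_samples = X.shape[0]  # named, per the review suggestion

# The centering (and scaling) is already applied to the returned X and y,
# so y_offset and X_scale need not be used afterwards.
X, y, X_offset, y_offset, X_scale = _preprocess_data(
    X, y, fit_intercept=True, copy=True, sample_weight=sample_weight
)
```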
Co-authored-by: Olivier Grisel <olivier.grisel@ensta.org>
…ied alpha_grid_ to accommodate the MultitaskCV y shape
…ear_model/tests/test_coordinate_descent/test_enet_cv_sample_weight_correctness
test_enet_cv_sample_weight_correctness
test_lasso_cv test_enet_alpha_max_sample_weight test_enet_cv_sample_weight_correctness
A final pass of feedback. Once addressed, LGTM for merge.
Similarly to:

```python
def _more_tags(self):
    # Note: check_sample_weights_invariance(kind='ones') should work, but
    # currently we can only mark a whole test as xfail.
    return {
        "_xfail_checks": {
            "check_sample_weights_invariance": (
                "zero sample_weight is not equivalent to removing samples"
            ),
        }
    }
```

but this cannot be removed without changing …
Co-authored-by: Olivier Grisel <olivier.grisel@ensta.org>
@snath-xoc I pushed a quick fix for the linting problem that resulted from the merge of one of my GitHub-edited suggestions. The following comment still needs to be addressed before the final review / merge:
@snath-xoc for your information, I pushed 2b99c9b to address the remaining unaddressed review comment (#29442 (comment)). This PR LGTM.
Oops, I did not re-run the test with all admissible random seeds... Let me fix this.
LGTM, thanks!
Co-authored-by: Mr. Snrub <45150804+s-banach@users.noreply.github.com> Co-authored-by: Olivier Grisel <olivier.grisel@ensta.org> Co-authored-by: Shruti Nath <shrutinath@Shrutis-Laptop.local>
Fixes #22914
What does this implement/fix? Explain your changes.
Adapted from pull requests #23045 by s-banach and #29308 by snath-xoc.
Modifies the `_alpha_grid` function in `linear_model._coordinate_descent` to accept a `sample_weight` argument and implements the changes needed for compatibility with `_preprocess_data`.
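In spirit, the weighted grid computation looks like the following condensed sketch (illustrative only; the real `_alpha_grid` also handles sparse input, precomputed Gram matrices, and multi-output `y`):

```python
import numpy as np

def alpha_grid_sketch(X, y, sample_weight, l1_ratio=1.0, eps=1e-3, n_alphas=100):
    # Center X and y with the weighted mean, as _preprocess_data does when
    # fit_intercept=True.
    Xc = X - np.average(X, axis=0, weights=sample_weight)
    yc = y - np.average(y, weights=sample_weight)
    # Weighted alpha_max: the smallest alpha at which every coefficient of
    # the elastic net solution is exactly zero.
    alpha_max = np.max(np.abs(Xc.T @ (sample_weight * yc))) / (
        sample_weight.sum() * l1_ratio
    )
    # Log-spaced grid from alpha_max down to alpha_max * eps.
    return np.geomspace(alpha_max, alpha_max * eps, num=n_alphas)
```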
TODO