Fix ElasticNet CV sample weight #29442
Merged
jeremiedbb merged 39 commits into scikit-learn:main on Sep 9, 2024
Conversation
It seems like this single call to _preprocess_data suffices in all cases.
This tiny example was given in scikit-learn#22914. The test merely asserts that alpha_max is large enough to force the coefficient to 0.
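The property being asserted can be sketched against the public ElasticNet API. The data and the weighted alpha_max formula below are illustrative assumptions, not the PR's actual test code: with weights normalized to sum to one and weighted centering (as with fit_intercept=True), any alpha at or above alpha_max should drive the coefficient to zero.

```python
import numpy as np
from sklearn.linear_model import ElasticNet

# Tiny illustrative dataset in the spirit of scikit-learn#22914.
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([1.0, 2.0, 3.0, 4.0])
sample_weight = np.array([1.0, 1.0, 1.0, 4.0])

# Normalize weights to sum to 1, then center X and y by their
# weighted means, mirroring what weighted preprocessing does when
# fit_intercept=True.
sw = sample_weight / sample_weight.sum()
X_c = X - np.average(X, axis=0, weights=sw)
y_c = y - np.average(y, weights=sw)

# For l1_ratio=1 (pure lasso penalty), the smallest alpha that zeroes
# all coefficients is the max absolute weighted correlation.
alpha_max = np.abs(X_c.T @ (sw * y_c)).max()

model = ElasticNet(alpha=alpha_max, l1_ratio=1.0, fit_intercept=True)
model.fit(X, y, sample_weight=sample_weight)
# At alpha >= alpha_max the fitted coefficient is (numerically) zero.
```

At any alpha strictly below this alpha_max the coefficient would come out nonzero, which is why the test only needs to check the boundary value.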
As per reviewer's suggestions: (1) Clarify eps=1. (2) Parameterize `fit_intercept`.
(1) Give the name `n_samples` to the quantity `X.shape[0]`. (2) Clarify that `y_offset` and `X_scale` are not used, since these are already applied to the data by `_preprocess_data`.
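As a rough sketch of the weighted centering involved, here is a pure-NumPy stand-in (not scikit-learn's actual _preprocess_data implementation): once the weighted offsets have been subtracted from the data, downstream code can ignore y_offset, which is the point made above.

```python
import numpy as np

def preprocess_sketch(X, y, sample_weight):
    # Hypothetical helper: center features and target by their
    # weighted means, as weighted preprocessing does when
    # fit_intercept=True.
    sw = sample_weight / sample_weight.sum()
    X_offset = np.average(X, axis=0, weights=sw)
    y_offset = np.average(y, weights=sw)
    return X - X_offset, y - y_offset, X_offset, y_offset

rng = np.random.RandomState(42)
X = rng.randn(5, 2)
y = rng.randn(5)
sw = np.array([1.0, 2.0, 1.0, 1.0, 3.0])

X_c, y_c, X_off, y_off = preprocess_sketch(X, y, sw)
# After centering, the weighted means of the data are (numerically) zero,
# so the offsets need not be applied again downstream.
```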
Co-authored-by: Olivier Grisel <olivier.grisel@ensta.org>
…ied alpha_grid_ to accommodate for MultitaskCV y shape
…ear_model/tests/test_coordinate_descent/test_enet_cv_sample_weight_correctness
test_enet_cv_sample_weight_correctness
ogrisel
reviewed
Aug 21, 2024
test_lasso_cv, test_enet_alpha_max_sample_weight, test_enet_cv_sample_weight_correctness
ogrisel
approved these changes
Sep 4, 2024
Member
ogrisel
left a comment
A final pass of feedback. Once addressed, LGTM for merge.
Member
Similarly to

```python
def _more_tags(self):
    # Note: check_sample_weights_invariance(kind='ones') should work, but
    # currently we can only mark a whole test as xfail.
    return {
        "_xfail_checks": {
            "check_sample_weights_invariance": (
                "zero sample_weight is not equivalent to removing samples"
            ),
        }
    }
```

but this cannot be removed without changing
Co-authored-by: Olivier Grisel <olivier.grisel@ensta.org>
Member
@snath-xoc I pushed a quick fix for the linting problem that resulted from the merge of one of my GitHub-edited suggestions. The following comment still needs to be addressed before final review / merge:
jeremiedbb
reviewed
Sep 6, 2024
ogrisel
reviewed
Sep 6, 2024
Member
@snath-xoc For information, I pushed 2b99c9b to address the remaining unaddressed review comment (#29442 (comment)). This PR LGTM.
Member
Oops, I did not re-run the test with all admissible random seeds... Let me fix this.
jeremiedbb
reviewed
Sep 9, 2024
1 task
MarcBresson pushed a commit to MarcBresson/scikit-learn that referenced this pull request on Sep 19, 2024
Co-authored-by: Mr. Snrub <45150804+s-banach@users.noreply.github.com> Co-authored-by: Olivier Grisel <olivier.grisel@ensta.org> Co-authored-by: Shruti Nath <shrutinath@Shrutis-Laptop.local>
Fixes #22914
What does this implement/fix? Explain your changes.
Adapted from pull request #23045 by s-banach and #29308 by snath-xoc.
Modifies the _alpha_grid function in linear_model._coordinate_descent to accept a sample_weight argument, and implements the changes needed to stay compatible with _preprocess_data.
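A minimal sketch of the weight-aware alpha_max computation that such a sample_weight argument enables. The helper below is hypothetical (not the PR's actual _alpha_grid code) and is shown only to illustrate the correctness property the new tests check: weighting a sample by an integer k is equivalent to repeating it k times.

```python
import numpy as np

def weighted_alpha_max(X, y, sample_weight, l1_ratio=1.0):
    # Hypothetical helper mirroring what a sample_weight-aware alpha
    # grid needs: the smallest alpha that zeroes all coefficients,
    # computed on weighted-mean-centered data.
    sw = sample_weight / sample_weight.sum()
    X_c = X - np.average(X, axis=0, weights=sw)
    y_c = y - np.average(y, weights=sw)
    return np.abs(X_c.T @ (sw * y_c)).max() / l1_ratio

rng = np.random.RandomState(0)
X = rng.randn(10, 3)
y = rng.randn(10)

# Weighting the last sample by 3 ...
sw = np.array([1.0] * 9 + [3.0])
# ... should be equivalent to repeating it 3 times with unit weights.
X_rep = np.vstack([X, X[-1:], X[-1:]])
y_rep = np.concatenate([y, y[-1:], y[-1:]])
sw_rep = np.ones(12)

a1 = weighted_alpha_max(X, y, sw)
a2 = weighted_alpha_max(X_rep, y_rep, sw_rep)
# The two alpha_max values agree, so the resulting alpha grids agree too.
```

Normalizing the weights to sum to one is what makes the two cases coincide: the repeated point carries total mass 3/12 either way.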
TODO