[MRG+1] Housekeeping Deprecations for v0.19 #7927

amueller merged 22 commits into scikit-learn:master
Conversation
This again convinces me that #7618 would be a good idea (even as the default behavior). The SVC change alters a default parameter, and I have to change doctests all over the project?!

You're right. We should make more changes to public interfaces to save you maintenance effort :P

@jnothman you consider ... Also, I just really, really dislike our current ...

I was only joking here. I'm coming to agree with you that we should change the default.

Should be good now.

Can we decide to release 0.19 by early April 2017? We have ~4 months...

Yeah, we should release April 1st and announce that ... But it sounds good as a target date. Whether we can make it obviously depends on a lot of factors (I'll be teaching, for example).
jnothman
left a comment
Also, class_weight auto should be removed from sklearn/linear_model/stochastic_gradient.py:498
Some docstrings still reference the old deprecated value in `.. versionadded` / `.. versionchanged` directives. Should we remove all trace of the old functionality? This affects the 'auto' class_weight, and the scalers' *scale_* being recommended instead of the deprecated *std_*.
Otherwise I think this is good.
| Recently deprecated |
| =================== |
|                     |
| To be removed in 0.19 |
ProjectedGradientNMF was missing here.
Should we list them again in the changelog? Or is that overkill?
Not sure what you're proposing, but I assume it's overkill :P
I mean "should 0.19 list everything that was removed?". It might be useful, but maybe not necessarily part of this PR ;) Do I have your +1?
Don't worry about it. Yes, you have my +1. I can't be bothered going through the details a second time!
Who's the dodgy one now ;)
I'm relying on the second reviewer being less dodgy??
Look, I think I did a pretty good job of catching some omissions here...
| y : array, shape (n_samples,) |
|     Target values. |
| """ |
| if store_covariance: |
Investigating any leftovers from this, I've discovered that QDA has store_covariances unlike LDA's store_covariance :((
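For context, the usual way such a parameter-naming inconsistency is resolved is a deprecation alias that keeps the old keyword working for a release or two. A minimal standalone sketch of that pattern (the class and keyword handling here are hypothetical, not the actual scikit-learn implementation):

```python
import warnings


class QDAExample:
    """Hypothetical estimator showing how a misnamed keyword
    (store_covariances) could be aliased to the consistent name
    (store_covariance) during a deprecation cycle."""

    def __init__(self, store_covariance=False, store_covariances=None):
        if store_covariances is not None:
            warnings.warn("'store_covariances' was renamed to "
                          "'store_covariance' and will be removed.",
                          DeprecationWarning)
            store_covariance = store_covariances
        self.store_covariance = store_covariance


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    est = QDAExample(store_covariances=True)

print(est.store_covariance)         # True: the old keyword is still honoured
print(caught[0].category.__name__)  # DeprecationWarning
```

The old name stays usable but warns, so downstream code gets a full deprecation cycle to switch over before the alias is removed.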
| # uniform class weights |
| weight = np.ones(classes.shape[0], dtype=np.float64, order='C') |
| elif class_weight in ['auto', 'balanced']: |
| elif class_weight == 'balanced': |
There are still two uses of 'auto' in compute_sample_weight, and one more in its documentation.
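For reference, the 'balanced' heuristic that replaced 'auto' weights each class inversely proportional to its frequency, i.e. n_samples / (n_classes * bincount(y)). A small self-contained sketch (the helper name is made up for illustration; it is not scikit-learn's compute_sample_weight):

```python
import numpy as np


def balanced_class_weights(y):
    # The 'balanced' heuristic: n_samples / (n_classes * bincount(y)),
    # so rarer classes receive proportionally larger weights.
    classes, counts = np.unique(y, return_counts=True)
    weights = len(y) / (len(classes) * counts)
    return dict(zip(classes, weights))


y = np.array([0, 0, 0, 1])
w = balanced_class_weights(y)
# class 0 (3 samples) gets weight 4/6, class 1 (1 sample) gets weight 2.0
```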
LGTM
| @@ -425,20 +425,6 @@ def fit(self, X, y, store_covariance=None, tol=None): |
| y : array, shape (n_samples,) |
Just a nitpick, but don't you need to change l414-l418 to say that store_covariance and tol have been removed, as you've done here?
We don't actually have a policy on that, do we? But yeah, it would be better, I guess.
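One convention that would cover this is a `.. versionchanged::` note in the numpydoc docstring where the parameters used to be documented. A hypothetical sketch of what the fit docstring could say (the exact wording is illustrative, not from the PR):

```python
def fit(self, X, y):
    """Fit the model according to the given training data.

    Parameters
    ----------
    X : array-like, shape (n_samples, n_features)
        Training vectors.
    y : array, shape (n_samples,)
        Target values.

    .. versionchanged:: 0.19
        *store_covariance* and *tol* were moved from ``fit`` to
        ``__init__`` and are no longer accepted here.
    """
```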
| if tree in SPARSE_TREES: |
|     yield (check_sparse_input, tree, dataset, 2) |
| for tree_type, dataset in product(SPARSE_TREES, ["boston", "reg_small"]): |
|     if tree_type in REG_TREES: |
I don't understand why you changed that?
because tree overwrites the tree module that was imported. And I (and flake8) don't like overwriting file-level module imports with local variable names.
Do we have to update what's new?
caed135 to af49979
done
It's ok for me :)
thanks for the reviews :)
Goodbye, 1200 lines of code 🎉
* remove stuff to be removed in 0.19
* more changes
* remove classes from 0.19 whatsnew
* remove _LearntSelectorMixin
* remove ProjectedGradientNMF, load_lfw_*
* minor fixes
* remove more copy from logistic regression path
* remove lda, qda from __init__.__all__
* remove pg solver in nmf from tests etc.
* remove class_weight="auto" from tests
* doctest change for decision_function_shape="ovr"
* remove transform from tree test, minor fixes to tree tests
* some fixes in the tests
* undo changes in functions which still allow 1d input
* also allow 1d in scale
* more test fixes
* last test fixes in forest and tree
* svm default value change doctest failures
* pep8
* remove more class_weight="auto" stuff
* minor cosmetics in docstrings for deprecated / removed behavior
* say that store_covariance has been moved to __init__ in discriminant_analysis
Early PR for CI.