API Deprecate positional arguments in preprocessing #16996
adrinjalali merged 4 commits into scikit-learn:master
Conversation
sklearn/preprocessing/_data.py (outdated)

    -    def __init__(self, feature_range=(0, 1), copy=True):
    +    @_deprecate_positional_args
    +    def __init__(self, *, feature_range=(0, 1), copy=True):
I'm happy with feature_range positional or not.
Positional isn't ambiguous, I would say, so +1 for that.
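For readers unfamiliar with the mechanism: `_deprecate_positional_args` wraps a function whose signature has already been changed to keyword-only (via the bare `*`) and turns positional misuse into a `FutureWarning` instead of an immediate `TypeError`. A minimal sketch of the idea, not scikit-learn's actual implementation:

```python
import warnings
from functools import wraps
from inspect import Parameter, signature


def _deprecate_positional_args(f):
    """Warn when keyword-only arguments are passed positionally (sketch)."""
    sig = signature(f)
    # Parameters that may still legitimately be passed positionally.
    positional = [
        name for name, p in sig.parameters.items()
        if p.kind in (Parameter.POSITIONAL_ONLY, Parameter.POSITIONAL_OR_KEYWORD)
    ]
    # Keyword-only parameters, in declaration order.
    kwonly = [
        name for name, p in sig.parameters.items()
        if p.kind == Parameter.KEYWORD_ONLY
    ]

    @wraps(f)
    def wrapper(*args, **kwargs):
        if len(args) > len(positional):
            # Remap surplus positional values onto keyword-only names,
            # warning instead of raising to ease the transition.
            moved = dict(zip(kwonly, args[len(positional):]))
            warnings.warn(
                "Pass {} as keyword args; passing them positionally is "
                "deprecated.".format(", ".join(moved)),
                FutureWarning,
            )
            kwargs.update(moved)
            args = args[:len(positional)]
        return f(*args, **kwargs)

    return wrapper


# Toy signature in the style of the diffs below (not the real normalize):
@_deprecate_positional_args
def normalize(X, *, norm="l2", axis=1):
    return X, norm, axis


normalize([[1.0, 2.0]], "l1")       # still works, but emits a FutureWarning
normalize([[1.0, 2.0]], norm="l1")  # the non-deprecated spelling
```

After the deprecation period the decorator is removed and the bare `*` makes Python raise a `TypeError` for the positional call.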
sklearn/preprocessing/_data.py (outdated)

    -    def normalize(X, norm='l2', axis=1, copy=True, return_norm=False):
    +    @_deprecate_positional_args
    +    def normalize(X, *, norm='l2', axis=1, copy=True, return_norm=False):
I'd put norm positional if we're doing positional norm in Normalizer?
Yes, and the names ("l2", etc.) are descriptive enough.
sklearn/preprocessing/_label.py (outdated)

    -    def label_binarize(y, classes, neg_label=0, pos_label=1, sparse_output=False):
    +    @_deprecate_positional_args
    +    def label_binarize(y, classes, *, neg_label=0, pos_label=1,
Should we be consistent with classes here and in MultiLabelBinarizer?
Went with making classes positional for now (since it is a smaller diff). But I am okay with either.
Hmm, I think I prefer using classes as a keyword-only option. Let's see what it looks like.
sklearn/preprocessing/_data.py (outdated)

    -    def power_transform(X, method='yeo-johnson', standardize=True, copy=True):
    +    @_deprecate_positional_args
    +    def power_transform(X, *, method='yeo-johnson', standardize=True, copy=True):
You've not decorated the class version of this. I'm happy for methods to be positional...?
Updated PR with the class version of this.
> I'm happy for methods to be positional...?

I am also okay with keeping methods positional. We kind of started with making functions positional in the earlier PRs; are we okay with continuing with this?
sklearn/preprocessing/_data.py (outdated)

    -    def __init__(self, n_quantiles=1000, output_distribution='uniform',
    +    @_deprecate_positional_args
    +    def __init__(self, n_quantiles=1000, *, output_distribution='uniform',
n_quantiles is kwonly in quantile_transform (arguably because of the axis param that comes before). Should it be kwonly here too?
Yeah, it's because of the axis. I'll make it keyword-only here as well.
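For context, where the bare `*` sits decides which arguments the language itself accepts positionally once the deprecation period ends. An illustrative toy signature (not the real `quantile_transform`):

```python
# Everything after the bare `*` is keyword-only at the language level.
def quantile_like(X, *, n_quantiles=1000, axis=0):
    return X, n_quantiles, axis


quantile_like([[1.0]], n_quantiles=10)  # fine: keyword spelling

try:
    quantile_like([[1.0]], 10)  # rejected: n_quantiles must be a keyword
except TypeError as exc:
    print("TypeError:", exc)
```

This is why the thread cares about placing `n_quantiles` on the same side of `*` in both the estimator and the function: the split becomes a hard API boundary later.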
sklearn/preprocessing/_data.py (outdated)

    -    def power_transform(X, method='yeo-johnson', standardize=True, copy=True):
    +    @_deprecate_positional_args
    +    def power_transform(X, *, method='yeo-johnson', standardize=True, copy=True):
Here again the names 'yeo-johnson' and 'boxcox' are unambiguous and would lead me to just not force a kwonly, but no strong opinion.
    -    def __init__(self, func=None, inverse_func=None, validate=False,
    +    @_deprecate_positional_args
    +    def __init__(self, func=None, *, inverse_func=None, validate=False,

Suggested change:

    -    def __init__(self, func=None, *, inverse_func=None, validate=False,
    +    def __init__(self, func=None, inverse_func=None, *, validate=False,
I assume usage would be non-ambiguous
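A toy sketch of the two candidate signatures, using stdlib `math` functions as stand-ins for the transform pair (illustrative only, not FunctionTransformer itself):

```python
from math import expm1, log1p


# Suggested: inverse_func before the `*`, so the func/inverse_func
# pair can still be passed positionally.
def suggested_init(func=None, inverse_func=None, *, validate=False):
    return func, inverse_func, validate


# The paired call reads unambiguously without keywords:
f, inv, validate = suggested_init(log1p, expm1)


# Stricter variant from the diff: inverse_func after the `*`,
# so it must be spelled out as a keyword.
def stricter_init(func=None, *, inverse_func=None, validate=False):
    return func, inverse_func, validate


f2, inv2, _ = stricter_init(log1p, inverse_func=expm1)
```

The trade-off is readability of the common paired call versus consistency with the rest of the keyword-only migration.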
sklearn/preprocessing/_data.py (outdated)

    -    def minmax_scale(X, feature_range=(0, 1), axis=0, copy=True):
    +    @_deprecate_positional_args
    +    def minmax_scale(X, *, feature_range=(0, 1), axis=0, copy=True):
Should this follow MinMaxScaler and have feature_range positional?
* API Deprecate positional arguments in preprocessing
* CLN Address comments
* CLN Uses classes as a keyword only argument
* BUG Fix
Reference Issues/PRs
Toward #15005