
DOC add missing attributes in GradientBoostingRegressor #17671

Merged

glemaitre merged 3 commits into scikit-learn:master from simonamaggio:gboost_regr on Jun 24, 2020

Conversation

@simonamaggio (Contributor)

Fixes part of #14312 for the GradientBoostingRegressor estimator.

Added n_classes_ and n_estimators_ to the GradientBoostingRegressor docstring, and removed GradientBoostingRegressor from the list of estimators in test_docstring_parameters.py.
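For context, a minimal sketch of the attributes this PR documents (assuming scikit-learn is installed; ``n_classes_`` is discussed further below, so only ``n_estimators_`` is exercised here):

```python
# Hedged sketch: inspect the n_estimators_ attribute documented in this PR.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

X, y = make_regression(n_samples=100, random_state=0)
reg = GradientBoostingRegressor(n_estimators=50, random_state=0).fit(X, y)

# Without early stopping, n_estimators_ equals the n_estimators parameter.
print(reg.n_estimators_)  # 50
```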

@glemaitre (Member) left a comment:

You need to merge master into your branch because of the conflict.

``n_iter_no_change`` is specified). Otherwise it is set to
``n_estimators``.

.. versionadded:: 0.20
@glemaitre (Member):

You can remove the ``versionadded`` directive.

@simonamaggio (Contributor, author):

done
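The docstring hunk under review above describes the early-stopping behaviour of ``n_estimators_``; a minimal sketch of that behaviour, assuming scikit-learn is installed:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

X, y = make_regression(n_samples=200, noise=10.0, random_state=0)

# With n_iter_no_change set, training may stop early, so the number of
# fitted stages (n_estimators_) can be smaller than n_estimators.
reg = GradientBoostingRegressor(
    n_estimators=500, n_iter_no_change=5,
    validation_fraction=0.2, random_state=0,
).fit(X, y)

# n_estimators_ matches the number of fitted sub-estimators.
print(reg.n_estimators_ <= 500)  # True
```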

estimators_ : ndarray of DecisionTreeRegressor of shape (n_estimators, 1)
The collection of fitted sub-estimators.

n_classes_ : int
@glemaitre (Member):

Hmm, it does not make sense to have this attribute, indeed.
I believed we had deprecated this in the trees at some point.

@glemaitre (Member):

So yes, apparently we still have it. Let's keep it, then, because we need to deprecate it in another PR.

@simonamaggio (Contributor, author):

I agree, it doesn't make sense. But it is used by the underlying BaseGradientBoosting as a regression/classification flag to make different choices for some parameters, for instance max_features. What would you suggest?

@glemaitre (Member):

I would suggest adding it as you did.

Then, we should open a new PR to solve it (I opened an issue for it: #17673). The idea would be to deprecate the attribute (and we would have to note this in the docstring). We have several ways to detect that we are in regression:

  • use the private attribute self._n_classes instead of the public one, and expose self.n_classes_ only in the classifier;
  • use is_regressor to check whether we should create this variable;
  • or maybe something even better that we have to look into.
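The second option above can be sketched as follows (assuming scikit-learn is installed; this only illustrates ``is_regressor``, not the eventual fix):

```python
from sklearn.base import is_regressor
from sklearn.ensemble import (
    GradientBoostingClassifier,
    GradientBoostingRegressor,
)

# is_regressor checks the estimator type, so BaseGradientBoosting could
# branch on it instead of on the presence of n_classes_.
print(is_regressor(GradientBoostingRegressor()))   # True
print(is_regressor(GradientBoostingClassifier()))  # False
```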

@glemaitre glemaitre merged commit 683b3e5 into scikit-learn:master on Jun 24, 2020
@glemaitre (Member):

Thanks @simonamaggio

rubywerman pushed a commit to MLH-Fellowship/scikit-learn that referenced this pull request Jun 24, 2020
jayzed82 pushed a commit to jayzed82/scikit-learn that referenced this pull request Oct 22, 2020