ENH Add custom loss support for HistGradientBoosting#16908

Merged
thomasjpfan merged 1 commit into scikit-learn:master from gbolmier:histgradientboosting_custom_loss on Apr 15, 2020
Conversation

@gbolmier
Contributor

Reference Issues/PRs

Resolves: #15841

What does this implement/fix? Explain your changes.

Add custom loss support for `HistGradientBoostingClassifier` and
`HistGradientBoostingRegressor` as a private, undocumented API: a
`BaseLoss` object can now be passed as the `loss` parameter.
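Since this hook is private and undocumented, the exact class and import paths may differ between scikit-learn versions. As a self-contained sketch of the idea, the kind of interface such a loss object exposes (per-sample loss values, gradients, and hessians for the boosting updates) can be illustrated with plain NumPy. The class and method names below (`SquaredErrorLoss`, `gradient`, `hessian`) are illustrative only, not scikit-learn's actual private `BaseLoss` contract:

```python
import numpy as np

class SquaredErrorLoss:
    """Illustrative custom loss: 0.5 * (y_true - raw_prediction)**2.

    Hypothetical interface -- the real private BaseLoss API in
    scikit-learn may use different method names and signatures.
    """

    def loss(self, y_true, raw_prediction):
        # Per-sample loss values.
        return 0.5 * (y_true - raw_prediction) ** 2

    def gradient(self, y_true, raw_prediction):
        # First derivative w.r.t. the raw prediction.
        return raw_prediction - y_true

    def hessian(self, y_true, raw_prediction):
        # Second derivative is constant (1) for squared error.
        return np.ones_like(raw_prediction)

y = np.array([1.0, 2.0, 3.0])
raw = np.array([0.5, 2.5, 2.0])
custom_loss = SquaredErrorLoss()
grads = custom_loss.gradient(y, raw)  # values: -0.5, 0.5, -1.0
```

A gradient-boosting implementation uses these gradients (and hessians) at each iteration to fit the next tree, which is why a custom loss only needs to supply derivatives rather than a full training procedure.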

@gbolmier gbolmier changed the title Add custom loss support for HistGradientBoosting [MRG] Add custom loss support for HistGradientBoosting Apr 13, 2020
Member

@NicolasHug NicolasHug left a comment

Thanks @gbolmier !

Member

@thomasjpfan thomasjpfan left a comment

Looks like the issue discussion is okay with making this private (for now).

LGTM

@thomasjpfan thomasjpfan changed the title [MRG] Add custom loss support for HistGradientBoosting ENH Add custom loss support for HistGradientBoosting Apr 15, 2020
@thomasjpfan thomasjpfan merged commit 9d366a4 into scikit-learn:master Apr 15, 2020
gio8tisu pushed a commit to gio8tisu/scikit-learn that referenced this pull request May 15, 2020
viclafargue pushed a commit to viclafargue/scikit-learn that referenced this pull request Jun 26, 2020
@gbolmier gbolmier deleted the histgradientboosting_custom_loss branch November 15, 2020 00:52
@Sandy4321
People really need it:
https://stackoverflow.com/questions/54267745/implementing-custom-loss-function-in-scikit-learn

It's tricky, but you can do it...

  1. Open up your classifier. Let's use an RFC for example: https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.RandomForestClassifier.html

  2. click [source]

  3. See how it's inheriting from ForestClassifier? Right there in the class definition. Click that word to jump to its parent definition.

  4. See how this new object is inheriting from ClassifierMixin? Click that.

  5. See how the bottom of that ClassifierMixin class says this?

```python
from .metrics import accuracy_score
return accuracy_score(y, self.predict(X), sample_weight=sample_weight)
```

That's your model being trained on accuracy. You need to inject at this point if you want to train your model to be a "recall model" or a "precision model" or whatever model. This accuracy metric is baked into sklearn. Some day, a better man than I will make this a parameter which models accept; in the meantime, you gotta go into your sklearn installation and tweak this `accuracy_score` to be whatever you want.

@NicolasHug
Member

@Sandy4321 please see #15841 (comment). The instructions copy-pasted from SO above are incorrect.

Development

Successfully merging this pull request may close these issues.

HistGradientBoosting: Implement custom loss function like LightGBM permits it