Conversation

@y0z Could you review this PR?
Codecov Report

Attention: additional details and impacted files:

    @@            Coverage Diff             @@
    ##           master    #5218      +/-   ##
    ==========================================
    - Coverage   89.37%   89.12%   -0.26%
    ==========================================
      Files         206      213       +7
      Lines       15097    14463     -634
    ==========================================
    - Hits        13493    12890     -603
    + Misses       1604     1573      -31

    ☔ View full report in Codecov by Sentry.
I find that removing the XGBoost integration causes the following documentation generation error:

    make: *** [Makefile:20: html] Error 2
    generating gallery for tutorial/10_key_features... [ 60%] 003_efficient_optimization_algorithms.py
    Error: Process completed with exit code 2.

To avoid this error, could you please change the tutorial code to use LightGBMPruningCallback instead of XGBoostPruningCallback?
Thank you @y0z for the review and comment. I will look into fixing the documentation generation error you pointed out.
Currently, the bottom of the tutorial reads:

###################################################################################################
# Integration Modules for Pruning
# -------------------------------
# To implement pruning mechanism in much simpler forms, Optuna provides integration modules for the following libraries.
#
# For the complete list of Optuna's integration modules, see :mod:`~optuna.integration`.
#
# For example, :class:`~optuna.integration.XGBoostPruningCallback` introduces pruning without directly changing the logic of training iteration.
# (See also `example <https://github.com/optuna/optuna-examples/tree/main/xgboost/xgboost_integration.py>`_ for the entire script.)
#
# .. code-block:: python
#
# pruning_callback = optuna.integration.XGBoostPruningCallback(trial, 'validation-error')
# bst = xgb.train(param, dtrain, evals=[(dvalid, 'validation')], callbacks=[pruning_callback])
I would like you to change it like:

###################################################################################################
# Integration Modules for Pruning
# -------------------------------
# To implement pruning mechanism in much simpler forms, Optuna provides integration modules for the following libraries.
#
# For the complete list of Optuna's integration modules, see :mod:`~optuna.integration`.
#
# For example, :class:`~optuna.integration.LightGBMPruningCallback` introduces pruning without directly changing the logic of training iteration.
# (See also `example <https://github.com/optuna/optuna-examples/blob/main/lightgbm/lightgbm_integration.py>`_ for the entire script.)
#
# .. code-block:: python
#
# pruning_callback = optuna.integration.LightGBMPruningCallback(trial, 'validation-error')
#     gbm = lgb.train(param, dtrain, valid_sets=[dvalid], callbacks=[pruning_callback])
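For reference, the report-then-prune pattern that both pruning callbacks implement can be sketched in plain Python. This is an illustrative sketch, not Optuna's implementation: `FakeTrial`, `TrialPruned`, and the stall-based pruning rule are stand-ins for `optuna.trial.Trial`, `optuna.TrialPruned`, and the study's pruner.

```python
class TrialPruned(Exception):
    """Raised to stop an unpromising training run early (stand-in for optuna.TrialPruned)."""


class FakeTrial:
    """Illustrative stand-in for a trial: prunes after `patience`
    consecutive reports above `threshold`."""

    def __init__(self, threshold=0.30, patience=3):
        self.threshold = threshold
        self.patience = patience
        self.bad_steps = 0

    def report(self, value, step):
        # Count consecutive steps whose validation error stays above the threshold.
        self.bad_steps = self.bad_steps + 1 if value > self.threshold else 0

    def should_prune(self):
        return self.bad_steps >= self.patience


def pruning_callback(trial, metric_value, step):
    # Mirrors what an integration callback does once per boosting round:
    # report the intermediate metric, then ask the pruner whether to stop.
    trial.report(metric_value, step)
    if trial.should_prune():
        raise TrialPruned(f"pruned at step {step}")


def train(trial, validation_errors):
    # Toy training loop: the metric values are precomputed for illustration.
    for step, err in enumerate(validation_errors):
        pruning_callback(trial, err, step)
    return validation_errors[-1]
```

An improving run finishes normally, e.g. `train(FakeTrial(), [0.5, 0.4, 0.2, 0.1])` returns `0.1`, while a plateaued run such as `[0.5, 0.5, 0.5, 0.5]` raises `TrialPruned` partway through, which is how unpromising trials are cut short without changing the training-loop logic itself.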
nabenabe0928 left a comment:

Please address this comment: #5218 (comment)
Thank you for providing the desired changes. Edits completed; please take another look.
nabenabe0928 left a comment:

Thank you for your work :)
LGTM!

Force-pushed from a9b7718 to 83d6b1a.
@buruzaemon Could you leave a comment on #5238 so that I can include you as a contributor of this PR?