
Remove XGBoost integration #5218

Closed

buruzaemon wants to merge 0 commits into optuna:master from buruzaemon:bugfix/remove-xgboost-integration

Conversation

@buruzaemon
Contributor

Motivation

Description of the changes

  • Removed unnecessary files and references to XGBoost.


@github-actions github-actions bot added the optuna.integration label (Related to the `optuna.integration` submodule; automatically labeled by github-actions) on Feb 1, 2024
@nabenabe0928
Contributor

@y0z Could you review this PR?

@codecov

codecov bot commented Feb 1, 2024

Codecov Report

Attention: 2 lines in your changes are missing coverage. Please review.

Comparison is base (53d8554) 89.37% compared to head (a9b7718) 89.12%.
Report is 96 commits behind head on master.

Files Patch % Lines
optuna/integration/xgboost.py 0.00% 2 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master    #5218      +/-   ##
==========================================
- Coverage   89.37%   89.12%   -0.26%     
==========================================
  Files         206      213       +7     
  Lines       15097    14463     -634     
==========================================
- Hits        13493    12890     -603     
+ Misses       1604     1573      -31     


@y0z
Member

y0z commented Feb 6, 2024

I found that removing the XGBoost integration causes the following documentation generation error:

make: *** [Makefile:20: html] Error 2
generating gallery for tutorial/10_key_features... [ 60%] 003_efficient_optimization_algorithms.py
Error: Process completed with exit code 2.

To avoid this error, could you please change the tutorial code to use LightGBMPruningCallback instead of XGBoostPruningCallback? An example is here.

@buruzaemon
Contributor Author

Thank you @y0z for the review and comment. I will look into fixing the documentation generation issue you pointed out.

@nabenabe0928
Contributor

nabenabe0928 commented Feb 7, 2024

@buruzaemon

Currently, the bottom of tutorial/10_key_features/003_efficient_optimization_algorithms.py looks like the following:

###################################################################################################
# Integration Modules for Pruning
# -------------------------------
# To implement pruning mechanism in much simpler forms, Optuna provides integration modules for the following libraries.
#
# For the complete list of Optuna's integration modules, see :mod:`~optuna.integration`.
#
# For example, :class:`~optuna.integration.XGBoostPruningCallback` introduces pruning without directly changing the logic of training iteration.
# (See also `example <https://github.com/optuna/optuna-examples/tree/main/xgboost/xgboost_integration.py>`_ for the entire script.)
#
# .. code-block:: python
#
#         pruning_callback = optuna.integration.XGBoostPruningCallback(trial, 'validation-error')
#         bst = xgb.train(param, dtrain, evals=[(dvalid, 'validation')], callbacks=[pruning_callback])

I would like you to change it to:

###################################################################################################
# Integration Modules for Pruning
# -------------------------------
# To implement pruning mechanism in much simpler forms, Optuna provides integration modules for the following libraries.
#
# For the complete list of Optuna's integration modules, see :mod:`~optuna.integration`.
#
# For example, :class:`~optuna.integration.LightGBMPruningCallback` introduces pruning without directly changing the logic of training iteration.
# (See also `example <https://github.com/optuna/optuna-examples/blob/main/lightgbm/lightgbm_integration.py>`_ for the entire script.)
#
# .. code-block:: python
#
#         pruning_callback = optuna.integration.LightGBMPruningCallback(trial, 'validation-error')
#         gbm = lgb.train(param, dtrain, valid_sets=[dvalid], callbacks=[pruning_callback])
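For context, both `XGBoostPruningCallback` and `LightGBMPruningCallback` follow the same pattern: a callback invoked once per boosting round that reports the validation metric to the trial and aborts training when the pruner decides the trial is unpromising. The following is a simplified, library-free sketch of that pattern; all names here (`FakeTrial`, `PruningCallback`, `train`, `TrialPruned`) are illustrative stand-ins, not the actual Optuna or LightGBM APIs.

```python
class TrialPruned(Exception):
    """Stand-in for optuna.TrialPruned: raised to abort an unpromising trial."""


class FakeTrial:
    """Minimal stand-in for optuna.trial.Trial with a naive plateau-based pruning rule."""

    def __init__(self, patience=3):
        self.history = []
        self.patience = patience

    def report(self, value, step):
        self.history.append(value)

    def should_prune(self):
        # Prune once the metric has stopped improving over the last `patience` steps.
        h = self.history
        if len(h) <= self.patience:
            return False
        return min(h[-self.patience:]) >= min(h[:-self.patience])


class PruningCallback:
    """Called once per training iteration, like the integration callbacks."""

    def __init__(self, trial, metric):
        self.trial = trial
        self.metric = metric

    def __call__(self, step, eval_results):
        self.trial.report(eval_results[self.metric], step)
        if self.trial.should_prune():
            raise TrialPruned(f"pruned at step {step}")


def train(num_rounds, callback):
    # Toy "training loop": validation error plateaus after a few rounds.
    errors = [0.9, 0.7, 0.5, 0.4, 0.4, 0.4, 0.4, 0.4, 0.4, 0.4]
    for step in range(num_rounds):
        callback(step, {"validation-error": errors[step]})
    return "finished"


trial = FakeTrial(patience=3)
cb = PruningCallback(trial, "validation-error")
try:
    train(10, cb)
except TrialPruned as e:
    print(e)  # → pruned at step 6
```

The real integration callbacks do the same work inside the training library's own callback hook, which is why swapping `XGBoostPruningCallback` for `LightGBMPruningCallback` in the tutorial is a like-for-like change.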

@nabenabe0928 nabenabe0928 removed their assignment Feb 7, 2024
Contributor

@nabenabe0928 nabenabe0928 left a comment


Please address this comment.
#5218 (comment)

@buruzaemon
Contributor Author

Please address this comment. #5218 (comment)

Thank you for providing the desired changes. The edits are complete, PTAL.

Contributor

@nabenabe0928 nabenabe0928 left a comment


Thank you for your work :)
LGTM!

Member

@y0z y0z left a comment


Thank you for your contribution.
LGTM 🎉

Since this branch has conflicts, please merge master into this branch first so that this PR can be merged.

@y0z y0z removed their assignment Feb 8, 2024
@buruzaemon buruzaemon closed this Feb 8, 2024
@buruzaemon buruzaemon force-pushed the bugfix/remove-xgboost-integration branch from a9b7718 to 83d6b1a Compare February 8, 2024 12:06
@nabenabe0928
Contributor

@buruzaemon Could you leave a comment on #5238 so that I can include you as a contributor of this PR?

@y0z y0z mentioned this pull request Feb 9, 2024
