
Fix the default value of max_resource to HyperbandPruner #1171

Merged
hvy merged 21 commits into optuna:master from HideakiImamura:default-max-resource on May 8, 2020
Conversation

@HideakiImamura
Member

@HideakiImamura HideakiImamura commented Apr 27, 2020

Depends on #1168.

Motivation

It is difficult for users to choose appropriate argument values for HyperbandPruner. This PR gives max_resource of HyperbandPruner a default value.

Description of the changes

  • Give max_resource of HyperbandPruner the default value 'auto'.
  • If max_resource = 'auto', the pruner waits for some trials to complete and then sets max_resource to the amount of resource those trials used.
  • Add a test verifying that 'auto' for max_resource works.
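The waiting heuristic described above can be sketched in isolation. The following is a simplified standalone sketch, not the actual Optuna implementation; the names `CompletedTrial` and `infer_max_resource` are illustrative:

```python
# Simplified sketch of the max_resource='auto' heuristic described above.
# NOT the actual Optuna implementation; names here are illustrative.

from dataclasses import dataclass, field
from typing import Dict, Iterable, Optional


@dataclass
class CompletedTrial:
    # step -> reported intermediate value, as with trial.report(value, step)
    intermediate_values: Dict[int, float] = field(default_factory=dict)


def infer_max_resource(completed_trials: Iterable[CompletedTrial]) -> Optional[int]:
    """Return the largest resource (step) any completed trial consumed.

    With max_resource='auto', the pruner defers choosing max_resource until
    some trials have finished, then uses the amount of resource (e.g. epochs)
    those trials actually used. Returns None while no trial has completed.
    """
    max_step = None
    for trial in completed_trials:
        if trial.intermediate_values:
            last_step = max(trial.intermediate_values)
            if max_step is None or last_step > max_step:
                max_step = last_step
    return max_step
```

In user code this corresponds to constructing `optuna.pruners.HyperbandPruner()` without passing `max_resource`, which after this PR defaults to `'auto'`.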

@HideakiImamura HideakiImamura added the optuna.pruners Related to the `optuna.pruners` submodule. This is automatically labeled by github-actions. label Apr 27, 2020
@codecov-io

codecov-io commented Apr 28, 2020

Codecov Report

Merging #1171 into master will increase coverage by 0.15%.
The diff coverage is 96.29%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master    #1171      +/-   ##
==========================================
+ Coverage   91.06%   91.21%   +0.15%     
==========================================
  Files         142      142              
  Lines       12286    12307      +21     
==========================================
+ Hits        11188    11226      +38     
+ Misses       1098     1081      -17     
Impacted Files Coverage Δ
tests/pruners_tests/test_hyperband.py 97.01% <92.85%> (-1.10%) ⬇️
optuna/pruners/hyperband.py 97.87% <97.50%> (+5.11%) ⬆️
...ration_tests/lightgbm_tuner_tests/test_optimize.py 99.44% <0.00%> (-0.01%) ⬇️
optuna/logging.py 93.54% <0.00%> (ø)
tests/integration_tests/test_keras.py 100.00% <0.00%> (ø)
optuna/integration/keras.py 78.12% <0.00%> (+2.26%) ⬆️
optuna/integration/lightgbm_tuner/optimize.py 86.85% <0.00%> (+3.66%) ⬆️

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update dbdba52...9a18282. Read the comment docs.

@hvy hvy self-assigned this Apr 28, 2020
@hvy hvy added the enhancement Change that does not break compatibility and not affect public interfaces, but improves performance. label Apr 28, 2020
Contributor

@crcrpar crcrpar left a comment


Could you update the title?

max_resource now defaults to 80.

@HideakiImamura HideakiImamura changed the title Add default value of max_resource to HyperbandPruner Fix the default value of max_resource to HyperbandPruner Apr 30, 2020
Member

@hvy hvy left a comment


Thanks for the PR, this is a heuristic that simplifies the usage for many. I left some early comments!

Contributor

@crcrpar crcrpar left a comment


This basically LGTM.

Member

@hvy hvy left a comment


As we're delaying the initialization of the ASHA pruners, the trial(s) used to determine the maximum resource won't take part in the promotion process, since those trials won't have their system_attrs updated with their rungs. Is this intentional? It might be nontrivial to fix though.

I also noticed that the guessed maximum resource could vary between processes in a distributed environment but this might be rare and could maybe be addressed separately. It's also an issue with the minimum resource guess in our ASHA. If my understanding is correct, I could create a separate issue addressing these problems (since they're quite similar).

@HideakiImamura
Member Author

@hvy Thank you for your insightful comments.

As we're delaying the initialization of the ASHA pruners, the trial(s) used to determine the maximum resource won't take part in the promotion process, since those trials won't have their system_attrs updated with their rungs. Is this intentional? It might be nontrivial to fix though.

Yes, it is intentional. The trial(s) used to determine max_resource won't be considered by SuccessiveHalvingPruner. The number of such trials is at most the number of parallel workers, which is usually much smaller than the total number of trials, so I think this does not matter.
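The bound argued above can be made concrete with a small sketch. This is an illustrative back-of-the-envelope helper, not Optuna API; it assumes each of `n_workers` parallel workers has at most one in-flight trial complete while max_resource is still being inferred:

```python
# Illustrative bound on trials excluded from Hyperband's promotion process
# (not actual Optuna code): at most one in-flight trial per worker can
# finish before max_resource is fixed, so exclusions are capped by the
# worker count (and trivially by the total trial count).

def max_excluded_trials(n_workers: int, n_trials: int) -> int:
    return min(n_workers, n_trials)


# With 4 workers and 100 trials, at most 4 trials (4%) skip promotion.
assert max_excluded_trials(4, 100) == 4
```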

I also noticed that the guessed maximum resource could vary between processes in a distributed environment but this might be rare and could maybe be addressed separately. It's also an issue with the minimum resource guess in our ASHA. If my understanding is correct, I could create a separate issue addressing these problems (since they're quite similar).

Yes, I totally agree with you. Both the current HyperbandPruner and SuccessiveHalvingPruner have the same issue as you pointed out. Could you create an issue about that?

@hvy
Member

hvy commented May 8, 2020

Sure, I'll create issues after merging this PR.

Member

@hvy hvy left a comment


I'd like to follow up this PR with issues for the points described in #1171 (comment), but otherwise the changes LGTM!

@hvy hvy added this to the v1.4.0 milestone May 8, 2020
@hvy hvy merged commit fa7f16f into optuna:master May 8, 2020
@HideakiImamura HideakiImamura deleted the default-max-resource branch May 18, 2021 04:38