Move fit_kernel_params to GPRegressor #6243

Merged
nabenabe0928 merged 8 commits into optuna:master from
nabenabe0928:code-fix/move-fit-kernel-params
Aug 29, 2025

Conversation

@nabenabe0928
Contributor

nabenabe0928 commented Aug 13, 2025

Motivation

This PR is necessary to cache the pairwise distances of X_train, which speeds up GPSampler; that caching is done in:

Description of the changes

  • Move fit_kernel_params to GPRegressor
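To illustrate why the move enables the caching mentioned above, here is a minimal, hypothetical sketch (not Optuna's actual internals; `GPRegressorSketch` and its methods are invented for illustration). Once kernel-parameter fitting lives on the regressor class, quantities derived only from X_train, such as squared pairwise distances, can be computed once in the constructor and reused on every fitting iteration:

```python
import numpy as np


class GPRegressorSketch:
    """Toy stand-in for a GP regressor; not Optuna's real GPRegressor."""

    def __init__(self, X_train: np.ndarray, y_train: np.ndarray) -> None:
        self.X_train = X_train
        self.y_train = y_train
        # Cache squared pairwise distances once: they depend only on
        # X_train, not on the kernel parameters being optimized.
        diff = X_train[:, None, :] - X_train[None, :, :]
        self._sq_dists = np.sum(diff**2, axis=-1)

    def _kernel_matrix(self, length_scale: float) -> np.ndarray:
        # RBF kernel built from the cached distances.
        return np.exp(-0.5 * self._sq_dists / length_scale**2)

    def fit_kernel_params(self, candidates: list[float]) -> float:
        # Toy "fit": choose the length scale with the highest log marginal
        # likelihood. Each candidate evaluation reuses the cached distances
        # instead of recomputing them.
        best, best_ll = candidates[0], -np.inf
        n = len(self.y_train)
        for ls in candidates:
            K = self._kernel_matrix(ls) + 1e-6 * np.eye(n)  # jitter for stability
            L = np.linalg.cholesky(K)
            alpha = np.linalg.solve(L.T, np.linalg.solve(L, self.y_train))
            ll = -0.5 * self.y_train @ alpha - np.log(np.diag(L)).sum()
            if ll > best_ll:
                best, best_ll = ls, ll
        return best
```

In a module-level `fit_kernel_params` function, by contrast, the distance matrix would have to be recomputed (or threaded through as an argument) on every call, which is the overhead the caching avoids.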

@nabenabe0928 nabenabe0928 added the code-fix Change that does not change the behavior, such as code refactoring. label Aug 13, 2025
@codecov

codecov bot commented Aug 13, 2025

Codecov Report

❌ Patch coverage is 94.59459% with 2 lines in your changes missing coverage. Please review.
✅ Project coverage is 89.18%. Comparing base (255fc3d) to head (9638811).
⚠️ Report is 26 commits behind head on master.

Files with missing lines Patch % Lines
optuna/_gp/gp.py 94.59% 2 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master    #6243      +/-   ##
==========================================
+ Coverage   89.11%   89.18%   +0.07%     
==========================================
  Files         208      208              
  Lines       13796    13818      +22     
==========================================
+ Hits        12294    12324      +30     
+ Misses       1502     1494       -8     


@y0z
Member

y0z commented Aug 20, 2025

@gen740 @kAIto47802 Could you review this PR?

@github-actions
Contributor

This pull request has not seen any recent activity.

@github-actions github-actions bot added the stale Exempt from stale bot labeling. label Aug 27, 2025
@nabenabe0928 nabenabe0928 removed the stale Exempt from stale bot labeling. label Aug 28, 2025
@gen740
Member

gen740 commented Aug 29, 2025

I used the following benchmark code (in #6244):

import optuna


def objective(trial: optuna.Trial) -> float:
    return sum(trial.suggest_float(f"x{i}", -5, 5)**2 for i in range(5))


sampler = optuna.samplers.GPSampler(seed=0)
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=300)
print((study.trials[-1].datetime_complete - study.trials[0].datetime_start).total_seconds())
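As an aside, the benchmark above derives elapsed time from trial datetimes; an equivalent generic pattern (not part of this PR) is to time the `optimize` call directly with `time.perf_counter`, which measures the same interval without relying on trial timestamps:

```python
import time
from typing import Callable


def timed(fn: Callable[[], None]) -> float:
    """Return the wall-clock seconds taken by fn()."""
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start


# Usage with the benchmark above would be:
#   elapsed = timed(lambda: study.optimize(objective, n_trials=300))
#   print(f"{elapsed:.3f}s")
```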

And I found that the master branch is much faster:

  • Master: 39.261927s
  • This PR: 55.45938s

I ran the benchmark several times, and the results were consistent.

@nabenabe0928
Contributor (Author)

@gen740
It is probably because I hadn't merged the upstream master.
I have pushed the latest version, so could you please run the benchmark again?

@gen740
Member

gen740 commented Aug 29, 2025

@nabenabe0928
Thank you! I reran the benchmark and confirmed that the performance looks identical.
I also ran a multi-objective optimization benchmark with wfg4 (see the code below), and the performance is almost the same as on the master branch.

  • master: 143.247278s
  • this PR: 142.377885s

import optuna
import optunahub

wfg = optunahub.load_module("benchmarks/wfg")
wfg4 = wfg.Problem(function_id=4, n_objectives=4, dimension=8, k=6)

sampler = optuna.samplers.GPSampler(seed=0)
study = optuna.create_study(
    sampler=sampler,
    directions=wfg4.directions,
)

study.optimize(wfg4, n_trials=100)

print(
    (
        study.trials[-1].datetime_complete - study.trials[0].datetime_start
    ).total_seconds()
)

@gen740 left a comment (Member)

LGTM!

@kAIto47802 left a comment (Collaborator)

Thank you for the update! LGTM!

@nabenabe0928 nabenabe0928 enabled auto-merge August 29, 2025 09:28
@nabenabe0928 nabenabe0928 merged commit 43d714d into optuna:master Aug 29, 2025
12 of 14 checks passed
@nabenabe0928 nabenabe0928 added this to the v4.6.0 milestone Aug 29, 2025