
Flip the sign of constraints in GPSampler #6213

Merged
nabenabe0928 merged 2 commits into optuna:master from nabenabe0928:code-fix/flip-the-sign-of-constraints-in-gp
Aug 6, 2025

Conversation

@nabenabe0928
Contributor

@nabenabe0928 nabenabe0928 commented Jul 23, 2025

Motivation

Since GPSampler conventionally trains its regressors on objective values y that are to be maximized, this PR adapts the constraint handling to the same convention.

Description of the changes

  • Flip the sign of constraints
  • Flip the sign in log probability of improvement
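As a minimal sketch (not Optuna's actual internals), the sign flip does not change which trials count as feasible: under Optuna's convention a trial is feasible iff every constraint value satisfies c <= 0, and after negation the very same trials satisfy -c >= 0. The constraint values below are arbitrary illustration data.

```python
import numpy as np

# Optuna's convention: a trial is feasible iff every constraint value c <= 0.
c = np.array([-0.5, 0.0, 0.3])
feasible_original = c <= 0
# After flipping the sign (so the GP is trained on -c, "higher is better"),
# feasibility is expressed as -c >= 0 -- the same boolean mask.
feasible_flipped = -c >= 0
print((feasible_original == feasible_flipped).all())  # True
```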

Math Background

Suppose $p(x | \mu, \sigma^2)$ is the probability density function of the Gaussian distribution $\mathcal{N}(\mu, \sigma^2)$.

The previous version (x is better when it is lower):
$\int_{-\infty}^{f_0} p(x | \mu, \sigma^2) dx = \int_{-\infty}^{\frac{f_0 - \mu}{\sigma}} p( t \coloneqq \frac{x - \mu}{\sigma} | 0, 1) dt$

This version (x is better when it is higher):
$\int_{f_0}^{\infty} p(x | \mu, \sigma^2) dx = \int_{\frac{f_0 - \mu}{\sigma}}^{\infty} p( t \coloneqq \frac{x - \mu}{\sigma} | 0, 1) dt = \int_{-\frac{f_0 - \mu}{\sigma}}^{-\infty} -p( -u | 0, 1) du = \int_{-\infty}^{-\frac{f_0 - \mu}{\sigma}} p( u | 0, 1) du$, where $u \coloneqq -t$ and the last equality uses the symmetry $p(-u | 0, 1) = p(u | 0, 1)$.
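The identity above can be checked numerically using the standard normal CDF $\Phi$, written via the error function from the standard library. The values of mu, sigma, and f0 below are arbitrary and chosen only for illustration.

```python
import math


def std_normal_cdf(z: float) -> float:
    # Phi(z): CDF of the standard normal, expressed via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))


# Arbitrary example values (illustration only).
mu, sigma, f0 = 1.5, 0.7, 2.0
z0 = (f0 - mu) / sigma
# Left-hand side: P(X >= f0) for X ~ N(mu, sigma^2).
lhs = 1.0 - std_normal_cdf(z0)
# Right-hand side after the substitution u = -t: P(U <= -z0) for U ~ N(0, 1).
rhs = std_normal_cdf(-z0)
print(abs(lhs - rhs) < 1e-12)  # True
```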

@nabenabe0928 nabenabe0928 added the code-fix Change that does not change the behavior, such as code refactoring. label Jul 23, 2025
@nabenabe0928 nabenabe0928 added this to the v4.5.0 milestone Jul 23, 2025
@nabenabe0928
Contributor Author

@gen740 @kAIto47802
Could you review this PR?

@nabenabe0928
Contributor Author

import numpy as np
import optuna


def objective(trial: optuna.Trial) -> float:
    x = trial.suggest_float("x", 0.0, 2 * np.pi)
    y = trial.suggest_float("y", 0.0, 2 * np.pi)
    # Store the constraint value so that constraints() can retrieve it later.
    c = float(np.sin(x) * np.sin(y) + 0.95)
    trial.set_user_attr("c", c)
    return float(np.sin(x) + y)


def constraints(trial: optuna.trial.FrozenTrial) -> tuple[float]:
    # Feasible iff c <= 0, following Optuna's constraint convention.
    c = trial.user_attrs["c"]
    return (c,)


sampler = optuna.samplers.GPSampler(constraints_func=constraints, seed=42)
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=30)

This code yielded results identical to those on the master branch.

Collaborator

@kAIto47802 kAIto47802 left a comment


Thank you for the PR. LGTM!

@kAIto47802 kAIto47802 removed their assignment Jul 25, 2025
Member

@gen740 gen740 left a comment


I confirmed that the PR code returns results identical to the master branch.

LGTM!

@gen740 gen740 enabled auto-merge (rebase) August 1, 2025 04:31
@gen740
Member

gen740 commented Aug 1, 2025

@nabenabe0928
I enabled the auto-merge. Could you resolve the conflicts?

auto-merge was automatically disabled August 1, 2025 06:08

Rebase failed

@nabenabe0928 nabenabe0928 merged commit 255fc3d into optuna:master Aug 6, 2025
14 checks passed