Hyperparameter optimization over Incremental-wrapped models #195

@TomAugspurger

Description

Follow-up to #190: we have an issue using Incremental in GridSearchCV. The hyperparameter optimizers expect param_grid to be a dict of {parameter: [values]}, but Incremental doesn't really take any parameters of its own; we're really searching over the parameters of the wrapped estimator. Some solutions:

  1. Enhance scikit-learn to allow `.` access to attributes, something like param_grid={'estimator.alpha': [0.1, 10]}
  2. Rewrite Incremental.set_params (and maybe get_params) to pass things through

I'm going to explore option 2 right now. I might send an email to the scikit-learn mailing list to see whether other meta-estimators have run into this, though a quick search didn't turn anything up.
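For what it's worth, a pass-through get_params/set_params might look something like the sketch below. This is not dask-ml's actual implementation; the `Estimator` stand-in and the parameter names (`alpha`, `tol`) are hypothetical, standing in for a real scikit-learn estimator.

```python
class Estimator:
    """Hypothetical stand-in for a scikit-learn estimator
    (real code would wrap e.g. sklearn.linear_model.SGDClassifier)."""
    def __init__(self, alpha=1.0, tol=1e-3):
        self.alpha = alpha
        self.tol = tol

    def get_params(self, deep=True):
        return {"alpha": self.alpha, "tol": self.tol}

    def set_params(self, **params):
        for name, value in params.items():
            setattr(self, name, value)
        return self


class PassthroughWrapper:
    """Sketch of option 2: the wrapper exposes the wrapped estimator's
    parameters as its own, so a grid search can set them directly."""
    def __init__(self, estimator):
        self.estimator = estimator

    def get_params(self, deep=True):
        # Surface the wrapped estimator's parameters at the top level,
        # alongside the wrapper's own `estimator` parameter.
        params = {"estimator": self.estimator}
        params.update(self.estimator.get_params(deep=deep))
        return params

    def set_params(self, **params):
        if "estimator" in params:
            self.estimator = params.pop("estimator")
        # Everything else is passed through to the wrapped estimator.
        self.estimator.set_params(**params)
        return self


wrapped = PassthroughWrapper(Estimator())
wrapped.set_params(alpha=0.1)          # no 'estimator__' prefix needed
print(wrapped.get_params()["alpha"])   # 0.1
```

With this, param_grid={'alpha': [0.1, 10]} would address the wrapped estimator's alpha directly. The trade-off is that the wrapper's parameter namespace now merges with the inner estimator's, which departs from the usual scikit-learn meta-estimator convention of nested `estimator__alpha` names.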
