
SLEP016: parameter spaces on estimators #62

Open · wants to merge 14 commits into base: main

Conversation

jnothman (Member)

No description provided.

@amueller (Member) left a comment

That was quick! Some nitpicks and one question about allowing __. I think there are two important limitations: dependencies within an estimator and sharing parameters across estimators.
Basically this model assumes parameters and estimator grids are one-to-one.
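
To illustrate the sharing limitation concretely, here is a hypothetical sketch using the SLEP's proposed set_search_grid (not final API):

    from sklearn.decomposition import PCA
    from sklearn.feature_selection import SelectKBest
    from sklearn.pipeline import make_pipeline

    # Each estimator carries its own independent grid; there is no way to tie
    # the two parameters together (e.g. force n_components == k), so a search
    # enumerates 3 x 3 = 9 candidates rather than 3 linked ones.
    pipe = make_pipeline(
        PCA().set_search_grid(n_components=[2, 5, 10]),
        SelectKBest().set_search_grid(k=[2, 5, 10]),
    )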

@jnothman (Member Author) left a comment

Thanks a lot for that speedy and detailed first pass @amueller

@jnothman changed the title from "Partial draft of SLEP014: parameter spaces on estimators" to "Partial draft of SLEP016: parameter spaces on estimators" on Feb 2, 2022
@jnothman (Member Author)

Ready for review!

@adrinjalali (Member) left a comment

I think I prefer the Searchgrid design, but I'm taking your word on people preferring this design to that one.

@jnothman changed the title from "Partial draft of SLEP016: parameter spaces on estimators" to "SLEP016: parameter spaces on estimators" on Mar 19, 2022
@jnothman (Member Author)

Thanks for the prompt review @adrinjalali. At this point I'm not sure what to update based on your comments, but I am curious to understand better what you find attractive about the Searchgrid API relative to this.

@jnothman (Member Author)

Note to self: an alternative here might be a way to alias deep parameters at the metaestimator level...

@jnothman (Member Author)

Thanks for the review @amueller. I hope I find clarity of mind and time soon to make edits.

Overall, do you feel like we should have separate set_grid and set_distrib methods, or a combined set_param_space whose rvs would be rejected by a grid search?
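
For concreteness, the two shapes under discussion might look like this (hypothetical method names; scipy's loguniform is real):

    from scipy.stats import loguniform
    from sklearn.svm import SVC

    # Separate methods: grids for exhaustive search, rvs for randomised search
    svc_grid = SVC().set_grid(C=[0.1, 1, 10])
    svc_rvs = SVC().set_distrib(C=loguniform(1e-3, 1e3))

    # Combined method: GridSearchCV would reject the rvs form at fit time
    svc_any = SVC().set_param_space(C=loguniform(1e-3, 1e3))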

One thought that comes to mind is how to make this something that usefully integrates with external hyperparameter optimisation tools. It may be their responsibility to do so, but I wonder how we can assist in supporting them.

@aidiss commented Jul 21, 2023

I would like to propose an alternative approach.
The core idea is to use a mapping between an object and its parameter space, then iterate through the given pipeline to build the parameter grid.

    pipe = make_pipeline(DecisionTree())
    object_param_grid = {DecisionTree: {"max_depth": [1, 2]}}
    create_param_grid(pipe, object_param_grid) == {'decisiontree__max_depth': [1, 2]}

This is useful when the pipeline is not complex and there are no instances that use different params.

It is possible to change this a bit and use instance_to_space mapping.

    tree = DecisionTree()
    pipe = make_pipeline(tree)
    object_param_grid = {tree: {"max_depth": [1, 2]}}
    create_param_grid(pipe, object_param_grid) == {'decisiontree__max_depth': [1, 2]}

This would add more granularity, but would require instantiating transformers/estimators before generating the grid.

Here is source code that achieves the object-to-param-grid mapping.

from sklearn.compose import ColumnTransformer
from sklearn.pipeline import FeatureUnion, Pipeline


def _get_steps(object_):
    """Retrieve steps/transformers from Pipeline, FeatureUnion or ColumnTransformer objects."""
    object_to_step_name = {
        Pipeline: "steps",
        FeatureUnion: "transformer_list",
        ColumnTransformer: "transformers",
    }

    if step_name := object_to_step_name.get(object_.__class__):
        return list(getattr(object_, step_name))


def resolve(parent_object, object_param_grid, path="", param_grid=None):
    if param_grid is None:
        param_grid = {}

    steps = _get_steps(parent_object)
    if not steps:
        return param_grid

    # Steps are (name, estimator) pairs, or (name, estimator, columns) for
    # ColumnTransformer; any trailing elements are ignored.
    for child_object_path, child_object, *_ in steps:
        full_path = "__".join([path, child_object_path]).strip("__")
        child_object_class = child_object.__class__

        if object_params := object_param_grid.get(child_object_class):
            flattened_param_grid = {f"{full_path}__{k}": v for k, v in object_params.items()}
            param_grid = flattened_param_grid | param_grid

        # Recurse with the full "__"-joined path so nested steps get correct keys.
        param_grid = resolve(child_object, object_param_grid, path=full_path, param_grid=param_grid)
    return param_grid
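
A quick usage sketch of the helpers above, using the class-keyed variant (step names are generated by make_pipeline):

    from sklearn.decomposition import PCA
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC

    pipe = make_pipeline(PCA(), SVC())
    resolve(pipe, {PCA: {"n_components": [2, 5]}, SVC: {"C": [0.1, 1.0]}}) == {
        "pca__n_components": [2, 5],
        "svc__C": [0.1, 1.0],
    }  # True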

@jnothman (Member Author) commented Dec 27, 2023

I like several aspects of that proposal, @aidiss, though I'm not sure I'd worry about supporting classes in the first version. Specifically:

  • The entire parameter grid can be defined explicitly in one place in user code, or can be assembled from multiple places quite easily
  • The effect of this change could be localised to *SearchCV implementations, with such a dict being a drop-in replacement for the current param_grid or param_distributions. Therefore the change would not require a SLEP.
  • It may get rid of the need for a specialised design to handle RVs vs grids.

One thing I'm not sure about is how to manage errors. If, for instance, the user were to clone the estimator, we'd suddenly have no grid for it if it were defined in terms of instances, and that would not be obvious to the user. On the other hand, I think this is a flaw in all or several of the proposed designs.
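
To make the clone pitfall concrete (instance-keyed grid as sketched above; clone is real scikit-learn API):

    from sklearn.base import clone
    from sklearn.tree import DecisionTreeClassifier

    tree = DecisionTreeClassifier()
    object_param_grid = {tree: {"max_depth": [1, 2]}}  # keyed by instance identity

    # clone() returns a fresh instance, so an identity-based lookup silently
    # finds no grid for the cloned estimator:
    object_param_grid.get(clone(tree))  # -> None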

I'll otherwise need to think about whether there are aspects of searchgrid capability that could not be reproduced with this approach.

@jnothman (Member Author)

I've recalled, @aidiss, that your suggestion is pretty much the same as what I proposed in scikit-learn/scikit-learn#21784, although I admit that the .set interface is not particularly pleasant.

@jnothman (Member Author)

> I think I prefer the Searchgrid design, but I'm taking your word on people preferring this design to that one.

I wonder if you meant the GridFactory design, @adrinjalali? Searchgrid secretly sets attributes on estimators, so it's not a great functional/OOP design.

@jnothman (Member Author) commented Dec 29, 2023

I'd be keen to merge this and then have the team decide if it's the right proposal, or if we should go down one of the SLEP-free paths (e.g. GridFactory, or a lighter API variant of it like above).

We have a lot of the implementation already, and mostly need to come to a decision.

@adrinjalali interested in reviewing for merge?

@adrinjalali (Member) left a comment

Apart from two reservations, I quite like this now.

    These could be combined into a single method, such that
    :class:`~sklearn.model_selection.GridSearchCV` rejects a call to `fit` where `rvs`
    appear. This would make it harder to predefine search spaces that could be used
    for either exhaustive or randomised searches, which may be a use case in Auto-ML.
Member

In those cases, they'd need to have separate calls to set_search_grid and set_search_rvs anyway (unless I'm misunderstanding something). So in terms of practicality, they can still create two instances of the pipeline, one setting rvs and one setting the grid, but via a single set_search_space method.

@jnothman (Member Author)

What I'm trying to communicate here is that if we had a single set_search_space, an AutoML library that is set up to call it would have to choose between setting RVs and grids. But this presumes a lot of things about how an AutoML library using this API might look.

@jnothman (Member Author)

Not sure if I should change anything here.

@jnothman (Member Author)

Inserted that clarification into the text.

Comment on lines 309 to 311
Another possible alternative is to have `set_search_grid` update rather than
replace the existing search space, to allow for incremental construction. This is
likely to confuse users more than help.
Member

isn't this your proposal now?

@jnothman (Member Author)

Removed

Comment on lines +92 to +95
).set_search_grid(reduce_dim=[
    PCA(iterated_power=7).set_search_grid(n_components=N_FEATURES_OPTIONS),
    SelectKBest().set_search_grid(k=N_FEATURES_OPTIONS),
])
Member

One issue here is that the user still needs to know that the first step is called "reduce_dim", which is a caveat you mentioned in the motivation section. I'm not sure how to improve this.

I think this is an improvement over what we have, but it still feels quite verbose maybe? I'm not sure.

@jnothman (Member Author) commented Mar 20, 2024

The way to get around it is with more of a factory API for things like Pipeline, whether it's with a strict factory pattern:

pipeline = (PipelineFactory()
    .add(grid=[
        PCA(iterated_power=7).set_search_grid(n_components=N_FEATURES_OPTIONS),
        SelectKBest().set_search_grid(k=N_FEATURES_OPTIONS),
    ])
    .add(estimator=svc, name="classify")
).build()

or with a mutable Pipeline:

pipeline = Pipeline()
pipeline.add(grid=[
    PCA(iterated_power=7).set_search_grid(n_components=N_FEATURES_OPTIONS),
    SelectKBest().set_search_grid(k=N_FEATURES_OPTIONS),
])
pipeline.add(svc, name="classify")

But these solutions are clearly orthogonal to the current proposal.

Member

Thinking of a factory reminds me of the work @koaning is doing in the playground project.

I was wondering what you'd think about this proposal in relation to your experiments for creating pipelines, and how it could potentially fit there, @koaning.

@koaning commented May 23, 2024

I think you're referring to playtime? If so, that approach revolves around operators, so you might do stuff like:

pipeline = feats("age", "fare", "sibsp", "parch") + onehot("sex", "pclass")

That's meant to keep things super simple for folks with a modelling background who are light on programming. I am consciously avoiding hyperparameters in my experiment there, in an attempt to focus more on the data.

What folks are discussing here seems to be different. It seems to be a less "string"-y way to declare what hyperparams you would need to set in a gridsearch. So these two pieces of work seem to address different concerns.

That said, the string-y way of declaring things has never really bothered me that much. I usually found it much harder to deal with components that depend on each other. Things like "I want to impute values in my pipeline if the final estimator is a logistic regressor, but I want no imputing when the final estimator is a histogram boosted model". Would this be something we could tackle here as well? I may be able to come up with better examples that relate directly to hyperparameters instead of including/excluding estimators, but this aspect of dependencies within a pipeline has always been the one thing I found hard to tackle with sklearn automatically.
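
For context, the include/exclude dependency described above can already be expressed with today's param_grid as a list of sub-grids (real scikit-learn behaviour; the step names here are assumptions):

    from sklearn.ensemble import HistGradientBoostingClassifier
    from sklearn.impute import SimpleImputer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import GridSearchCV
    from sklearn.pipeline import Pipeline

    pipe = Pipeline([("impute", "passthrough"), ("clf", LogisticRegression())])
    # Each dict in the list is searched independently, so imputation is only
    # ever paired with the logistic-regression branch:
    param_grid = [
        {"impute": [SimpleImputer()], "clf": [LogisticRegression()]},
        {"impute": ["passthrough"], "clf": [HistGradientBoostingClassifier()]},
    ]
    search = GridSearchCV(pipe, param_grid)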


Maybe something with PCA and SelectKBest?

make_pipeline(PCA(n_components=n1), SelectKBest(k=n2))

Suppose that PCA has components going from n1=1..10 and SelectKBest also has n2=1..10. Then you are going to hit an issue when there is only 1 PCA component but SelectKBest wants to select 10. So this might be a "nice" example of a dependency based on hyperparams.
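
One workaround for such value-level dependencies under the current API (ParameterGrid is real scikit-learn; the step names are assumptions) is to enumerate the product and filter out invalid combinations before searching:

    from sklearn.model_selection import ParameterGrid

    grid = ParameterGrid({"pca__n_components": range(1, 11),
                          "selectkbest__k": range(1, 11)})
    # Keep only combinations where k does not exceed the PCA output dimension,
    # re-wrapping values as single-item lists so GridSearchCV accepts the result:
    valid = [{name: [value] for name, value in params.items()}
             for params in grid
             if params["selectkbest__k"] <= params["pca__n_components"]]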

@jnothman requested a review from adrinjalali on March 20, 2024
@adrinjalali (Member) left a comment

Two things that came to my mind this time:

  • What happens to classes such as LassoCV? Do they take the grid from themselves?
  • We have a bunch of classes which do a CV search, but not over the parameters of the child estimator, like TunedThresholdClassifierCV. We should find a way to make it very clear and discoverable to users that these do NOT search over a potentially provided parameter grid.

I'm still not convinced by having both .._grid and .._rvs method pairs, but I wouldn't want that to be a blocker here to move forward.

I'd be happy to merge this soon-ish, request for comment, and get to a vote after.


    space to remain constant regardless of whether ``reduce_dim`` is a feature
    selector or a PCA.

    This SLEP proposes to add methods to estimators that allow the user
Member

We're adding four methods here, aren't we? I'm not sure whether we want to introduce them here or after the example below. I'm generally okay with the text as is. Maybe we should just say here that we add four methods and leave the details for later.

Comment on lines +140 to +141
Note that this parameter space has no effect when the estimator's own
``fit`` method is called, but can be used by model selection utilities.
Member

Since I was thinking about HalvingSearchCV with @glemaitre, there's something tricky I notice here:

HalvingSearchCV's resource parameter can itself be one of the estimator's parameters, on top of the param_grid parameter. It's clear how this proposal interacts with param_grid, but not clear to me how it would interact with resource. This makes me think there might be some cases we're missing here?
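
For context, the real scikit-learn API being referred to looks like this; a grid stored on the estimator would somehow have to exclude the budgeted resource parameter:

    from sklearn.experimental import enable_halving_search_cv  # noqa: F401
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import HalvingGridSearchCV

    search = HalvingGridSearchCV(
        RandomForestClassifier(),
        param_grid={"max_depth": [3, 5, 10]},
        resource="n_estimators",  # budgeted by the search; must not appear in the grid
        max_resources=100,
    )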

Comment on lines +238 to +240
we might need to store the candidate grids and distributions in a known instance
attribute, or use a combination of `get_grid`, `get_distribution`, `get_params`
and `set_search_grid`, `set_search_rvs` etc. to perform `clone`.
Member

In the meantime, we have the configurable clone and also this for metadata routing:

    try:
        new_object._metadata_request = copy.deepcopy(estimator._metadata_request)
    except AttributeError:
        pass

So I think we can remove this paragraph since it's a non-issue at this point?

Comment on lines +354 to +356
``cv_results_`` is similarly affected by large changes to its keys when small
changes are made to the composite model structure. Future work could provide
tools to make ``cv_results_`` more accessible and invariant to model structure.
Member

Since this new way of providing the grid requires an explicit act by the user, changing the value of param_grid passed to search objects, I think we could change the structure of cv_results_ for those cases at the same time?
