Update _coordinate_descent.py #30416
base: main
Conversation
Purpose: Allow users to disable refitting of cross-validation estimators like LassoCV on the full training set after finding the best hyperparameters, to save computational resources.
Added refit parameter: introduced a refit parameter to the LassoCV class to control refitting behavior.
Conditional refitting: modified the fit method to refit the model only if refit=True.
Outcome: this change provides flexibility and efficiency, especially for large datasets, by allowing users to skip unnecessary refitting.
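For illustration only, a minimal usage sketch of the proposed option is shown below. It assumes the branch in this PR, where LassoCV accepts a refit keyword defaulting to True as described above; it will not run against released scikit-learn, and the behaviour of the fitted attributes when refit=False is an assumption, not something stated in this PR.

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.RandomState(0)
X = rng.randn(200, 20)
y = X[:, :3] @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.randn(200)

# Proposed option: skip the final refit on the full training set
# once the best alpha has been selected by cross-validation.
model = LassoCV(cv=5, refit=False).fit(X, y)

print(model.alpha_)     # best alpha selected by CV
print(model.mse_path_)  # per-fold MSE for each candidate alpha
# Presumably coef_ and intercept_ are not set when refit=False,
# since no final model is fitted on the full training set.
```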
❌ Linting issues: This PR is introducing linting issues. Here's a summary of the issues. Note that you can avoid having linting issues by enabling […]. You can see the details of the linting issues under the […].
Purpose
The goal was to allow users to disable the refitting of cross-validation estimators, such as LassoCV, on the full training set after finding the best hyperparameters. This feature is particularly useful for saving computational resources, especially with large datasets.
Changes Made
1. Added refit parameter: a refit parameter was added to the LassoCV class constructor, enabling users to control whether the model should be refitted on the entire training set.
2. Modified fit method: the fit method was updated to conditionally refit the model based on the refit parameter. If refit is set to True, the model is refitted on the full training set; otherwise, the final refit is skipped.
Code Changes
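The actual diff to _coordinate_descent.py is not reproduced here. As an illustration only, the following standalone sketch shows the kind of conditional-refit logic described above, using a plain Lasso and KFold rather than the scikit-learn internals touched by this PR; the function lasso_cv_fit and its signature are hypothetical.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.model_selection import KFold

def lasso_cv_fit(X, y, alphas, refit=True, cv=5):
    """Select the best alpha by K-fold CV; refit on the full data only if requested."""
    kf = KFold(n_splits=cv, shuffle=True, random_state=0)
    mse_path = np.empty((len(alphas), cv))
    for i, alpha in enumerate(alphas):
        for j, (train, test) in enumerate(kf.split(X)):
            model = Lasso(alpha=alpha).fit(X[train], y[train])
            mse_path[i, j] = np.mean((model.predict(X[test]) - y[test]) ** 2)
    best_alpha = alphas[int(np.argmin(mse_path.mean(axis=1)))]

    final_model = None
    if refit:
        # The step this PR makes optional: refitting on the full training set
        # with the alpha selected by cross-validation.
        final_model = Lasso(alpha=best_alpha).fit(X, y)
    return best_alpha, mse_path, final_model
```

For example, lasso_cv_fit(X, y, alphas=[0.01, 0.1, 1.0], refit=False) would return the selected alpha and the MSE path while skipping the final fit entirely.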
Outcome and Uses
The implemented feature allows users to disable the refitting of LassoCV on the full training set after finding the best hyperparameters, enhancing both flexibility and efficiency in model training.
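For context, this is similar in spirit to the refit option already exposed by GridSearchCV and RandomizedSearchCV, where refit=False keeps the cross-validation results but leaves no fitted final estimator. Under the behaviour described here, attributes such as alpha_ and mse_path_ would presumably remain available after calling fit with refit=False, while coef_ and intercept_ would only be set when refit=True.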