🚀 Feature Request
Motivation
I would like to introduce support for the `langevin=True` parameter in LightAutoML. This parameter enables Stochastic Gradient Langevin Boosting (SGLB), a powerful and efficient machine learning framework that handles a wide range of loss functions and comes with provable generalization guarantees. The method is based on a special form of the Langevin diffusion equation designed specifically for gradient boosting. This allows SGLB to guarantee global convergence even for multimodal loss functions, whereas standard gradient boosting algorithms can only guarantee convergence to a local optimum (see the SGLB paper).
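For reference, here is a minimal sketch of how the parameter is already used directly in CatBoost (the dataset and hyperparameter values are illustrative only, not part of this proposal):

```python
from catboost import CatBoostClassifier
from sklearn.datasets import make_classification

# Toy dataset, just for illustration.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# langevin=True switches CatBoost to SGLB mode; diffusion_temperature
# controls the amount of injected noise.
model = CatBoostClassifier(
    iterations=200,
    langevin=True,
    diffusion_temperature=10000,
    verbose=0,
)
model.fit(X, y)
```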
Proposal
I propose that LightAutoML support the `langevin=True` parameter during hyperparameter tuning of CatBoost models. This would allow users to leverage the benefits of SGLB when tuning CatBoost models with LightAutoML, as illustrated by the sketch below.
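As a rough illustration of what the tuner could do, here is a sketch of an Optuna objective that treats `langevin` as just another categorical hyperparameter. The search space, objective, and data split are hypothetical and do not reflect LightAutoML's actual tuning internals:

```python
import optuna
from catboost import CatBoostClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

def objective(trial: optuna.Trial) -> float:
    # Hypothetical search space: langevin is sampled like any other parameter.
    params = {
        "iterations": 200,
        "learning_rate": trial.suggest_float("learning_rate", 1e-3, 0.3, log=True),
        "depth": trial.suggest_int("depth", 4, 10),
        "langevin": trial.suggest_categorical("langevin", [True, False]),
        "verbose": 0,
    }
    if params["langevin"]:
        # Only meaningful in SGLB mode.
        params["diffusion_temperature"] = trial.suggest_float(
            "diffusion_temperature", 1e2, 1e5, log=True
        )
    model = CatBoostClassifier(**params)
    return cross_val_score(model, X, y, cv=3, scoring="roc_auc").mean()

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=20)
```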
Alternatives
As an alternative, users could set the `langevin` parameter manually when creating a CatBoost instance. However, this would be less convenient and less effective than having LightAutoML tune the parameter automatically, since the flag would be fixed rather than explored by the tuner.
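A minimal sketch of this manual workaround, assuming the tabular preset forwards the `cb_params` defaults unchanged to its underlying CatBoost models:

```python
from lightautoml.automl.presets.tabular_presets import TabularAutoML
from lightautoml.tasks import Task

# Manual workaround: set langevin once via the CatBoost default params.
# The flag is fixed for all CatBoost models and never explored by the tuner.
automl = TabularAutoML(
    task=Task("binary"),
    cb_params={"default_params": {"langevin": True}},
)
# oof_pred = automl.fit_predict(train_df, roles={"target": "y"})
```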
Additional context
I have successfully used the `langevin=True` parameter in a Kaggle competition. That experience showed me the potential benefits of this parameter, and I believe it would be valuable to have this feature in LightAutoML. More details: #5