I have a few questions about the implementation of cross-validation in NeuralForecast. Currently, I instantiate a NeuralForecast object with many Auto models and then run cross-validation with a defined number of windows. I pass the output of this cross-validation to a custom evaluation function, which gives me the best models. I want to take these best models, train them on all of the training data, and then use them for predictions, without changing any of the hyperparameters. How can I extract the model parameters so that calling fit does not replace them with another round of hyperparameter tuning?
Alternatively, I could first run fit, then extract the best parameters and run cross-validation on those models to find the best ones. However, in this case, should I exclude the data that will be used for the cross-validation forecasts from the training data, to ensure the forecasts are out of sample?
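For the second approach, one way to guarantee the cross-validation forecasts stay out of sample is to hold out the last `h * n_windows` observations of each series before fitting, and run the evaluation only on that holdout. A minimal sketch with pandas (`split_holdout` is my own helper, not part of NeuralForecast; the `unique_id`/`ds`/`y` column names just follow the library's long-format convention):

```python
import pandas as pd

def split_holdout(df, h, n_windows, id_col="unique_id", time_col="ds"):
    """Reserve the last h * n_windows observations of each series for
    cross-validation, so the training set never sees them."""
    holdout_len = h * n_windows
    df = df.sort_values([id_col, time_col])
    # Last holdout_len rows per series form the validation slice...
    valid = df.groupby(id_col, group_keys=False).tail(holdout_len)
    # ...and everything else stays in the training slice.
    train = df.drop(valid.index)
    return train, valid

# Example: two series of 24 monthly points; with h=3 and n_windows=2,
# the final 6 observations of each series are held out.
df = pd.DataFrame({
    "unique_id": ["a"] * 24 + ["b"] * 24,
    "ds": list(pd.date_range("2020-01-01", periods=24, freq="MS")) * 2,
    "y": range(48),
})
train, valid = split_holdout(df, h=3, n_windows=2)
```

You would then fit on `train`, evaluate the rolling forecasts against `valid`, and finally refit the winning configurations on the full `df` before producing production forecasts.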