In the current implementation, we sample the hyperparameters of the Gaussian process and average the acquisition function across those samples, while the training data stay fixed. In BoTorch, the hyperparameters are fixed instead and the observations y at the training points are sampled, with the acquisition function averaged over those samples.
I think both ideas can be combined, allowing the user to request:
- Holding both the hyperparameters and the observations fixed: this will yield the classical acquisition functions for the noiseless case.
- Sampling only the observations y: this will yield the noisy versions of the acquisition functions, assuming that the hyperparameters are accurate.
- Sampling only the hyperparameters: this will yield the current behavior, where we are robust to model misspecification.
- Sampling both the hyperparameters and the observations: this combines the strengths of both approaches (a sketch follows this list).
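To make the four options concrete, here is a rough sketch of how the averaging could be wired up on top of a plain scikit-learn `GaussianProcessRegressor`. The function name `averaged_ei` and the parameters `hyper_samples` and `n_fantasies` are hypothetical and do not correspond to bask's actual API; leaving both at their defaults gives the classical noiseless EI, while supplying hyperparameter samples and/or a positive number of fantasies gives the other three modes.

```python
# A minimal sketch, not bask's actual API: `averaged_ei`, `hyper_samples`
# and `n_fantasies` are hypothetical names used only for illustration.
import numpy as np
from scipy.stats import norm
from sklearn.base import clone


def expected_improvement(gp, X_cand, y_best, xi=0.01):
    """Classical (noiseless) EI for minimization."""
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.clip(sigma, 1e-12, None)
    z = (y_best - mu - xi) / sigma
    return (y_best - mu - xi) * norm.cdf(z) + sigma * norm.pdf(z)


def averaged_ei(gp, X_train, y_train, X_cand,
                hyper_samples=None, n_fantasies=0, rng=0):
    """Average EI over hyperparameter samples and/or fantasized observations.

    hyper_samples: iterable of kernel theta vectors (e.g. from an MCMC run),
                   or None to keep the fitted hyperparameters fixed.
    n_fantasies:   0 keeps the observed y fixed; otherwise that many posterior
                   samples of y at X_train are drawn and the model is refit.
    """
    thetas = [gp.kernel_.theta] if hyper_samples is None else hyper_samples
    values = []
    for theta in thetas:
        # Refit with fixed hyperparameters (optimizer=None skips ML-II).
        gp_theta = clone(gp).set_params(
            kernel=gp.kernel_.clone_with_theta(theta), optimizer=None)
        gp_theta.fit(X_train, y_train)
        if n_fantasies == 0:
            values.append(expected_improvement(gp_theta, X_cand, y_train.min()))
            continue
        # Fantasize observations at the training inputs and refit on them.
        y_fantasies = gp_theta.sample_y(X_train, n_samples=n_fantasies,
                                        random_state=rng)
        for y_f in y_fantasies.T:
            gp_f = clone(gp_theta).set_params(optimizer=None)
            gp_f.fit(X_train, y_f)
            values.append(expected_improvement(gp_f, X_cand, y_f.min()))
    return np.mean(values, axis=0)
```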
The steps required are:
- Implement a fast method for cloning the GP model using the same or different hyperparameters (see the sketch below).
Referenced code: `bayes-skopt/bask/acquisition.py`, line 48 at commit `8f1daf9`.
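For the cloning step, something along the lines of the snippet below might already be sufficient, assuming the model follows the scikit-learn `GaussianProcessRegressor` interface. `clone_gp_with_theta` is a hypothetical helper, not existing bask code.

```python
# Hypothetical helper, not existing bask code: clones a fitted
# scikit-learn-style GP, optionally swapping in different kernel
# hyperparameters, and recomputes the posterior without re-optimizing.
from sklearn.base import clone


def clone_gp_with_theta(gp, X_train, y_train, theta=None):
    """Return a fitted copy of `gp`, reusing or replacing its hyperparameters.

    theta: log-transformed kernel hyperparameters; None keeps the fitted ones.
    """
    kernel = gp.kernel_ if theta is None else gp.kernel_.clone_with_theta(theta)
    new_gp = clone(gp).set_params(kernel=kernel, optimizer=None)
    # With optimizer=None, fit() skips hyperparameter optimization and only
    # recomputes the Cholesky factor of the kernel matrix.
    new_gp.fit(X_train, y_train)
    return new_gp
```

Avoiding even the Cholesky recomputation when the hyperparameters are unchanged would require caching below the scikit-learn API, which could be a follow-up optimization.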