- Add expected_minimum_random_sampling
- New plot examples
- Add more parameters to plot_objective
- Return ordereddict in point_asdict
- update_next() and get_results() added to Optimizer
- Fix searchcv rank (issue #831)
- Fix random forest regressor (issue #766)
- Fix doc examples
- Fix integer normalize by using round()
- Fix random forest regressor (Add missing min_impurity_decrease)
- Fix license detection in github
- Add doctest to CI
- Sphinx documentation
- notebooks are replaced by sphinx-gallery
- New StringEncoder, can be used in Categoricals
- Remove string conversion in Identity
- dtype can be set in Integer and Real
- Fix categorical space (issue #821)
- int can be set as dtype to fix issue #790
- Old pdoc scripts are removed and replaced by sphinx
- Models queue now has a customizable size (model_queue_size).
- Add log-uniform prior to Integer space
- Support for plotting categorical dimensions
- Allow BayesSearchCV to work with sklearn 0.21
- Reduce the amount of deprecation warnings in unit tests
- joblib instead of sklearn.externals.joblib
- Improve travis CI unit tests (different sklearn versions are checked)
- Added `versioneer` support, to keep things simple and to fix pypi deploy
Highly composite six.

- `plot_regret` function for plotting the cumulative regret; the purpose of such a plot is to assess how effective an optimizer is at picking good points.
- `CheckpointSaver` that can be used to save a checkpoint after each iteration with `skopt.dump`
- `Space.from_yaml()` to allow for an external file to define Space parameters
- Fixed numpy broadcasting issues in gaussian_ei, gaussian_pi
- Fixed build with newest scikit-learn
- Use native python types inside BayesSearchCV
- Include fit_params in BayesSearchCV refit
- Added `versioneer` support, to reduce the changes needed when releasing a new version of `skopt`
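A sketch of `CheckpointSaver` from `skopt.callbacks`; the toy objective and checkpoint path are placeholders:

```python
from skopt import gp_minimize, load
from skopt.callbacks import CheckpointSaver

def objective(x):
    # toy objective; stands in for an expensive function
    return (x[0] - 0.5) ** 2

# Saves the current result with skopt.dump after every iteration;
# extra keyword arguments are forwarded to skopt.dump.
saver = CheckpointSaver("./checkpoint.pkl", store_objective=False)
res = gp_minimize(objective, [(-2.0, 2.0)], n_calls=10,
                  callback=[saver], random_state=0)

restored = load("./checkpoint.pkl")  # resume point for a later run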
- Separated `n_points` from `n_jobs` in `BayesSearchCV`.
- Dimensions now support boolean np.arrays.
- `matplotlib` is now an optional requirement (install with `pip install 'scikit-optimize[plots]'`)
High five!
- Single element dimension definition, which can be used to fix the value of a dimension during optimization.
- `total_iterations` property of `BayesSearchCV` that counts total iterations needed to explore all subspaces.
- Add iteration event handler for `BayesSearchCV`, useful for early stopping inside `BayesSearchCV` search loop.
- Added `utils.use_named_args` decorator to help with unpacking named dimensions when calling an objective function.
- Removed redundant estimator fitting inside `BayesSearchCV`.
- Fixed the log10 transform for Real dimensions that would lead to values being out of bounds.
Go forth!
- Support early stopping of optimization loop.
- Benchmarking scripts to evaluate performance of different surrogate models.
- Support for parallel evaluations of the objective function via several constant liar strategies.
- BayesSearchCV as a drop-in replacement for scikit-learn's GridSearchCV.
- New acquisition functions "EIps" and "PIps" that take into account function compute time.
- Fixed inference of dimensions of type Real.
- Change interface of GradientBoostingQuantileRegressor's predict method to match return type of other regressors
- Dimensions of type Real are now inclusive of upper bound.
Third time's a charm.
- Accuracy improvements of the optimization of the acquisition function by pre-selecting good candidates as starting points when using `acq_optimizer='lbfgs'`.
- Support an ask-and-tell interface. Check out the `Optimizer` class if you need fine-grained control over the iterations.
- Parallelize L-BFGS minimization runs over the acquisition function.
- Implement weighted Hamming distance kernel for problems with only categorical dimensions.
- New acquisition function `gp_hedge` that probabilistically chooses one of `EI`, `PI` or `LCB` at every iteration depending upon the cumulative gain.
- Warnings are now raised if a point is chosen as the candidate optimum multiple times.
- Infinite gradients that were raised in the kernel gradient computation are now fixed.
- Integer dimensions are now normalized to [0, 1] internally in `gp_minimize`.
- The default `acq_optimizer` function has changed from `"auto"` to `"lbfgs"` in `gp_minimize`.
- Speed improvements when using `gp_minimize` with `acq_optimizer='lbfgs'` and `acq_optimizer='auto'` when all the search-space dimensions are Real.
- Persistence of minimization results using `skopt.dump` and `skopt.load`.
- Support for using arbitrary estimators that implement a `return_std` argument in their `predict` method by means of `base_minimize` from `skopt.optimizer`.
- Support for tuning noise in `gp_minimize` using the `noise` argument.
- `TimerCallback` in `skopt.callbacks` to log the time between iterations of the minimization loop.
First light!
- Bayesian optimization via `gp_minimize`.
- Tree-based sequential model-based optimization via `forest_minimize` and `gbrt_minimize`, with support for multi-threading.
- Support of LCB, EI and PI as acquisition functions.
- Plotting functions for inspecting convergence, evaluations and the objective function.
- API for specifying and sampling from a parameter space.
- Plotting functions for inspecting convergence, evaluations and the objective function.
- API for specifying and sampling from a parameter space.
See AUTHORS.md.