Merge pull request #73 from automl/development
Release v0.1.0
Bronzila authored Feb 17, 2024
2 parents 7f9d18e + 072ab54 commit 0ce86c6
Showing 15 changed files with 749 additions and 16 deletions.
13 changes: 11 additions & 2 deletions CHANGELOG.md
@@ -5,7 +5,15 @@
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/)
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [Unreleased]
## [0.1.0] - 2024-02-15

### Added
- Configuration IDs to improve logging and reproducibility (#62)
- Ask and Tell interface (#36)
- Examples for Ask and Tell interface

### Changed
- Renamed `budget` to `fidelity` across the interface for clarity

## [0.0.7] - 2023-08-23

@@ -58,6 +66,7 @@
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
### Added
- Initial project release and push to PyPI

[unreleased]: https://github.com/automl/DEHB/compare/v0.0.7...master
[unreleased]: https://github.com/automl/DEHB/compare/v0.1.0...master
[0.1.0]: https://github.com/automl/DEHB/compare/v0.0.7...v0.1.0
[0.0.7]: https://github.com/automl/DEHB/compare/v0.0.6...v0.0.7
[0.0.6]: https://github.com/automl/DEHB/releases/tag/v0.0.6
24 changes: 23 additions & 1 deletion README.md
@@ -19,7 +19,8 @@
pip install -e DEHB  # -e stands for editable, lets you modify the code and rerun things
### Tutorials/Example notebooks

* [00 - A generic template to use DEHB for multi-fidelity Hyperparameter Optimization](examples/00_interfacing_DEHB.ipynb)
* [01 - Using DEHB to optimize 4 hyperparameters of Scikit-learn's Random Forest on a classification dataset](examples/01_Optimizing_RandomForest_using_DEHB.ipynb)
* [01.1 - Using DEHB to optimize 4 hyperparameters of Scikit-learn's Random Forest on a classification dataset](examples/01.1_Optimizing_RandomForest_using_DEHB.ipynb)
* [01.2 - Using DEHB to optimize 4 hyperparameters of Scikit-learn's Random Forest on a classification dataset using the Ask & Tell interface](examples/01.2_Optimizing_RandomForest_using_Ask_Tell.ipynb)
* [02 - Optimizing Scikit-learn's Random Forest without using ConfigSpace to represent the hyperparameter space](examples/02_using%20DEHB_without_ConfigSpace.ipynb)
* [03 - Hyperparameter Optimization for MNIST in PyTorch](examples/03_pytorch_mnist_hpo.py)

@@ -32,6 +33,27 @@
```
python examples/03_pytorch_mnist_hpo.py \
--verbose
```

#### Ask & Tell interface
DEHB lets users either drive the optimization loop themselves via the Ask & Tell interface, distributing evaluations manually, or rely on the built-in `run` function, which sets up a Dask cluster autonomously.
The Ask & Tell interface can be used as follows:
```python
from dehb import DEHB

optimizer = DEHB(
    f=your_target_function,  # optional for Ask & Tell, but required to call 'run' later
    cs=config_space,
    dimensions=dimensions,
    min_fidelity=min_fidelity,
    max_fidelity=max_fidelity)

# Ask for the next configuration to run
job_info = optimizer.ask()

# Evaluate the configuration at the given fidelity. Here you are free to
# distribute the computation to any worker you like.
result = your_target_function(config=job_info["config"], fidelity=job_info["fidelity"])

# Once you have the result, feed it back to the optimizer
optimizer.tell(job_info, result)
```
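
For context, here is a minimal sketch of what `your_target_function` and a full Ask & Tell loop might look like. The toy objective, the iteration count, and the random fitness are illustrative assumptions; the only firm requirement, as in the examples below, is that the function return a dict with `fitness` and `cost` keys:

```python
import numpy as np

def your_target_function(config, fidelity, **kwargs):
    # Toy objective: replace the random value with e.g. a validation loss
    # evaluated at the given fidelity. DEHB minimizes "fitness".
    fitness = np.random.uniform()
    cost = fidelity  # replace with the actual cost of the evaluation
    return {"fitness": fitness, "cost": cost}

# Manual optimization loop: ask for a job, evaluate it wherever you like,
# then report the result back.
for _ in range(20):
    job_info = optimizer.ask()
    result = your_target_function(config=job_info["config"], fidelity=job_info["fidelity"])
    optimizer.tell(job_info, result)
```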

### Running DEHB in a parallel setting

DEHB has been designed to interface with a [Dask client](https://distributed.dask.org/en/latest/api.html#distributed.Client).
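
As a rough sketch of the parallel setup (assuming the same `objective_function`, `cs`, and `dim` as in the single-worker guide, and an illustrative worker count of 4):

```python
from dehb import DEHB

# With n_workers > 1, DEHB sets up a Dask cluster autonomously and
# distributes function evaluations across the workers.
optimizer = DEHB(
    f=objective_function,
    cs=cs,
    dimensions=dim,
    min_fidelity=3,
    max_fidelity=27,
    eta=3,
    n_workers=4,  # illustrative worker count
    output_path="./logs",
)

traj, runtime, history = optimizer.run(brackets=1, verbose=True)
```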
1 change: 1 addition & 0 deletions docs/examples/00_interfacing_DEHB.ipynb
1 change: 1 addition & 0 deletions docs/examples/02_using_DEHB_without_ConfigSpace.ipynb
4 changes: 2 additions & 2 deletions docs/getting_started/parallel.md
@@ -64,8 +64,8 @@
bash utils/run_dask_setup.sh \
sleep 5

python examples/03_pytorch_mnist_hpo.py \
--min_budget 1 \
--max_budget 3 \
--min_fidelity 1 \
--max_fidelity 3 \
--runtime 60 \
--scheduler_file dask_dump/scheduler.json \
--verbose
14 changes: 7 additions & 7 deletions docs/getting_started/single_worker.md
@@ -12,16 +12,16 @@
Next, we need an `objective_function`, which we are aiming to optimize:
```python exec="true" source="material-block" result="python" title="Objective Function" session="someid"
import numpy as np

def objective_function(x: Configuration, budget: float, **kwargs):
def objective_function(x: Configuration, fidelity: float, **kwargs):
    # Replace this with your actual objective value (y) and cost.
    cost = (10 if x["x1"] == "red" else 100) + budget
    cost = (10 if x["x1"] == "red" else 100) + fidelity
    y = x["x0"] + np.random.uniform()
    return {"fitness": y, "cost": cost}

sample_config = cs.sample_configuration()
print(sample_config)

result = objective_function(sample_config, budget=10)
result = objective_function(sample_config, fidelity=10)
print(result)
```
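
The snippet above assumes a configuration space `cs` defined earlier in the guide. A minimal sketch of one consistent with the objective, with a float `x0` and a categorical `x1` that includes `"red"`, could look as follows; the exact bounds and category names are illustrative assumptions:

```python
from ConfigSpace import Configuration, ConfigurationSpace

# Illustrative bounds and categories; the actual guide defines cs earlier.
cs = ConfigurationSpace(
    {
        "x0": (0.0, 10.0),       # float hyperparameter
        "x1": ["red", "green"],  # categorical hyperparameter
    }
)
dim = len(cs.get_hyperparameters())
```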

@@ -35,18 +35,18 @@
optimizer = DEHB(
    f=objective_function,
    cs=cs,
    dimensions=dim,
    min_budget=3,
    max_budget=27,
    min_fidelity=3,
    max_fidelity=27,
    eta=3,
    n_workers=1,
    output_path="./logs",
)

# Run optimization for 1 bracket. Output files will be saved to ./logs
traj, runtime, history = optimizer.run(brackets=1, verbose=True)
config, fitness, runtime, budget, _ = history[0]
config, fitness, runtime, fidelity, _ = history[0]
print("config", config)
print("fitness", fitness)
print("runtime", runtime)
print("budget", budget)
print("fidelity", fidelity)
```
1 change: 1 addition & 0 deletions examples/00_interfacing_DEHB.ipynb
@@ -4,6 +4,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Interfacing DEHB\n",
"#### How to read this notebook\n",
"\n",
"This notebook is designed to serve as a high-level, highly abstracted view of DEHB and how it can be used. The examples here are mere placeholders and *only* offer an interface to run DEHB on toy or actual problems.\n",
1 change: 1 addition & 0 deletions examples/01.1_Optimizing_RandomForest_using_DEHB.ipynb
@@ -4,6 +4,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Optimizing RandomForest using DEHB\n",
"This notebook aims to build on the template from `00_interfacing_DEHB` and use it on an actual problem, to optimize the hyperparameters of a Random Forest model, for a dataset.\n",
"\n",
"Additional requirements:\n",