
Add Hyperopt example for BERT classifier #186

Merged · 7 commits · Sep 6, 2019
5 changes: 5 additions & 0 deletions docs/code/data.rst
@@ -130,6 +130,11 @@ Data Loaders
Data Iterators
===============

:hidden:`Batch`
~~~~~~~~~~~~~~~~~~~~~~~~
.. autoclass:: texar.torch.data.Batch
:members:

:hidden:`DataIterator`
~~~~~~~~~~~~~~~~~~~~~~~~
.. autoclass:: texar.torch.data.DataIterator
24 changes: 24 additions & 0 deletions examples/bert/README.md
@@ -15,6 +15,8 @@ To summarize, this example showcases:
* Building and fine-tuning on downstream tasks
* Use of Texar `RecordData` module for data loading and processing
* Use of Texar `Executor` module for simplified training loops and TensorBoard visualization
* Use of the [Hyperopt](https://github.com/hyperopt/hyperopt) library to tune hyperparameters with
  the `Executor` module

Future work:

@@ -178,3 +180,25 @@ tensorboard --logdir runs/
```

![Visualizing loss/accuracy on TensorBoard](tbx.png)

## Hyperparameter tuning with Executor

To run this example, first install `hyperopt`:

```commandline
pip install hyperopt
```

`bert_with_hypertuning_main.py` shows an example of how to tune hyperparameters with `Executor`
using `hyperopt`. To run this example, use the following command:

```commandline
python bert_with_hypertuning_main.py
```

In this simple example, the hyperparameters to be tuned are provided as a `dict` in
`bert_hypertuning_config_classifier.py`, which is fed into `objective_func()`. We use the TPE
(Tree-structured Parzen Estimator) algorithm, provided by the `hyperopt` library, to tune the
hyperparameters. The example runs for 3 trials to find the best hyperparameter settings. The
final model is saved in the `output_dir` provided by the user. More information about the
library can be found at [Hyperopt](https://github.com/hyperopt/hyperopt).
2 changes: 1 addition & 1 deletion examples/bert/bert_classifier_main.py
@@ -56,7 +56,7 @@
config_downstream = importlib.import_module(args.config_downstream)
config_downstream = {
k: v for k, v in config_downstream.__dict__.items()
if not k.startswith('__')}
if not k.startswith('__') and k != "hyperparams"}

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

2 changes: 1 addition & 1 deletion examples/bert/bert_classifier_using_executor_main.py
@@ -65,7 +65,7 @@
config_downstream = importlib.import_module(args.config_downstream)
config_downstream = {
k: v for k, v in config_downstream.__dict__.items()
if not k.startswith('__')}
if not k.startswith('__') and k != "hyperparams"}

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

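The one-line change applied in both classifier entry points above can be sketched in isolation: a config module's public attributes are collected into a `dict`, and the new `hyperparams` entry (used only by the tuning script) is excluded so it is not passed on as a classifier option. The in-memory module below is a hypothetical stand-in for the imported config module.

```python
# Sketch of the config-filtering pattern changed in this PR.
# A real run imports the config via importlib; here we build a
# module in memory for illustration.
import types

config_downstream = types.ModuleType("config_downstream")
config_downstream.num_classes = 2                           # a normal config value
config_downstream.hyperparams = {"optimizer.Adam.lr": None}  # tuning-only entry

config = {
    k: v for k, v in config_downstream.__dict__.items()
    if not k.startswith('__') and k != "hyperparams"}
print(config)  # {'num_classes': 2} -- dunders and 'hyperparams' filtered out
```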