Hydro is a system that automatically applies hyperparameter transfer theory together with multiple system-level techniques to jointly improve tuning efficiency. To learn more about how Hydro works, please refer to our paper.
We highlight three key features of Hydro:
🚀 Efficient Tuning. Hydro scales down the model size and fuses multiple trials, which significantly improves training throughput and hardware efficiency (see the conceptual sketch after this list).
✨ Automatic Pipeline. Hydro streamlines the surrogate model generation process and seamlessly integrates with Ray, offering a user-friendly solution for tuning.
🎉 Quality Maintenance. Hydro typically preserves the tuned model quality well, even though a scaled and fused surrogate model is used for tuning.
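To make the scaling idea concrete, below is a minimal, purely illustrative PyTorch sketch. This is not Hydro's actual API: Hydro automates the surrogate construction and additionally fuses multiple scaled-down trials for batched execution. The model, widths, and scaling ratio here are hypothetical.

```python
# Conceptual sketch only -- not Hydro's actual API.
# Idea: search hyperparameters on a width-reduced surrogate model, then reuse
# the best configuration on the full-size target model (hyperparameter transfer).
import torch.nn as nn


def build_mlp(width: int, num_classes: int = 10) -> nn.Module:
    """Same architecture family; only the hidden width differs."""
    return nn.Sequential(
        nn.Flatten(),
        nn.Linear(28 * 28, width),
        nn.ReLU(),
        nn.Linear(width, num_classes),
    )


target_width = 1024
scaling_ratio = 8  # hypothetical ratio, chosen here only for illustration

surrogate = build_mlp(target_width // scaling_ratio)  # cheap model used during the search
full_model = build_mlp(target_width)  # trained once with the hyperparameters found above
# Hydro further fuses many such surrogate trials into a single fused model,
# so that multiple hyperparameter configurations can be trained together on one GPU.
```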
Refer to Getting started for complete instructions on environment setup, installation, and integration.
Hydro is published as a Python package and can be installed with pip directly.
```bash
pip install hydro-tune
```
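As a quick sanity check after installation (this assumes the `hydro-tune` package installs a top-level module named `hydro`; adjust the import if the actual module name differs):

```python
# Post-install sanity check.
# Assumption: `pip install hydro-tune` provides a top-level `hydro` module;
# if the import name differs, adjust accordingly.
import hydro

print("Hydro imported from:", hydro.__file__)
```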
We also provide a Docker image fully equipped with all dependencies and environments.
```bash
docker pull tonyhao96/hydro
```
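If the host has the NVIDIA Container Toolkit set up, the image can typically be started with something like `docker run --gpus all -it tonyhao96/hydro` (flags may need adjusting for your environment).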
We provide working examples for end-to-end hyperparameter tuning inside the `examples` folder:
- `vision`: Image Classification Example
- `language`: Language Modeling Example
Please check the `osdi23-artifact` branch for the artifact evaluation version of Hydro.
If you find this code useful in your research, please consider citing:
```bibtex
@inproceedings{Hydro,
  author    = {Qinghao Hu and Zhisheng Ye and Meng Zhang and Qiaoling Chen and Peng Sun and Yonggang Wen and Tianwei Zhang},
  title     = {Hydro: {Surrogate-Based} Hyperparameter Tuning Service in Datacenters},
  booktitle = {17th USENIX Symposium on Operating Systems Design and Implementation},
  year      = {2023},
  publisher = {USENIX Association},
  series    = {OSDI '23}
}
```
Hydro is built upon many fabulous open-source repositories, including
ray | mup | hfta | pytorch | transformers