Diversified Batch Selection for Training Acceleration


by Feng Hong, Yueming Lyu, Jiangchao Yao, Ya Zhang, Ivor W. Tsang, and Yanfeng Wang at SJTU, A*STAR, Shanghai AI Lab, and NTU.

International Conference on Machine Learning (ICML), 2024.

This repository is the official PyTorch implementation of DivBS.

⚠️ This repository is still being organized and updated; please note that this version is not the final release.

Citation

If you find our work inspiring or use our codebase in your research, please consider giving a star ⭐ and a citation.

@inproceedings{hong2024diversified,
  title={Diversified Batch Selection for Training Acceleration},
  author={Feng Hong and Yueming Lyu and Jiangchao Yao and Ya Zhang and Ivor Tsang and Yanfeng Wang},
  booktitle={ICML},
  year={2024}
}

Environment

Create the environment for running our code:

conda create --name DivBS python=3.7.10
conda activate DivBS
pip install -r requirements.txt

Data Preparation

For CIFAR datasets, the data will be automatically downloaded by the code.
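
For reference, automatic CIFAR downloads are typically handled through torchvision's dataset classes; a minimal sketch (the data root used by the actual code may differ):

# Minimal sketch: torchvision downloads CIFAR-10 on first use.
# The './data' root is illustrative; the repo's configs may point elsewhere.
from torchvision import datasets

train_set = datasets.CIFAR10(root='./data', train=True, download=True)
test_set = datasets.CIFAR10(root='./data', train=False, download=True)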

For Tiny-ImageNet, please download the dataset from here and unzip it into the _TINYIMAGENET folder. Then run the following commands to prepare the data:

cd _TINYIMAGENET
python val_folder.py
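
val_folder.py reorganizes the flat validation split into per-class folders so it can be read by standard folder-based loaders. A minimal sketch of that step, assuming the standard Tiny-ImageNet layout (val/images plus a tab-separated val/val_annotations.txt); the shipped script may differ in details:

# Sketch only, assuming the standard Tiny-ImageNet layout;
# the repository's val_folder.py may differ.
import os
import shutil

val_dir = 'val'
img_dir = os.path.join(val_dir, 'images')

# Each line of val_annotations.txt maps an image file to its class id.
with open(os.path.join(val_dir, 'val_annotations.txt')) as f:
    for line in f:
        fname, cls = line.split('\t')[:2]
        cls_dir = os.path.join(val_dir, cls, 'images')
        os.makedirs(cls_dir, exist_ok=True)
        shutil.move(os.path.join(img_dir, fname), os.path.join(cls_dir, fname))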

Running

CUDA_VISIBLE_DEVICES=0 python main.py --cfg cfg/cifar10_DivBS_01.yaml --seed 0 --wandb_not_upload 

The --wandb_not_upload flag is optional; it keeps wandb log files locally instead of uploading them to the wandb cloud.
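
For context, keeping wandb logs local is usually achieved by initializing the run in offline mode. A hedged sketch of how such a flag can be wired up; the actual wiring inside main.py is an assumption:

# Sketch only: mapping a --wandb_not_upload flag onto wandb's offline mode.
# main.py may implement this differently.
import argparse
import wandb

parser = argparse.ArgumentParser()
parser.add_argument('--wandb_not_upload', action='store_true')
args, _ = parser.parse_known_args()

# Offline runs are written to ./wandb and can be synced later with `wandb sync`.
wandb.init(project='DivBS', mode='offline' if args.wandb_not_upload else 'online')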

Contact

If you have any problem with this code, please feel free to contact [email protected].
