Training time issue #1
Comments
Hi @Han8931. Can you provide more detail on how you are training your model (e.g. dataset size, attack parameters, etc.)?
I just ran a model with the provided configurations (BERT for IMDb). Even a single run seems to take around 8 hours; in total it took more than two days.
I can confirm this problem. I tried SNLI with A2T on a 2080 Ti (batch size 12): the first clean epoch took 7 hours, and generation with A2T was estimated to take 21 hours. I tried again on a 3090 (batch size 32), and the first clean epoch still took 3 hours.

Found the problem. In

Is there any specific demand for padding to max length? I'll bring this issue to
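The slowdown described above is consistent with every batch being padded to a fixed maximum sequence length rather than to the longest sequence in each batch. A minimal sketch of the difference (the batch lengths below are hypothetical, not measured from this repo):

```python
def padded_tokens(batch_lengths, max_length=None):
    """Tokens processed for one batch after padding.

    If max_length is given, every sequence is padded to that fixed length;
    otherwise the batch is padded only to its own longest sequence.
    """
    target = max_length if max_length is not None else max(batch_lengths)
    return target * len(batch_lengths)

# Hypothetical batch of token lengths; most sequences are far shorter than 512.
batch = [120, 95, 310, 60, 180, 75, 220, 130]

fixed = padded_tokens(batch, max_length=512)   # pad-to-max-length: 4096 tokens
dynamic = padded_tokens(batch)                 # pad-to-longest-in-batch: 2480 tokens

print(fixed, dynamic, round(fixed / dynamic, 2))
```

If the training code uses a Hugging Face tokenizer, this is the difference between `padding="max_length"` and `padding="longest"` (the default when `padding=True`); switching to dynamic padding should reduce wasted compute proportionally.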
Hi, I am training a RoBERTa model with A2T, but it seems to take a really long time.
Is it normal for training to take this long?