This repository has been archived by the owner on May 28, 2021. It is now read-only.

The batch didn't split when using multi-GPU #38

Open
1453042287 opened this issue Dec 17, 2018 · 1 comment

Comments

@1453042287

When I use two GPUs and set batch_size=16, I found that the batch is 16 per GPU, not 8. Why?

@s-JoL

s-JoL commented Jan 30, 2019

config["batch_size"] *= len(config["parallels"])
batch size means batch size per device
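For context, PyTorch's nn.DataParallel splits the input along dimension 0 across the listed GPUs, so after the scaling above a loader batch of 32 is scattered as 16 samples per GPU. A minimal sketch of that behaviour, assuming an illustrative config and a toy model rather than this repo's actual training loop:

import torch
import torch.nn as nn

# Illustrative config: per-device batch of 16, two GPUs (assumed keys).
config = {"batch_size": 16, "parallels": [0, 1]}

# Same scaling as above: the total batch handed to the DataLoader becomes 32.
config["batch_size"] *= len(config["parallels"])

# Toy model wrapped in DataParallel over the two devices.
model = nn.DataParallel(nn.Linear(8, 4), device_ids=config["parallels"]).cuda()

# DataParallel scatters dim 0 across device_ids, so each GPU receives
# config["batch_size"] / len(config["parallels"]) = 16 samples per forward pass.
inputs = torch.randn(config["batch_size"], 8).cuda()
outputs = model(inputs)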
