Possible to Swap VRAM out into system RAM to simulate Multi-GPU? #472

Open
NaLG opened this issue Feb 23, 2019 · 0 comments

As similarly noted in /issues/410, I've noticed that in multi-GPU runs only one GPU seems to be processing at a time, while the memory is split between all of them. In my runs (a 3000x2000 pixel image split across 4 GPUs), which GPU is active changes only a handful of times.

How feasible would it be to have a swap space in system memory (or even the pagefile) for multiple 'virtual' GPUs? The single active 'virtual' GPU would have its VRAM loaded in from swap while its layers are being updated, then swapped back out to make room for the next 'virtual' GPU's memory.
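To make the proposal concrete, here is a minimal sketch of the control flow I have in mind. It is not neural-style code; all names (`VirtualGpu`, `swap_in`, `swap_out`, `step_all`) are hypothetical, and plain Python dicts stand in for device memory and host RAM:

```python
# Hypothetical sketch: N "virtual GPUs" time-share one physical device.
# A dict stands in for device memory; each VirtualGpu parks its parameters
# in host memory (system RAM / pagefile) while it is not resident.

class VirtualGpu:
    def __init__(self, gpu_id, layers):
        self.gpu_id = gpu_id
        self.host_copy = dict(layers)  # parameters parked in system RAM
        self.resident = False

def swap_in(vgpu, device_slot):
    """Load this virtual GPU's parameters into the single real device."""
    device_slot.clear()
    device_slot.update(vgpu.host_copy)
    vgpu.resident = True

def swap_out(vgpu, device_slot):
    """Write updated parameters back to host memory and free the device."""
    vgpu.host_copy = dict(device_slot)
    device_slot.clear()
    vgpu.resident = False

def step_all(vgpus, device_slot, update):
    # Only one virtual GPU is active at a time, mirroring the observation
    # above that real multi-GPU runs also process largely sequentially.
    for vgpu in vgpus:
        swap_in(vgpu, device_slot)
        for name in list(device_slot):
            device_slot[name] = update(device_slot[name])
        swap_out(vgpu, device_slot)

# Demo: 4 virtual GPUs, each "updating" its layer slice by incrementing it.
device = {}
vgpus = [VirtualGpu(i, {f"layer{i}": 0}) for i in range(4)]
step_all(vgpus, device, lambda w: w + 1)
print([v.host_copy for v in vgpus])
# → [{'layer0': 1}, {'layer1': 1}, {'layer2': 1}, {'layer3': 1}]
```

In a real implementation the dict copies would be host<->device transfers (e.g. moving layer parameters between CPU and GPU memory), and the per-transfer cost over PCIe is presumably where the feasibility question lies.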

A single high-VRAM GPU plus additional Optane/NVMe/RAM is much easier to come by than multiple high-VRAM cards. Is there anything about neural-style, technically speaking, that would prevent this?
