
Can I run this on Dual GPU Tesla card? #748

Answered by JaroslawHryszko
Mchdms asked this question in Q&A

One of the machines where I run this project is a Dell R720 server with two Tesla K40m cards (12 GB VRAM each). These are old cards, so to get text generation working I had to compile my own build of PyTorch from source with support for CUDA Compute Capability 3.5. After that, I can use both GPUs by running text-gen with the --auto-devices and --gpu-memory parameters.
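For anyone setting up a similar machine, here is a rough sketch of those steps. The `server.py` entry point, the PyTorch repository URL, and the 10 GiB per-card memory caps are assumptions for illustration; the exact build procedure depends on your PyTorch and CUDA toolkit versions.

```sh
# 1. Build PyTorch from source with Kepler (Compute Capability 3.5 / sm_35) support.
#    Requires a CUDA toolkit version that still supports Kepler GPUs.
git clone --recursive https://github.com/pytorch/pytorch
cd pytorch
TORCH_CUDA_ARCH_LIST="3.5" python setup.py install

# 2. Check that both cards are visible and report capability (3, 5).
python -c "import torch; print([torch.cuda.get_device_capability(i) for i in range(torch.cuda.device_count())])"

# 3. Launch with both GPUs; --gpu-memory caps each card (here ~10 of 12 GiB, adjust to taste).
python server.py --auto-devices --gpu-memory 10 10
```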

Answer selected by Mchdms