Can I run this on Dual GPU Tesla card? #748
-
I'm looking to get this webui running on a separate machine that I use for various hobby projects. My main machine's GPU has 16GB of VRAM, but I see that old Tesla K80 cards have 24GB of VRAM. They are dirt cheap online, but they are single-card, dual-GPU designs. Does the webui work with dual-GPU designs like that? It's a dirt-cheap way to get 24GB of VRAM, even if the card is on the absolutely ancient Kepler architecture. But if both GPUs can be utilized to the fullest, I'd imagine it's still miles ahead of CPU speeds. Would a 24GB M40 be better?
-
One of the machines where I run this project is a Dell R720 server with two Tesla K40m cards, 12GB of VRAM each. These are old cards, so to get text-generation working I had to compile my own version of PyTorch from source with support for CUDA Compute Capability 3.5. After that, I can use both GPUs by running text-gen with the --auto-devices and --gpu-memory parameters.
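Roughly, the build and launch looked something like this (a sketch, not my exact commands: the clone URL is PyTorch's official repo, and the per-GPU memory limits are illustrative placeholders you should tune for 12GB cards):

```sh
# Build PyTorch from source targeting Kepler (Compute Capability 3.5).
# TORCH_CUDA_ARCH_LIST is the standard PyTorch build variable for picking
# which GPU architectures to compile kernels for.
git clone --recursive https://github.com/pytorch/pytorch
cd pytorch
TORCH_CUDA_ARCH_LIST="3.5" python setup.py install

# Launch the webui and let it split the model across both GPUs.
# "10 10" is an illustrative per-GPU cap (in GiB) for two 12GB cards,
# leaving headroom for activations.
python server.py --auto-devices --gpu-memory 10 10
```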
-
The Tesla K80 is super old, and it is essentially two cards in one package with 12GB each. This will reduce performance and increase memory usage, even beyond the cost of the old architecture itself.
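As a quick sanity check (a minimal sketch, assuming PyTorch is installed with CUDA support), you can confirm that a K80 is enumerated as two separate 12GB devices rather than one 24GB device:

```sh
# Lists each CUDA device PyTorch can see; a single K80 board prints two entries.
python -c "import torch; [print(i, torch.cuda.get_device_name(i)) for i in range(torch.cuda.device_count())]"
```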