2024-8-28: Make CogVideox-5b run on 6 GB VRAM & Flux on 2 GB VRAM #121
Comments
There are no special versions of Flux; however, Flux Dev is set up to use bitsandbytes quantization (with updated dependencies). I can't remember exactly how much VRAM it needs, but it is less than 12 GB, AFAIR.
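For context, here is a minimal sketch of what loading FLUX.1-dev with 4-bit (NF4) bitsandbytes quantization through diffusers can look like, assuming a diffusers release with bitsandbytes support. This is an illustration of the general approach, not Pallaidium's actual code; model IDs and settings are examples.

```python
# Minimal sketch: FLUX.1-dev with 4-bit NF4 quantization via diffusers + bitsandbytes.
# Not Pallaidium's actual code; IDs and parameters are illustrative.
import torch
from diffusers import BitsAndBytesConfig, FluxPipeline, FluxTransformer2DModel

nf4_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# The transformer is the largest component, so quantizing it saves the most VRAM.
transformer = FluxTransformer2DModel.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    subfolder="transformer",
    quantization_config=nf4_config,
    torch_dtype=torch.bfloat16,
)

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    transformer=transformer,
    torch_dtype=torch.bfloat16,
)
pipe.enable_model_cpu_offload()  # keep idle components in system RAM

image = pipe("a forest at dawn, photorealistic", num_inference_steps=28).images[0]
image.save("flux_dev_nf4.png")
```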
Hi tin2tin, many thanks for your response. I had already tried Flux Dev, but unfortunately Blender crashed immediately. I'll try again, but I don't think I'll get very far with my 8 GB. Best regards
On 8 GB you should be able to run all image models except the FLUX and maybe the PixArt ones.
Yes, I know. "torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 18.00 MiB. GPU …"
Yes, CogVideoX also needs a lot of VRAM. Where is "Make CogVideoX-5b run on 6 GB VRAM & Flux on 2 GB VRAM" from?
From the change log on your start page.
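For reference, the 6 GB figure for CogVideoX-5b is usually reached with sequential CPU offload plus VAE tiling and slicing in diffusers. Below is a hedged sketch of that kind of setup; the model ID and parameters are generic examples, not Pallaidium's exact settings, and the trade-off is speed for peak VRAM.

```python
# Rough sketch: CogVideoX-5b in low-VRAM mode with diffusers.
# Sequential CPU offload streams submodules to the GPU one at a time,
# and the VAE decodes in tiles/slices. Much slower, but peak VRAM drops a lot.
import torch
from diffusers import CogVideoXPipeline
from diffusers.utils import export_to_video

pipe = CogVideoXPipeline.from_pretrained(
    "THUDM/CogVideoX-5b",
    torch_dtype=torch.bfloat16,
)

pipe.enable_sequential_cpu_offload()  # keep weights in RAM, move layer by layer
pipe.vae.enable_tiling()              # decode the video latents in tiles
pipe.vae.enable_slicing()             # and one slice of the batch at a time

video = pipe(
    prompt="a panda playing a guitar in a bamboo forest",
    num_inference_steps=50,
    num_frames=49,
).frames[0]
export_to_video(video, "cogvideox_5b.mp4", fps=8)
```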
The bitsandbytes implementation changed how Flux Dev runs. However, those values were the average VRAM usage; there may be spikes that make it crash for you.
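If you want to check whether such spikes (rather than the average usage) are what crashes an 8 GB card, PyTorch's peak-memory counters show the worst case. A small sketch, assuming a CUDA build of PyTorch and an already created `pipe`:

```python
import torch

torch.cuda.empty_cache()
torch.cuda.reset_peak_memory_stats()

# Run a single generation here, e.g.:
# image = pipe("test prompt", num_inference_steps=20).images[0]

peak_alloc = torch.cuda.max_memory_allocated() / 1024**3
peak_reserved = torch.cuda.max_memory_reserved() / 1024**3
print(f"peak allocated: {peak_alloc:.2f} GiB, peak reserved: {peak_reserved:.2f} GiB")
```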
Too bad, I guess I'll have to wait until a new graphics card is available. |
Hello everyone,
where can I find this 2 GB Flux version?
I only have the Flux Schnell and the Flux 1 Dev versions in my selection, and both require 24 GB.
A bit too much for my GTX 1070 8 GB card.
Best regards
Hans