
2024-8-28: Make CogVideox-5b run on 6 GB VRAM & Flux on 2 GB VRAM #121

Open
hansnolte opened this issue Nov 17, 2024 · 8 comments

Comments

@hansnolte

Hello everyone,

where can I find this 2GB Flux Version?

I only have the Flux Schnell and Flux 1 Dev versions in my selection, and both require 24 GB.
A bit too much for my GTX 1070 8 GB card.

Best regards
Hans

@tin2tin
Owner

tin2tin commented Nov 17, 2024

There are no special versions of Flux; however, Flux Dev is set up to use bitsandbytes quantization (with updated dependencies). I can't remember exactly how much VRAM is needed, but it is less than 12 GB, AFAIR.
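A rough back-of-envelope sketch of why quantization helps, assuming the roughly 12B-parameter transformer size stated on the FLUX.1-dev model card (the real footprint also includes the text encoders, VAE, and activations):

```python
# Approximate VRAM needed just for the transformer weights at different
# precisions. The 12B parameter count is an assumption from the public
# FLUX.1-dev model card, not a measurement.

def weight_vram_gib(n_params: float, bits_per_param: float) -> float:
    """Approximate weight memory in GiB for n_params at a given precision."""
    return n_params * bits_per_param / 8 / 1024**3

FLUX_PARAMS = 12e9  # assumed transformer size

fp16 = weight_vram_gib(FLUX_PARAMS, 16)  # full half precision
nf4 = weight_vram_gib(FLUX_PARAMS, 4)    # bitsandbytes 4-bit (NF4), ignoring
                                         # the small per-block scale overhead

print(f"fp16 weights: ~{fp16:.1f} GiB")  # ~22.4 GiB
print(f"NF4 weights:  ~{nf4:.1f} GiB")   # ~5.6 GiB
```

So 4-bit quantization cuts the weight footprint roughly fourfold, which is why the quantized Flux Dev path can fit on much smaller cards than the stock 24 GB requirement.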

@hansnolte
Author

Hi tin2tin,

Many thanks for your response.

I had already tried Flux Dev, but unfortunately Blender crashed immediately.

I'll try again, but I don't think I'll get very far with my 8 GB.
I wanted to buy a new graphics card next year anyway.

Best regards
Hans

@tin2tin
Owner

tin2tin commented Nov 17, 2024

On 8 GB you should be able to run all image models except the FLUX and maybe PixArt ones.

@hansnolte
Author

On 8 GB you should be able to run all image models except the FLUX and maybe PixArt ones.

Yes, I know.
But what about "Make CogVideox-5b run on 6 GB VRAM & Flux on 2 GB VRAM"?
CogVideox-5b crashes immediately, and Flux 1 Dev gives me a CUDA out-of-memory error:

"torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 18.00 MiB. GPU
GPUFrameBuffer: &framebuffer.fb status GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT
Error: Region could not be drawn!"

@tin2tin
Owner

tin2tin commented Nov 17, 2024

Yes, CogVideoX also needs a lot of VRAM.

Where is "Make CogVideox-5b run on 6 GB VRAM & Flux on 2 GB VRAM" from?

@hansnolte
Author

From the change log on your start page:
https://github.com/tin2tin/Pallaidium
2024-8-28: Make CogVideox-5b run on 6 GB VRAM & Flux on 2 GB VRAM

@tin2tin
Owner

tin2tin commented Nov 17, 2024

The bitsandbytes implementation changed how Flux Dev runs. However, those values were the average VRAM usage; there may be spikes, which could be what makes it crash for you.
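To illustrate the point about averages vs. spikes: a card can have enough VRAM for the average usage during generation and still OOM on a short transient peak (e.g. during VAE decode). A toy trace with made-up, illustrative numbers:

```python
# Why an "average VRAM" figure can still OOM on an 8 GB card.
# The per-step numbers below are hypothetical, not measurements.

trace_gb = [5.8, 6.1, 6.0, 9.2, 6.2, 5.9]  # 9.2 = a transient spike
budget_gb = 8  # e.g. a GTX 1070

avg = sum(trace_gb) / len(trace_gb)
peak = max(trace_gb)

print(f"average: {avg:.1f} GB, peak: {peak:.1f} GB")
print("fits on average:", avg <= budget_gb)   # True
print("fits at peak:   ", peak <= budget_gb)  # False -> OOM
```

In practice, diffusers offers memory-saving options such as CPU offload and VAE tiling that can shave exactly these peaks, at the cost of speed.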

@hansnolte
Author

Too bad, I guess I'll have to wait until a new graphics card is available.
