
[Feature] Adopt FluxPipeline to support Flux models #114

Open
homoluden opened this issue Aug 9, 2024 · 4 comments

Comments

@homoluden

There is a new promising model.
https://huggingface.co/docs/diffusers/main/en/api/pipelines/flux

It would be good to take this model's pipeline from the diffusers package and, if possible, add the canvas and the attention/transformer additions from StableDiffusionXLOmostPipeline.

Also, note that there is a reduced T5 text encoder available to save some VRAM:
https://huggingface.co/comfyanonymous/flux_text_encoders/tree/main
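A rough sketch of what the request could look like in code, assuming the standard diffusers FluxPipeline API. The base checkpoint id is an assumption, and swapping in the reduced T5 encoder from the link above would need extra conversion steps not shown here; this is not a tested implementation.

```python
# Hypothetical sketch: build a FluxPipeline with VRAM-saving settings.
# Model id is an assumption; nothing here is Omost's actual code.

def load_flux_pipeline():
    """Create a FluxPipeline and enable CPU offload to reduce peak VRAM."""
    import torch
    from diffusers import FluxPipeline

    pipe = FluxPipeline.from_pretrained(
        "black-forest-labs/FLUX.1-dev",  # assumed base checkpoint
        torch_dtype=torch.bfloat16,
    )
    # Moves submodules to CPU between denoising steps; slower but lighter.
    pipe.enable_model_cpu_offload()
    return pipe
```

Usage would be roughly `pipe = load_flux_pipeline()` followed by `pipe("a prompt", num_inference_steps=28).images[0]`; the Omost-specific canvas and attention logic would still have to be ported on top of this.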

@Nears-115

If the goal is to save VRAM anyway, wouldn't it be better to use a model like this one, quantized with bitsandbytes?

https://huggingface.co/lllyasviel/flux1-dev-bnb-nf4/blob/main/flux1-dev-bnb-nf4-v2.safetensors
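For illustration, a hedged sketch of loading a Flux transformer in 4-bit NF4 via diffusers' bitsandbytes integration. The config values are assumptions, and whether `from_single_file` accepts a `quantization_config` for this checkpoint may depend on the diffusers version; treat this as a starting point, not a verified recipe.

```python
# Hypothetical sketch: 4-bit NF4 loading of a single-file Flux transformer.
# Requires diffusers with bitsandbytes support; paths/values are assumptions.

def load_nf4_transformer(ckpt_path):
    """Load a single-file Flux transformer with 4-bit NF4 quantization."""
    import torch
    from diffusers import BitsAndBytesConfig, FluxTransformer2DModel

    quant_config = BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_quant_type="nf4",             # normal-float 4-bit weights
        bnb_4bit_compute_dtype=torch.bfloat16,  # compute in bf16
    )
    return FluxTransformer2DModel.from_single_file(
        ckpt_path,
        quantization_config=quant_config,
        torch_dtype=torch.bfloat16,
    )
```

The returned transformer could then be passed as the `transformer` argument when constructing a FluxPipeline, keeping the rest of the pipeline in bf16.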

@Lustwaffel

Jesus, just check the response rate. Isn't it obvious that this is a dead horse?

@Lexxxco

Lexxxco commented Oct 6, 2024

Omost may be a frozen project, but it is a perfect fit for Flux, given how well Flux handles the complex prompts Omost generates. @lllyasviel thanks for your amazing work; it would be great to see something like Omost with Flux.

@antonioo-c

antonioo-c commented Nov 5, 2024

Check out https://github.com/instantX-research/Regional-Prompting-FLUX. We have implemented Regional Prompting for FLUX, and it should be usable together with Omost!

Development

No branches or pull requests

5 participants