
CUDA out of memory error is not fixed #7398

Answered by asomoza
alexd725 asked this question in Q&A

To answer your questions:

Large batch sizes

At inference this happens when you use num_images_per_prompt, which is not your case.

Large model architecture

Not your case because the model is just 3.2 GB.

Accumulating intermediate gradients

This doesn't apply to inference; it happens when you train a model.

GPU memory leaks
Not freeing up memory

These two are probably your case. You only provide a class, not how you're using it; you also don't provide the images, the prompts, or what is in effects.
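The "not freeing up memory" case usually means Python references to past outputs are kept alive, so the allocator can never reclaim them. A minimal sketch of the pattern (pure Python standing in for GPU tensors; all names here are hypothetical, not from the question's code):

```python
import gc

class FakeTensor:
    """Stand-in for a large GPU tensor; counts live instances."""
    live = 0
    def __init__(self):
        FakeTensor.live += 1
    def __del__(self):
        FakeTensor.live -= 1

def generate():
    """Hypothetical stand-in for one pipeline call."""
    return FakeTensor()

# Leaky pattern: every output stays referenced, so memory grows each loop.
history = []
for _ in range(5):
    history.append(generate())
assert FakeTensor.live == 5  # all five "tensors" still resident

# Fixed pattern: drop the reference each iteration so memory is reclaimed.
history.clear()
for _ in range(5):
    out = generate()
    # ... use `out` here (save the image to disk, etc.) ...
    del out          # release the reference immediately
gc.collect()         # with torch you'd typically also call torch.cuda.empty_cache()
assert FakeTensor.live == 0  # nothing retained between iterations
```

The same principle applies to real pipelines: save or process each result inside the loop instead of appending raw outputs to a list that outlives it.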

I'm on Windows right now, so I can't use torch.compile, but if I use just the generation code and run it in a loop:

import torch

from diffusers import EulerAncestralDiscreteScheduler, StableDiff…
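To confirm whether a loop like that actually leaks, you can compare retained memory across iterations. A hedged sketch using the stdlib tracemalloc for host memory (for GPU memory the analogous check would be torch.cuda.memory_allocated(), not shown here; run_once is a hypothetical stand-in for one generation step):

```python
import tracemalloc

def run_once(buffers):
    """Hypothetical stand-in for one generation step.
    Appending to `buffers` simulates holding a reference (a leak)."""
    data = bytearray(1_000_000)  # ~1 MB allocation per "generation"
    buffers.append(data)

tracemalloc.start()
retained = []
baseline = tracemalloc.get_traced_memory()[0]
for _ in range(5):
    run_once(retained)
grown = tracemalloc.get_traced_memory()[0] - baseline
tracemalloc.stop()

# Memory growing linearly with iterations means something holds references.
assert grown > 4_000_000  # ~5 MB retained across 5 iterations
```

If the measured growth stays flat from one iteration to the next, the loop itself is fine and the OOM comes from somewhere else (model size, resolution, or attention memory at inference time).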

Replies: 3 comments 3 replies

Answer selected by alexd725
Category: Q&A
Labels: none yet
4 participants