
What is this problem? #6

Closed
TAYLENHE opened this issue Nov 6, 2024 · 5 comments
Comments

@TAYLENHE

TAYLENHE commented Nov 6, 2024

Phi3Transformer does not support an attention implementation through torch.nn.functional.scaled_dot_product_attention yet. Please request the support for this architecture: huggingface/transformers#28005. If you believe this error is a bug, please open an issue in Transformers GitHub repository and load your model with the argument attn_implementation="eager" meanwhile. Example: model = AutoModel.from_pretrained("openai/whisper-tiny", attn_implementation="eager")

What is this problem?

@MaisonMeta

I have the same issue on both of the available OmniGen nodes. Not sure how to fix it?

@1038lab
Owner

1038lab commented Nov 6, 2024

What version of Python, PyTorch and CUDA are you using?

@MaisonMeta

I found the fix:

1. Go to \ComfyUI_windows_portable\ComfyUI\models\LLM\OmniGen-v1
2. Open config.json
3. Scroll to the very bottom
4. Change this:
   "_attn_implementation": "sdpa"
   to this:
   "_attn_implementation": "eager"

Worked for me, hope it helps.
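The same manual edit can also be scripted. A minimal sketch in Python, assuming the config.json layout described above; the `patch_config` helper name and the commented path are illustrative, not part of the node itself:

```python
import json
from pathlib import Path

def patch_config(config_path):
    """Switch a model config from SDPA to eager attention.

    Reads the JSON config, sets "_attn_implementation" to "eager",
    writes it back, and returns whatever value was there before.
    """
    path = Path(config_path)
    config = json.loads(path.read_text(encoding="utf-8"))
    previous = config.get("_attn_implementation")
    config["_attn_implementation"] = "eager"
    path.write_text(json.dumps(config, indent=2), encoding="utf-8")
    return previous

# Example call (path is hypothetical, adjust to your ComfyUI install):
# patch_config(r"ComfyUI\models\LLM\OmniGen-v1\config.json")
```

Writing the file back with `json.dumps` keeps the config valid JSON, which a hand edit in a text editor can accidentally break (e.g. a stray trailing comma).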

@1038lab
Owner

1038lab commented Nov 6, 2024

I've updated the custom node to support the eager attention implementation, so simply updating the custom node in ComfyUI-Manager will resolve the issue.

This issue occurred due to using older versions of Python, PyTorch, or CUDA that don’t support the newer scaled_dot_product_attention (SDPA). With the eager implementation now in place, your setup should work without further issues.

For optimal performance and to fully benefit from SDPA, I recommend updating your Python, PyTorch, and CUDA versions when possible.

@1038lab 1038lab closed this as completed Nov 6, 2024
@MaisonMeta

Thank you! Will update all.
What versions should they be?
