ValueError: Phi3Transformer does not support an attention implementation through torch.nn.functional.scaled_dot_product_attention yet. #23
Comments
Same here.
Updated. Try the new version; hopefully it fixes your issues.
Hi, thank you for the update. I deleted the previous OmniGen folder, re-ran the git clone, and reinstalled the requirements, but I still get the error: ComfyUI Error Report (Error Details)
Stack Trace
System Information
Devices
Run ComfyUI\update\update_comfyui_and_python_dependencies.bat
Thank you for your help. How can I update the Python dependencies for a direct (non-portable) installation?
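For a direct (git-cloned) install, the portable .bat script roughly corresponds to upgrading ComfyUI's requirements with pip from the repository root. A minimal sketch, assuming a standard clone layout and that you run it inside the same Python environment ComfyUI uses:

```shell
# From the root of a direct (git-cloned) ComfyUI install:
cd ComfyUI

# Upgrade pip itself, then ComfyUI's Python dependencies.
python -m pip install --upgrade pip
python -m pip install --upgrade -r requirements.txt
```

If you use a virtual environment or conda, activate it first so the packages land in the environment ComfyUI actually runs with.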
I am using Google Colab and there is no update folder or update.bat file. I did a git pull and the repo is up to date; I also deleted OmniGen and cloned it again, but I am still getting the error. Any help would be appreciated.
@gunia10 It's in the root directory. There's a good chance you are looking inside ComfyUI\ComfyUI\, but it's ComfyUI\update\.
@theonetwoone I checked, but it's not there. Perhaps only the portable version has an update folder.
Find config.json in the 'models' folder. That worked for me; hope it helps.
The latest update should adjust this automatically.
This fixed my issue on Google Colab, thanks a lot!
Do you mean the comfyui/models/ folder? I can't find a config.json file there.
Yeah, it's in the models/LLM/OmniGen folder.
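For anyone editing that file by hand: the fix discussed here amounts to forcing eager attention in config.json. A minimal sketch, assuming the path below (adjust for your install) and that transformers reads the `_attn_implementation` key from the model config:

```python
import json
from pathlib import Path

# Assumed location based on this thread; adjust to your ComfyUI install.
CONFIG_PATH = Path("ComfyUI/models/LLM/OmniGen/config.json")

def force_eager_attention(config: dict) -> dict:
    """Return a copy of the model config with eager attention forced,
    so transformers stops routing Phi3Transformer through SDPA."""
    patched = dict(config)
    patched["_attn_implementation"] = "eager"
    return patched

if CONFIG_PATH.exists():
    cfg = json.loads(CONFIG_PATH.read_text())
    CONFIG_PATH.write_text(json.dumps(force_eager_attention(cfg), indent=2))
    print("patched", CONFIG_PATH)
```

Back up config.json before editing, since a model update may overwrite or conflict with manual changes.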
Thank you, the Phi3Transformer error is now gone. However, when I run the Queue, another error message comes out: ComfyUI Error Report (Error Details)
Stack Trace
System Information
Devices
This solution worked for me, using the portable ComfyUI. Thanks a lot!
It's fixed now. For some reason the workflow that comes with the sample gives the error above. What I did was create a new node to replace the existing one, and now it's working. Thanks a lot for all the help.
I had an issue when running the Queue from ComfyUI. It shows:
ValueError: Phi3Transformer does not support an attention implementation through torch.nn.functional.scaled_dot_product_attention yet. Please request the support for this architecture: huggingface/transformers#28005. If you believe this error is a bug, please open an issue in Transformers GitHub repository and load your model with the argument attn_implementation="eager" meanwhile. Example: model = AutoModel.from_pretrained("openai/whisper-tiny", attn_implementation="eager")