
ValueError: Phi3Transformer does not support an attention implementation through torch.nn.functional.scaled_dot_product_attention yet. #23

Open
gunia10 opened this issue Nov 11, 2024 · 16 comments

Comments

gunia10 commented Nov 11, 2024

I hit an issue when queuing a prompt from ComfyUI. It shows:

ValueError: Phi3Transformer does not support an attention implementation through torch.nn.functional.scaled_dot_product_attention yet. Please request the support for this architecture: huggingface/transformers#28005. If you believe this error is a bug, please open an issue in Transformers GitHub repository and load your model with the argument attn_implementation="eager" meanwhile. Example: model = AutoModel.from_pretrained("openai/whisper-tiny", attn_implementation="eager")

plaidam commented Nov 11, 2024

same.

1038lab (Owner) commented Nov 11, 2024

Updated. Try the new version; hopefully it fixes the issue for everyone.

gunia10 (Author) commented Nov 12, 2024

Hi, thank you for the update. I deleted the previous OmniGen folder, re-ran the git clone, and re-ran the requirements install, but I still get the error:

ComfyUI Error Report

Error Details

Stack Trace

  File "D:\AI\ComfyUI\execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)

  File "D:\AI\ComfyUI\execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)

  File "D:\AI\ComfyUI\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)

  File "D:\AI\ComfyUI\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))

  File "D:\AI\ComfyUI\custom_nodes\ComfyUI-OmniGen\ailab_OmniGen.py", line 387, in generation
    raise e

  File "D:\AI\ComfyUI\custom_nodes\ComfyUI-OmniGen\ailab_OmniGen.py", line 340, in generation
    pipe = self._get_pipeline(model_precision, keep_in_vram)

  File "D:\AI\ComfyUI\custom_nodes\ComfyUI-OmniGen\ailab_OmniGen.py", line 302, in _get_pipeline
    raise RuntimeError(f"Failed to create pipeline: {str(e)}")

System Information

  • ComfyUI Version: v0.2.7-16-g2d28b0b
  • Arguments: main.py --windows-standalone-build
  • OS: nt
  • Python Version: 3.10.11 (tags/v3.10.11:7d4cc5a, Apr 5 2023, 00:38:17) [MSC v.1929 64 bit (AMD64)]
  • Embedded Python: false
  • PyTorch Version: 2.6.0.dev20241031+cu124

Devices

  • Name: cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
    • Type: cuda
    • VRAM Total: 25756696576
    • VRAM Free: 24110956544
    • Torch VRAM Total: 0
    • Torch VRAM Free: 0

theonetwoone commented:

ComfyUI\update\update_comfyui_and_python_dependencies.bat

gunia10 (Author) commented Nov 12, 2024

ComfyUI\update\update_comfyui_and_python_dependencies.bat

Thank you for your help.
Is this for the portable ComfyUI? I cannot find ComfyUI\update\update_comfyui_and_python_dependencies.bat.

How can I update the Python dependencies for a direct installation?

kasparovabi commented:

I am using Google Colab and there is no update folder or update .bat file. I did a git pull and it is actually up to date; I also deleted OmniGen and git cloned it again, but I am still getting the error. Any help would be appreciated.

theonetwoone commented:

@gunia10 it's in the root directory. There's a good chance you are looking inside ComfyUI\ComfyUI\, but it's just ComfyUI\update\.

gunia10 (Author) commented Nov 12, 2024

@gunia10 it's in the root directory. There's a good chance you are looking inside ComfyUI\ComfyUI\, but it's just ComfyUI\update\.

@theonetwoone I checked, but it's not there; perhaps only the portable version has the update folder.

cdmusic2019 commented:

Find config.json in the 'models' folder and change this:
"_attn_implementation": "sdpa"
to this:
"_attn_implementation": "eager"

Worked for me, hope it helps.
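If you would rather script that edit than change the file by hand (e.g. on Colab), a minimal sketch is below; the path in the usage comment is only an example taken from later in this thread, so point it at wherever your OmniGen config.json actually lives:

```python
import json
from pathlib import Path

def patch_attn_implementation(config_path: str) -> bool:
    """Switch "_attn_implementation" from "sdpa" to "eager" in a model
    config.json. Returns True if the file was changed."""
    path = Path(config_path)
    config = json.loads(path.read_text(encoding="utf-8"))
    if config.get("_attn_implementation") != "sdpa":
        return False  # already "eager" (or key absent): nothing to do
    config["_attn_implementation"] = "eager"
    path.write_text(json.dumps(config, indent=2), encoding="utf-8")
    return True

# Example path; adjust to your install:
# patch_attn_implementation("ComfyUI/models/LLM/OmniGen/config.json")
```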

1038lab (Owner) commented Nov 12, 2024

Find config.json in the 'models' folder. change this: "_attn_implementation": "sdpa" to this: "_attn_implementation": "eager"

worked for me hope it helps

The latest update should adjust this automatically: if your setup does not support "sdpa", it will run with 'eager' automatically.
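For reference, the fallback described above can be sketched as a try/except around the loader. `load_model` here is a hypothetical stand-in for the node's actual pipeline loader, not its real API:

```python
def load_with_attn_fallback(load_model, model_path: str):
    """Try the faster "sdpa" attention first; if the architecture rejects
    it with a ValueError (as Phi3Transformer does in this thread), retry
    with "eager". `load_model` is a hypothetical loader callable."""
    try:
        return load_model(model_path, attn_implementation="sdpa")
    except ValueError:
        # e.g. "Phi3Transformer does not support an attention implementation
        # through torch.nn.functional.scaled_dot_product_attention yet."
        return load_model(model_path, attn_implementation="eager")
```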

kasparovabi commented:

Find config.json in the 'models' folder. change this: "_attn_implementation": "sdpa" to this: "_attn_implementation": "eager"

worked for me hope it helps

This fixed my issue on Google Colab, thanks a lot!

gunia10 (Author) commented Nov 13, 2024

Find config.json in the 'models' folder. change this: "_attn_implementation": "sdpa" to this: "_attn_implementation": "eager"

worked for me hope it helps

Do you mean the comfyui/models/ folder? I can't find a config.json file there.

kasparovabi commented:

Find config.json in the 'models' folder. change this: "_attn_implementation": "sdpa" to this: "_attn_implementation": "eager"
worked for me hope it helps

Do you mean the comfyui/models/ folder? I can't find a config.json file there.

Yeah, it is in the models/LLM/OmniGen folder.

gunia10 (Author) commented Nov 13, 2024

Find config.json in the 'models' folder. change this: "_attn_implementation": "sdpa" to this: "_attn_implementation": "eager"
worked for me hope it helps

Do you mean the comfyui/models/ folder? I can't find a config.json file there.

Yeah, it is in the models/LLM/OmniGen folder.

Thank you, the Phi3Transformer error is gone now.

However, when I run the queue, another error message comes up:

ComfyUI Error Report

Error Details

  • Node Type: ailab_OmniGen
  • Exception Type: TypeError
  • Exception Message: object of type 'NoneType' has no len()

Stack Trace

  File "D:\AI\ComfyUI\execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)

  File "D:\AI\ComfyUI\execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)

  File "D:\AI\ComfyUI\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)

  File "D:\AI\ComfyUI\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))

  File "D:\AI\ComfyUI\custom_nodes\ComfyUI-OmniGen\ailab_OmniGen.py", line 387, in generation
    raise e

  File "D:\AI\ComfyUI\custom_nodes\ComfyUI-OmniGen\ailab_OmniGen.py", line 353, in generation
    output = pipe(

  File "D:\AI\ComfyUI\venv\lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)

  File "D:\AI\ComfyUI\custom_nodes\ComfyUI-OmniGen\OmniGen\pipeline.py", line 200, in __call__
    assert isinstance(prompt, str) and len(input_images) == 1, "if you want to make sure the output image have the same size as the input image, please only input one image instead of multiple input images"

System Information

  • ComfyUI Version: v0.2.7-16-g2d28b0b
  • Arguments: main.py --windows-standalone-build
  • OS: nt
  • Python Version: 3.10.11 (tags/v3.10.11:7d4cc5a, Apr 5 2023, 00:38:17) [MSC v.1929 64 bit (AMD64)]
  • Embedded Python: false
  • PyTorch Version: 2.6.0.dev20241031+cu124

Devices

  • Name: cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
    • Type: cuda
    • VRAM Total: 25756696576
    • VRAM Free: 8715658276
    • Torch VRAM Total: 15837691904
    • Torch VRAM Free: 454976548
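The TypeError above comes from the assertion in pipeline.py calling len() on input_images when no image is connected. A defensive check along these lines (names assumed from the stack trace, not taken from the node's real code) would turn it into a readable error:

```python
def check_single_input_image(prompt, input_images) -> None:
    """Guard sketch for the failing assertion: when the output should keep
    the input image's size, exactly one input image must be present.
    Handling None explicitly avoids "object of type 'NoneType' has no len()"."""
    if not input_images:  # covers both None and an empty list
        raise ValueError("No input image connected, but one is required "
                         "to match the output size to the input image.")
    if not (isinstance(prompt, str) and len(input_images) == 1):
        raise ValueError("Please pass a single prompt string and exactly "
                         "one input image, not multiple images.")
```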

letscreate321 commented:

ComfyUI\update\update_comfyui_and_python_dependencies.bat

This solution worked for me, using the portable ComfyUI. Thanks a lot!

gunia10 (Author) commented Nov 13, 2024

  • object of type 'NoneType' has no len()

It's fixed now. For some reason the workflow that came with the sample gives the error above.

What I did was create a new node to replace the existing one. Now it's working.

Thanks a lot for all the help.

7 participants