
Had to install LLM/OmniGen-v1 files manually, then encountered runtime error in ComfyUI-OmniGen #24

kaptaind101 opened this issue Nov 12, 2024 · 3 comments

Comments

@kaptaind101

I have the NVIDIA version of ComfyUI installed on a Windows 11 PC. I loaded the ComfyUI-OmniGen node into ComfyUI via the ComfyUI Manager. That seemed to work OK. Then, as per the instructions on the https://github.com/1038lab/ComfyUI-OmniGen page, I installed the required Python package. That seemed to work OK, too. However, the model that is supposed to auto-download into the models/LLM/OmniGen-v1 folder didn't download. I waited several minutes and nothing happened. So, I created the "OmniGen-v1" folder under the LLM folder manually, then manually downloaded the model and other files from https://huggingface.co/Shitao/OmniGen-v1/tree/main and placed them in the OmniGen-v1 folder.
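When populating the folder by hand, a quick script can confirm that nothing was missed. This is just an illustrative sketch: the file list below is an assumption based on what the Hugging Face repo page shows, not an authoritative manifest, so adjust it to match the actual repo contents.

```python
from pathlib import Path

# Assumed file list for models/LLM/OmniGen-v1, based on the files visible at
# https://huggingface.co/Shitao/OmniGen-v1/tree/main -- verify against the repo.
EXPECTED_FILES = [
    "config.json",
    "model.safetensors",
    "special_tokens_map.json",
    "tokenizer.json",
    "tokenizer_config.json",
]

def missing_files(model_dir):
    """Return the expected files that are absent from model_dir."""
    d = Path(model_dir)
    return [name for name in EXPECTED_FILES if not (d / name).exists()]

if __name__ == "__main__":
    missing = missing_files("models/LLM/OmniGen-v1")
    if missing:
        print("Missing files:", ", ".join(missing))
    else:
        print("All expected files present.")
```

If anything is reported missing, re-download just that file from the Hugging Face page rather than repeating the whole install.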

After all this, I went to ComfyUI and loaded the "omnigen_t2i_i2i.json" workflow that is in the custom_nodes/ComfyUI-OmniGen folder. The workflow opened OK, but when I did "Queue Prompt" on it, I got several runtime errors. Just to be sure I hadn't missed something during the installation, I uninstalled and re-installed the ComfyUI-OmniGen node and files a couple of times. Every time I try to execute the workflow, I get these same errors, so at least my problem is consistent.

I can't find any documentation on this problem, so I am posting it here as an Issue. I hope you can help me solve it. I have attached a screen shot of the contents of my models/LLM/OmniGen-v1 folder and the run time error messages that I am seeing in ComfyUI.

[Screenshot attached: Screenshot 2024-11-11 192005]

@AI421

AI421 commented Nov 13, 2024

For a more effective response, share the error message in text format instead of a screenshot. This will make it easier for others to assist you.

@kaptaind101
Author

Thanks. Here are the error messages in text format:

Prompt outputs failed validation
ailab_OmniGen:

  • Value 0 smaller than min of 1: num_inference_steps
  • Value not in list: model_precision: '1.8' not in ['Auto', 'FP16', 'FP8']
  • Value 0.0 smaller than min of 1.0: img_guidance_scale
  • Value not in list: memory_management: '50' not in ['Balanced', 'Speed Priority', 'Memory Priority']
  • Failed to convert an input value to a INT value: height, fixed, invalid literal for int() with base 10: 'fixed'

ailab_OmniGen:
  • Value 0 smaller than min of 1: num_inference_steps
  • Value not in list: model_precision: '1.8' not in ['Auto', 'FP16', 'FP8']
  • Value not in list: preset_prompt: 'the girl in image_1 sitting on rock on top of the mountain.' not in (list of length 31)
  • Value 0.0 smaller than min of 1.0: img_guidance_scale
  • Value not in list: memory_management: '50' not in ['Balanced', 'Speed Priority', 'Memory Priority']
  • Failed to convert an input value to a INT value: height, fixed, invalid literal for int() with base 10: 'fixed'

@bbertram99

Same error here as well...
