[Bug]: Stable Lora args missing in API calls #208
Comments
Hi. I'm unable to resolve this at the moment, but I think I can point you in the right direction (I haven't looked at it in full). I don't know if it hooks into the current UI process through Gradio, or strictly does a POST directly to FastAPI. I had a look in the API code, and it seems that the instance and/or args aren't passed to the API endpoint. You would have to do an import something along the lines of:

```python
from stable_lora.scripts.lora_webui import StableLoraScriptInstance

StableLoraScript = StableLoraScriptInstance
```

Here is how the script process works:

```python
# You would use the default args (as a list) + stable_lora_ui.
stable_lora_ui = StableLoraScript.ui()
stable_lora_processor = StableLoraScriptInstance

# extra_args = args + stable_lora_ui
stable_lora_args = stable_lora_processor.process_extension_args(all_args=extra_args)

# pipe = ModelScope model (pipe.unet, pipe.text_encoder, etc.)
stable_lora_processor.process(pipe, *stable_lora_args)
```

The script relies on Gradio to process the args, which is why I can't give an assured fix for the API at the moment without looking into it.
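The flow above can be sketched end-to-end with a stub standing in for the real `StableLoraScriptInstance`. The stub's default values and the tail-slicing in `process_extension_args` are assumptions for illustration, not the extension's actual behavior:

```python
# Stub standing in for the extension's StableLoraScriptInstance; the
# defaults and the tail-slicing below are illustrative assumptions.
class StubStableLoraScript:
    def ui(self):
        # The real ui() builds Gradio components; here we return
        # placeholder defaults for the eight Stable LoRA args.
        return [None, 0.0, False, False, False, False, False, False]

    def process_extension_args(self, all_args):
        # Assume the extension's args sit at the tail of the full list.
        return all_args[-8:]

    def process(self, pipe, lora_files_selection, lora_alpha, use_bias,
                use_linear, use_conv, use_emb, use_multiplier, use_time):
        # The real process() patches pipe.unet, pipe.text_encoder, etc.
        return (lora_files_selection, lora_alpha)

stable_lora_processor = StubStableLoraScript()
args = ["a prompt", 24]                         # hypothetical default args
extra_args = args + stable_lora_processor.ui()  # args + stable_lora_ui
stable_lora_args = stable_lora_processor.process_extension_args(all_args=extra_args)
result = stable_lora_processor.process(object(), *stable_lora_args)
print(result)  # (None, 0.0)
```

The key point the maintainer makes is that the UI path gets `stable_lora_ui` from Gradio for free, while the API path has to assemble that list itself before `process()` is called.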
I will try looking into that this week. Thank you. I want to start testing the Stable Lora stuff, but will need to find a fix for the API first.
I have the same issue while sending a request via API. I'm also looking into it. I'm running the webui on an A40. Here are my logs:
Any update on this? I'm unable to resolve the issue.
We are also looking for a fix for this issue. It was working for us earlier, so I tried using an older version of the extension. That didn't work even on the UI side.
Hey guys, I encountered this bug, which prevented me from running the extension, and I found a temporary hack to get around it. I commented out lines 68-70 in process_modelscope.py, so it looks like this:

```python
#TODO Wrap this in a list so that we can process this for future extensions.
```

As far as I can tell this disables the processing of the LoRA model, which isn't ideal, but it seems to allow the extension to continue working. Obviously it would be great to keep using the LoRA module, but this works for the time being. Hopefully this helps someone.
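A gentler variant of this workaround would be to guard the call rather than delete it, so the UI path keeps LoRA processing while an API call that never populated the args skips it. This is a hedged sketch: `stable_lora_args` stands in for whatever `process_extension_args` returns, and the length check is an assumption about how the two paths differ:

```python
# Hedged sketch of a guarded workaround: skip LoRA processing only when
# the eight Stable LoRA args were never populated (as on the API path).
stable_lora_args = []  # empty when the request arrives via the API

if len(stable_lora_args) == 8:
    # stable_lora_processor.process(pipe, *stable_lora_args)  # real call
    lora_applied = True
else:
    lora_applied = False  # skip LoRA processing, as the workaround does

print(lora_applied)  # False
```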
Is there an existing issue for this?
Are you using the latest version of the extension?
What happened?
It seems like the new Stable Lora args cause errors in API calls when render.py calls process_modelscope. I just tested on a fully fresh install of both a1111 and the extension, and I get the same error using the API as I normally do, and with the example call in the local docs.
At first, I thought it could be fixed similarly to #192, in scripts/t2v_helpers/render.py, by separating out the lora args. But after a few attempted fixes, I am unsure how to separate out the correct args, or where to set defaults for them, so that the API works as it did before. There is a very good chance I am missing something obvious; I will look at this again in the morning.
It is working as expected in the webui.
Error message on the API call:

```
Traceback (most recent call last):
  File "F:\SD-Master\7.5\stable-diffusion-webui/extensions/sd-webui-text2video/scripts\t2v_helpers\render.py", line 30, in run
    vids_pack = process_modelscope(args_dict, args)
  File "F:\SD-Master\7.5\stable-diffusion-webui/extensions/sd-webui-text2video/scripts\modelscope\process_modelscope.py", line 70, in process_modelscope
    stable_lora_processor.process(pipe, *stable_lora_args)
  File "F:\SD-Master\7.5\stable-diffusion-webui\venv\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
TypeError: StableLoraScript.process() missing 8 required positional arguments: 'lora_files_selection', 'lora_alpha', 'use_bias', 'use_linear', 'use_conv', 'use_emb', 'use_multiplier', and 'use_time'
Exception occurred: StableLoraScript.process() missing 8 required positional arguments: 'lora_files_selection', 'lora_alpha', 'use_bias', 'use_linear', 'use_conv', 'use_emb', 'use_multiplier', and 'use_time'
```
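The TypeError in the traceback is the standard Python error for unpacking too few positional arguments. A minimal reproduction, using a dummy `process` with the same signature and an empty args list (the empty list is an assumption about what the API path effectively supplies, not the extension's actual code):

```python
# Minimal reproduction of the failure mode: unpacking an empty list into
# a function that requires eight positional args raises the same
# TypeError seen in the logs.
def process(pipe, lora_files_selection, lora_alpha, use_bias, use_linear,
            use_conv, use_emb, use_multiplier, use_time):
    return "ok"

stable_lora_args = []  # assumed: the API call never populates these

try:
    process(object(), *stable_lora_args)
    outcome = "no error"
except TypeError as exc:
    outcome = type(exc).__name__

print(outcome)  # TypeError
```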
Steps to reproduce the problem
What should have happened?
API working as expected.
WebUI and Deforum extension Commit IDs
webui commit id - f865d3e11647dfd6c7b2cdf90dde24680e58acd8
txt2vid commit id - 8f0af8c
Torch version
torch: 2.0.1+cu118
xformers: 0.0.20
python: 3.10.9
What GPU were you using for launching?
4090
On which platform are you launching the webui backend with the extension?
Local PC setup (Windows)
Settings
```python
{'prompt': 'A vibrant cityscape drawn with pastels, flowing paint transforming the scene', 'n_prompt': '(cutscene), (low quality, VHS, artifacts), iphone, tex', 'sampler': 'UniPC', 'model': 'zs2_448w', 'steps': 24, 'frames': 60, 'seed': 2196745325, 'cfg_scale': 14, 'width': 448, 'height': 256, 'eta': 0.0, 'fps': 24, 'batch_count': 1, 'do_vid2vid': False, 'strength': None, 'vid2vid_startFrame': None, 'inpainting_frames': None, 'inpainting_weights': None, 'add_soundtrack': None}
```

The same error also occurs with:

```python
{'prompt': 'A vibrant cityscape drawn with pastels, flowing paint transforming the scene', 'n_prompt': '(cutscene), (low quality, VHS, artifacts), iphone, tex', 'sampler': 'UniPC', 'model': 'zs2_448w', 'steps': 24, 'frames': 60, 'seed': 3579263276, 'cfg_scale': 14, 'width': 448, 'height': 256, 'eta': 0.0, 'fps': 24, 'batch_count': 1, 'do_vid2vid': False, 'strength': None, 'vid2vid_startFrame': None, 'inpainting_frames': None, 'inpainting_weights': None, 'add_soundtrack': None, 'lora_files_selection': None, 'lora_alpha': 0.0, 'use_bias': False, 'use_linear': False, 'use_conv': False, 'use_emb': False, 'use_multiplier': False, 'use_time': False}
```
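Notably, the second request includes all eight LoRA keys yet still fails, which suggests the endpoint never turns those payload keys into the positional arguments `process()` expects. One hypothetical way a handler could bridge that gap is sketched below; only the key names come from the error message, and the extraction logic itself is invented:

```python
# Hypothetical extraction of the eight Stable LoRA args from an API
# payload, in the order the error message lists them. The fallback to
# None for missing keys is an illustrative choice, not the extension's.
LORA_KEYS = ["lora_files_selection", "lora_alpha", "use_bias", "use_linear",
             "use_conv", "use_emb", "use_multiplier", "use_time"]

payload = {"prompt": "A vibrant cityscape", "lora_alpha": 0.0,
           "use_bias": False}  # abbreviated request body

# Missing keys fall back to None so the positional call still gets 8 args.
stable_lora_args = [payload.get(key) for key in LORA_KEYS]
print(len(stable_lora_args))  # 8
```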
Console logs
Additional information
I did attempt to fill in these args with a few values on my side to see if I could get the error to go away, but nothing I tried worked or changed the error. I was alerted to this by someone else also using the API, so I know it's not something with my setup.