Forge support #342

Open
Natans8 opened this issue Aug 18, 2024 · 27 comments

Comments

@Natans8

Natans8 commented Aug 18, 2024

In the recent WebUI Forge updates, Regional Prompter fails to even load, since there were many fundamental changes to both the backend and the frontend, including the switch to Gradio 4. Currently, Regional Prompter is the last extension holding me back from switching completely to Forge, since Forge seems better optimized overall and supports features like Flux that A1111 does not.

It would be great to adapt, if not to Flux, then at least to keep the existing SD1.5/SDXL capabilities working in Forge. Thank you in advance!

Below is the WebUI Forge log when trying to start with Regional Prompter installed.

*** Error loading script: attention.py
    Traceback (most recent call last):
      File "C:\Forge\modules\scripts.py", line 525, in load_scripts
        script_module = script_loading.load_module(scriptfile.path)
      File "C:\Forge\modules\script_loading.py", line 13, in load_module
        module_spec.loader.exec_module(module)
      File "<frozen importlib._bootstrap_external>", line 883, in exec_module
      File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
      File "C:\Forge\extensions\sd-webui-regional-prompter\scripts\attention.py", line 3, in <module>
        import ldm.modules.attention as atm
    ModuleNotFoundError: No module named 'ldm'

---
*** Error loading script: latent.py
    Traceback (most recent call last):
      File "C:\Forge\modules\scripts.py", line 525, in load_scripts
        script_module = script_loading.load_module(scriptfile.path)
      File "C:\Forge\modules\script_loading.py", line 13, in load_module
        module_spec.loader.exec_module(module)
      File "<frozen importlib._bootstrap_external>", line 883, in exec_module
      File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
      File "C:\Forge\extensions\sd-webui-regional-prompter\scripts\latent.py", line 11, in <module>
        import scripts.attention as att
      File "C:\Forge\extensions\sd-webui-regional-prompter\scripts\attention.py", line 3, in <module>
        import ldm.modules.attention as atm
    ModuleNotFoundError: No module named 'ldm'

---
*** Error loading script: rp.py
    Traceback (most recent call last):
      File "C:\Forge\modules\scripts.py", line 525, in load_scripts
        script_module = script_loading.load_module(scriptfile.path)
      File "C:\Forge\modules\script_loading.py", line 13, in load_module
        module_spec.loader.exec_module(module)
      File "<frozen importlib._bootstrap_external>", line 883, in exec_module
      File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
      File "C:\Forge\extensions\sd-webui-regional-prompter\scripts\rp.py", line 15, in <module>
        import scripts.attention
      File "C:\Forge\extensions\sd-webui-regional-prompter\scripts\attention.py", line 3, in <module>
        import ldm.modules.attention as atm
    ModuleNotFoundError: No module named 'ldm'
@metapea

metapea commented Aug 19, 2024

Regional Prompter (Attention mode) still works in reForge.

@jkw117

jkw117 commented Aug 21, 2024

On my new install of Regional Prompter, the UI fails to load the ldm.modules module at all, and the extension never gets added to the GUI.
On the other side of things, I checked and I don't have the ldm Python module installed. I tried to install it, but I'm wondering if there's a version-requirements mix-up between Forge and Regional Prompter.

Collecting ldm
  Using cached ldm-0.1.3.tar.gz (6.1 kB)
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'error'
  error: subprocess-exited-with-error

  × python setup.py egg_info did not run successfully.
  │ exit code: 1
  ╰─> [12 lines of output]
      Traceback (most recent call last):
        File "<string>", line 2, in <module>
        File "<string>", line 14, in <module>
        File "C:\StabilityMatrix\Packages\Stable Diffusion WebUI Forge\venv\lib\site-packages\setuptools\__init__.py", line 8, in <module>
          import _distutils_hack.override  # noqa: F401
        File "C:\StabilityMatrix\Packages\Stable Diffusion WebUI Forge\venv\lib\site-packages\_distutils_hack\override.py", line 1, in <module>
          __import__('_distutils_hack').do_override()
        File "C:\StabilityMatrix\Packages\Stable Diffusion WebUI Forge\venv\lib\site-packages\_distutils_hack\__init__.py", line 70, in do_override
          ensure_local_distutils()
        File "C:\StabilityMatrix\Packages\Stable Diffusion WebUI Forge\venv\lib\site-packages\_distutils_hack\__init__.py", line 57, in ensure_local_distutils
          assert '_distutils' in core.__file__, core.__file__
      AssertionError: C:\StabilityMatrix\Packages\Stable Diffusion WebUI Forge\venv\Scripts\python310.zip\distutils\core.pyc
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
Error: StabilityMatrix.Core.Exceptions.ProcessException: Process python failed with exit-code 1.
   at StabilityMatrix.Core.Processes.ProcessRunner.WaitForExitConditionAsync(Process process, Int32 expectedExitCode, CancellationToken cancelToken)
   at StabilityMatrix.Core.Models.PackageModification.PipStep.ExecuteAsync(IProgress`1 progress)
   at StabilityMatrix.Core.Models.PackageModification.PackageModificationRunner.ExecuteSteps(IReadOnlyList`1 steps)

@steamrick

Can confirm, I get the same error

File "E:\AI Art Stuff\forge\webui\extensions\sd-webui-regional-prompter\scripts\attention.py", line 3, in
import ldm.modules.attention as atm
ModuleNotFoundError: No module named 'ldm'

@githabtane

Since the update a few days ago, other extensions have been showing the same kind of error as well.
Are there plans to support the Gradio 4 version of Forge?
I especially liked the mask mode.

@ToolB0xx

Yeah, also broken for me after updating Forge; hope it works soon.

@chen079

chen079 commented Aug 24, 2024

Try copying the 'ldm' folder from "..\stable-diffusion-webui-forge\repositories\stable-diffusion-stability-ai" to the root "..\stable-diffusion-webui-forge"; that could solve this problem.
I used this method to run Regional Prompter on the latest Stable Diffusion Forge successfully.
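
For anyone who prefers a script over copying by hand, here is a minimal sketch of the same workaround in Python (the forge_root path is an example; point it at your own Forge install):

    import shutil
    from pathlib import Path

    # Example path -- adjust to your own Forge install.
    forge_root = Path(r"C:\stable-diffusion-webui-forge")
    src = forge_root / "repositories" / "stable-diffusion-stability-ai" / "ldm"
    dst = forge_root / "ldm"

    # Copy the bundled ldm package to the WebUI root so that
    # `import ldm.modules.attention` can be resolved at startup.
    if not dst.exists():
        shutil.copytree(src, dst)
        print(f"Copied {src} -> {dst}")
    else:
        print(f"{dst} already exists, nothing to do")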

@chen079

chen079 commented Aug 24, 2024

Try copying the 'ldm' folder from "..\stable-diffusion-webui-forge\repositories\stable-diffusion-stability-ai" to the root "..\stable-diffusion-webui-forge"; that could solve this problem. I used this method to run Regional Prompter on the latest Stable Diffusion Forge successfully.

But when using an XL model, this error happens:
*** Error running postprocess: F:\AIDraw\stable-diffusion-webui-forge\extensions\sd-webui-regional-prompter\scripts\rp.py
    Traceback (most recent call last):
      File "F:\AIDraw\stable-diffusion-webui-forge\modules\scripts.py", line 900, in postprocess
        script.postprocess(p, processed, *script_args)
      File "F:\AIDraw\stable-diffusion-webui-forge\extensions\sd-webui-regional-prompter\scripts\rp.py", line 600, in postprocess
        unloader(self, p)
      File "F:\AIDraw\stable-diffusion-webui-forge\extensions\sd-webui-regional-prompter\scripts\rp.py", line 621, in unloader
        unloadlorafowards(p)
      File "F:\AIDraw\stable-diffusion-webui-forge\extensions\sd-webui-regional-prompter\scripts\latent.py", line 573, in unloadlorafowards
        emb_db = sd_hijack.model_hijack.embedding_db
    AttributeError: 'StableDiffusionModelHijack' object has no attribute 'embedding_db'

1,1 0.2 Horizontal
*** Error running process: F:\AIDraw\stable-diffusion-webui-forge\extensions\sd-webui-regional-prompter\scripts\rp.py
    Traceback (most recent call last):
      File "F:\AIDraw\stable-diffusion-webui-forge\modules\scripts.py", line 844, in process
        script.process(p, *script_args)
      File "F:\AIDraw\stable-diffusion-webui-forge\extensions\sd-webui-regional-prompter\scripts\rp.py", line 502, in process
        self.handle = hook_forwards(self, p.sd_model.model.diffusion_model)
    AttributeError: 'StableDiffusionXL' object has no attribute 'model'


@fishonainternet

Just use the older version of ForgeUI, it works

@vitrvvivs

This is not the correct way to fix it, as it will probably break non-forge setups (I didn't test that), but it is the way I found to get it working again.

diff --git a/scripts/latent.py b/scripts/latent.py
index 7d0dcc3..de60abb 100644
--- a/scripts/latent.py
+++ b/scripts/latent.py
@@ -570,7 +570,8 @@ def unloadlorafowards(p):
     except:
         pass

-    emb_db = sd_hijack.model_hijack.embedding_db
+    from modules import ui_extra_networks_textual_inversion
+    emb_db = ui_extra_networks_textual_inversion.embedding_db
     import lora
     for net in lora.loaded_loras:
         if hasattr(net,"bundle_embeddings"):
diff --git a/scripts/rp.py b/scripts/rp.py
index 93b4466..34272f2 100644
--- a/scripts/rp.py
+++ b/scripts/rp.py
@@ -499,14 +499,14 @@ class Script(modules.scripts.Script):

             ##### calcmode
             if "Att" in calcmode:
-                self.handle = hook_forwards(self, p.sd_model.model.diffusion_model)
+                self.handle = hook_forwards(self, p.sd_model.forge_objects.unet.model.diffusion_model)
                 if hasattr(shared.opts,"batch_cond_uncond"):
                     shared.opts.batch_cond_uncond = orig_batch_cond_uncond
                 else:
                     shared.batch_cond_uncond = orig_batch_cond_uncond
                 unloadlorafowards(p)
             else:
-                self.handle = hook_forwards(self, p.sd_model.model.diffusion_model,remove = True)
+                self.handle = hook_forwards(self, p.sd_model.forge_objects.unet.model.diffusion_model,remove = True)
                 setuploras(self)
                 # SBM It is vital to use local activation because callback registration is permanent,
                 # and there are multiple script instances (txt2img / img2img).
@@ -514,7 +514,7 @@ class Script(modules.scripts.Script):
         elif "Pro" in self.mode: #Prompt mode use both calcmode
             self.ex = "Ex" in self.mode
             if not usebase : bratios = "0"
-            self.handle = hook_forwards(self, p.sd_model.model.diffusion_model)
+            self.handle = hook_forwards(self, p.sd_model.forge_objects.unet.model.diffusion_model)
             denoiserdealer(self)

         neighbor(self,p)                                                    #detect other extention
@@ -608,7 +608,7 @@ class Script(modules.scripts.Script):
 def unloader(self,p):
     if hasattr(self,"handle"):
         #print("unloaded")
-        hook_forwards(self, p.sd_model.model.diffusion_model, remove=True)
+        hook_forwards(self, p.sd_model.forge_objects.unet.model.diffusion_model, remove=True)
         del self.handle

     self.__init__()
@@ -711,7 +711,7 @@ def tokendealer(self, p):

     padd = 0

-    tokenizer = shared.sd_model.conditioner.embedders[0].tokenize_line if self.isxl else shared.sd_model.cond_stage_model.tokenize_line
+    tokenizer = shared.sd_model.conditioner.embedders[0].tokenize_line if self.isxl else shared.sd_model.text_processing_engine_l.tokenize_line

     for pp in ppl:
         tokens, tokensnum = tokenizer(pp)
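
If you want the patched extension to keep working on stock A1111 as well, one possible approach (an untested sketch, which only assumes that forge_objects exists on Forge's model wrapper and not on stock A1111, as the diff above implies) is to resolve the diffusion model through a small helper and call that instead of hard-coding either attribute path:

    def get_diffusion_model(sd_model):
        # Untested sketch: prefer Forge's wrapped UNet when it is available,
        # otherwise fall back to the original A1111 attribute path.
        forge_objects = getattr(sd_model, "forge_objects", None)
        if forge_objects is not None:
            return forge_objects.unet.model.diffusion_model
        return sd_model.model.diffusion_model

    # e.g. in rp.py:
    # self.handle = hook_forwards(self, get_diffusion_model(p.sd_model))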

@queenofinvidia

This is not the correct way to fix it, as it will probably break non-forge setups (I didn't test that), but it is the way I found to get it working again.

This is awesome! Can you please explain what I should do with this, as a non-code-savvy person?

Thank you <3

@BrianAllred

BrianAllred commented Aug 28, 2024

@vitrvvivs Neither git apply fix.diff nor patch -i fix.diff works for me; both say the patch does not apply.

[screenshot: output of the failed patch attempt]

Edit: I should also say I made the changes manually, and it still didn't work. Forge gave similar error messages.

@vitrvvivs

vitrvvivs commented Aug 28, 2024

@BrianAllred Shoot, I think those are whitespace errors. The repo uses CRLF and has trailing spaces all over the place, while your editor probably uses LF, and GitHub stripped all the trailing spaces from the comment. Both of these cause the patch to fail to match.

git config core.whitespace cr-at-eol
git apply --whitespace=fix fix.diff

I should mention that I also used @chen079's suggestion:

stable-diffusion-webui$  cp -r repositories/stable-diffusion-stability-ai/ldm ./

@Tarkian10

I'm absolutely disgusted with the Forge update, which has tons of issues, including breaking Regional Prompter. Is there a temporary alternative or a similar extension?

Going back to A1111 and its slowness unfortunately seems like the only solution...

@Natans8
Author

Natans8 commented Sep 2, 2024

I'm absolutely disgusted with the Forge update, which has tons of issues, including breaking Regional Prompter. Is there a temporary alternative or a similar extension?

Going back to A1111 and its slowness unfortunately seems like the only solution...

You could just use an older version of Forge, iirc it's explained in the README.

@MoeMonsuta

I'm absolutely disgusted with the Forge update, which has tons of issues, including breaking Regional Prompter. Is there a temporary alternative or a similar extension?
Going back to A1111 and its slowness unfortunately seems like the only solution...

You could just use an older version of Forge, iirc it's explained in the README.

I heard https://github.com/Panchovix/stable-diffusion-webui-reForge is a good solution for quickly swapping versions.

@Tarkian10

Thank you for your answers! I don't know why I didn't think of it, but it now seems obvious to use the previous July version. I'm relieved, I’ll be able to resume my work.

@derfasthirnlosenick

derfasthirnlosenick commented Sep 7, 2024

@BrianAllred Shoot, I think those are whitespace errors. The repo uses CRLF and has trailing spaces all over the place, while your editor probably uses LF, and GitHub stripped all the trailing spaces from the comment. Both of these cause the patch to fail to match.

git config core.whitespace cr-at-eol
git apply --whitespace=fix fix.diff

I should mention that I also used @chen079's suggestion:

stable-diffusion-webui$  cp -r repositories/stable-diffusion-stability-ai/ldm ./

Funny thing: the prompting is all weird now, but this one works:
'1girl in garden, cowboy shot ADDCOMM
green hair twintail ADDROW BREAK
blue blouse ADDROW BREAK
red skirt'

But I can't switch it over to column mode, and if I remove the breaks everything falls apart.

@maniac-0s

maniac-0s commented Sep 7, 2024

The problem is that ldm is not available in Forge when you install it. I solved it by installing ldm manually in the Forge environment:

lllyasviel/stable-diffusion-webui-forge#1407 (comment)

Edit: Forge has altered the StableDiffusionModelHijack class, which in A1111 has an embedding_db attribute that does not exist in Forge.

Regional Prompter uses this to loop through the LoRAs, and Forge has completely disabled the StableDiffusionModelHijack class, boldly replacing all its methods with pass. So it seems there is currently no chance to get it running at all.
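
If the extension is eventually patched for this, one hedged option (a sketch only, not tested against Forge) is to look the attribute up defensively and skip the bundled-embedding cleanup when it is missing:

    from modules import sd_hijack

    def get_embedding_db():
        # Sketch only: A1111's StableDiffusionModelHijack exposes embedding_db,
        # but Forge stubs the hijack class out, so the attribute may be absent.
        # Callers should skip the bundled-embedding cleanup when this returns None.
        return getattr(sd_hijack.model_hijack, "embedding_db", None)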

@mperez96

Got the same error; just commenting to promote a solution for this. It would be awesome to be able to use it with Flux, but we'll see.

I couldn't make any of the solutions mentioned above work (I didn't try rolling back to an older version, but I honestly prefer having Flux over Regional Prompter for now, since I can use the latter in A1111, I guess).

*** Error loading script: attention.py
    Traceback (most recent call last):
      File "D:\Users\USER\Documents\AI\webui_forge_cu121_torch231\webui\modules\scripts.py", line 525, in load_scripts
        script_module = script_loading.load_module(scriptfile.path)
      File "D:\Users\USER\Documents\AI\webui_forge_cu121_torch231\webui\modules\script_loading.py", line 13, in load_module
        module_spec.loader.exec_module(module)
      File "<frozen importlib._bootstrap_external>", line 883, in exec_module
      File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
      File "D:\Users\USER\Documents\AI\webui_forge_cu121_torch231\webui\extensions\sd-webui-regional-prompter\scripts\attention.py", line 3, in <module>
        import ldm.modules.attention as atm
    ModuleNotFoundError: No module named 'ldm'

*** Error loading script: latent.py
    Traceback (most recent call last):
      File "D:\Users\USER\Documents\AI\webui_forge_cu121_torch231\webui\modules\scripts.py", line 525, in load_scripts
        script_module = script_loading.load_module(scriptfile.path)
      File "D:\Users\USER\Documents\AI\webui_forge_cu121_torch231\webui\modules\script_loading.py", line 13, in load_module
        module_spec.loader.exec_module(module)
      File "<frozen importlib._bootstrap_external>", line 883, in exec_module
      File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
      File "D:\Users\USER\Documents\AI\webui_forge_cu121_torch231\webui\extensions\sd-webui-regional-prompter\scripts\latent.py", line 11, in <module>
        import scripts.attention as att
      File "D:\Users\USER\Documents\AI\webui_forge_cu121_torch231\webui\extensions\sd-webui-regional-prompter\scripts\attention.py", line 3, in <module>
        import ldm.modules.attention as atm
    ModuleNotFoundError: No module named 'ldm'

*** Error loading script: rp.py
    Traceback (most recent call last):
      File "D:\Users\USER\Documents\AI\webui_forge_cu121_torch231\webui\modules\scripts.py", line 525, in load_scripts
        script_module = script_loading.load_module(scriptfile.path)
      File "D:\Users\USER\Documents\AI\webui_forge_cu121_torch231\webui\modules\script_loading.py", line 13, in load_module
        module_spec.loader.exec_module(module)
      File "<frozen importlib._bootstrap_external>", line 883, in exec_module
      File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
      File "D:\Users\USER\Documents\AI\webui_forge_cu121_torch231\webui\extensions\sd-webui-regional-prompter\scripts\rp.py", line 15, in <module>
        import scripts.attention
      File "D:\Users\USER\Documents\AI\webui_forge_cu121_torch231\webui\extensions\sd-webui-regional-prompter\scripts\attention.py", line 3, in <module>
        import ldm.modules.attention as atm
    ModuleNotFoundError: No module named 'ldm'

@rsigim

rsigim commented Sep 20, 2024

this fork is working with recent forge

@steamrick

this fork is working with recent forge

I can confirm it works (and installing it lets the main Regional Prompter show up, though not work), but only with SDXL (and presumably SD1.5). I couldn't get it to do any regional functionality with a Flux model, unfortunately.

@DrJimmyBrungus

this fork is working with recent forge

Can also confirm this is working on Forge with PonyXL models / LoRAs. Nice work.

@queenofinvidia

Does the masking feature work too? I'm still getting an error.

@titlestad

this fork is working with recent forge

No, it doesn't work. It has an error (pt) as written elsewhere. The fork is useless with the latest Forge, which is a shame.

@derfasthirnlosenick

this fork is working with recent forge

No, it doesn't work. It has an error (pt) as written elsewhere. The fork is useless with the latest Forge, which is a shame.

Is this recent? Because I got it working a while back, but haven't tried it in the last few days.

@derfasthirnlosenick

Checked, and it works. It randomly breaks for me after messing with the prompting too much, but overall at least column mode with common positive and negative prompts works.
