Forge support #342
Regional Prompter (Attention mode) still works in reForge.
On my new install, Regional Prompter fails to load the module ldm.modules at all and never gets added to the GUI:

```
Collecting ldm
  × python setup.py egg_info did not run successfully.
  note: This error originates from a subprocess, and is likely not a problem with pip.
× Encountered error while generating package metadata.
note: This is an issue with the package mentioned above, not pip.
```
Can confirm, I get the same error: File "E:\AI Art Stuff\forge\webui\extensions\sd-webui-regional-prompter\scripts\attention.py", line 3, in
The update a few days ago broke other extensions as well.
Yeah, also broken for me after updating Forge; I hope it works again soon.
Try copying the 'ldm' folder in "..\stable-diffusion-webui-forge\repositories\stable-diffusion-stability-ai" to the root "..\stable-diffusion-webui-forge"; that could solve this problem.
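A minimal sketch of that workaround in Python, assuming default Forge folder names; forge_root is an example path, so point it at your own install before running (untested):

```python
# Copy Forge's bundled 'ldm' package to the webui root so the extension can import it.
# forge_root is an assumed example path; adjust it to your own Forge install.
import shutil
from pathlib import Path

forge_root = Path(r"..\stable-diffusion-webui-forge")
src = forge_root / "repositories" / "stable-diffusion-stability-ai" / "ldm"
dst = forge_root / "ldm"

if src.is_dir() and not dst.exists():
    shutil.copytree(src, dst)
    print(f"Copied {src} -> {dst}")
```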
But when using an XL model, this error happened: 1,1 0.2 Horizontal
Just use the older version of ForgeUI; it works.
This is not the correct way to fix it, as it will probably break non-Forge setups (I didn't test that), but it is the way I found to get it working again.

```diff
diff --git a/scripts/latent.py b/scripts/latent.py
index 7d0dcc3..de60abb 100644
--- a/scripts/latent.py
+++ b/scripts/latent.py
@@ -570,7 +570,8 @@ def unloadlorafowards(p):
         except:
             pass
-        emb_db = sd_hijack.model_hijack.embedding_db
+        from modules import ui_extra_networks_textual_inversion
+        emb_db = ui_extra_networks_textual_inversion.embedding_db
         import lora
         for net in lora.loaded_loras:
             if hasattr(net,"bundle_embeddings"):
diff --git a/scripts/rp.py b/scripts/rp.py
index 93b4466..34272f2 100644
--- a/scripts/rp.py
+++ b/scripts/rp.py
@@ -499,14 +499,14 @@ class Script(modules.scripts.Script):
         ##### calcmode
         if "Att" in calcmode:
-            self.handle = hook_forwards(self, p.sd_model.model.diffusion_model)
+            self.handle = hook_forwards(self, p.sd_model.forge_objects.unet.model.diffusion_model)
             if hasattr(shared.opts,"batch_cond_uncond"):
                 shared.opts.batch_cond_uncond = orig_batch_cond_uncond
             else:
                 shared.batch_cond_uncond = orig_batch_cond_uncond
             unloadlorafowards(p)
         else:
-            self.handle = hook_forwards(self, p.sd_model.model.diffusion_model,remove = True)
+            self.handle = hook_forwards(self, p.sd_model.forge_objects.unet.model.diffusion_model,remove = True)
             setuploras(self)
             # SBM It is vital to use local activation because callback registration is permanent,
             # and there are multiple script instances (txt2img / img2img).
@@ -514,7 +514,7 @@ class Script(modules.scripts.Script):
         elif "Pro" in self.mode: #Prompt mode use both calcmode
             self.ex = "Ex" in self.mode
             if not usebase : bratios = "0"
-            self.handle = hook_forwards(self, p.sd_model.model.diffusion_model)
+            self.handle = hook_forwards(self, p.sd_model.forge_objects.unet.model.diffusion_model)
             denoiserdealer(self)
         neighbor(self,p) #detect other extention
@@ -608,7 +608,7 @@ class Script(modules.scripts.Script):
     def unloader(self,p):
         if hasattr(self,"handle"):
             #print("unloaded")
-            hook_forwards(self, p.sd_model.model.diffusion_model, remove=True)
+            hook_forwards(self, p.sd_model.forge_objects.unet.model.diffusion_model, remove=True)
             del self.handle
             self.__init__()
@@ -711,7 +711,7 @@ def tokendealer(self, p):
     padd = 0
-    tokenizer = shared.sd_model.conditioner.embedders[0].tokenize_line if self.isxl else shared.sd_model.cond_stage_model.tokenize_line
+    tokenizer = shared.sd_model.conditioner.embedders[0].tokenize_line if self.isxl else shared.sd_model.text_processing_engine_l.tokenize_line
     for pp in ppl:
         tokens, tokensnum = tokenizer(pp)
```
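Since the patch above hard-codes the Forge-only attribute path and may break non-Forge setups, a hedged sketch of a fallback accessor (untested; the attribute names are taken from the diff above) could look like this:

```python
def get_diffusion_model(sd_model):
    """Return the UNet diffusion model on both Forge and stock A1111 setups."""
    forge_objects = getattr(sd_model, "forge_objects", None)
    if forge_objects is not None:
        # Forge wraps the UNet; the patch above reaches it via forge_objects.unet.model
        return forge_objects.unet.model.diffusion_model
    # Non-Forge webui keeps the original attribute path
    return sd_model.model.diffusion_model

# Usage in the patched calls, e.g.:
# self.handle = hook_forwards(self, get_diffusion_model(p.sd_model))
```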
This is awesome! Can you please explain what to do with this, as a non-code-savvy person? Thank you <3
@vitrvvivs Neither. Edit: I should also say I made the changes manually, and it still didn't work; Forge gave similar error messages.
@BrianAllred Shoot, I think those are whitespace errors. The repo uses CRLF and has trailing spaces all over the place, while your editor probably uses LF, and GitHub stripped all the trailing spaces from the comment; both of these cause the patch to fail to match.
I should mention I also used @chen079's suggestion.
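One way around the whitespace mismatch is to let git ignore it when applying the patch. A minimal sketch, assuming the diff above was saved as rp-forge.diff (a hypothetical filename) inside the extension folder; the cwd path is a placeholder (untested):

```python
# Apply the patch while ignoring CRLF/trailing-space differences in context lines.
# Both the patch filename and the cwd below are placeholders; adjust to your setup.
import subprocess

subprocess.run(
    ["git", "apply", "--ignore-whitespace", "rp-forge.diff"],
    cwd=r"..\stable-diffusion-webui-forge\extensions\sd-webui-regional-prompter",
    check=True,
)
```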
I'm absolutely disgusted with the Forge update, which has tons of issues, including breaking Regional Prompter. Is there a temporary alternative or a similar extension? Going back to A1111 and its slowness unfortunately seems like the only solution...
You could just use an older version of Forge; IIRC it's explained in the README.
I heard https://github.com/Panchovix/stable-diffusion-webui-reForge is a good solution for quickly swapping versions.
Thank you for your answers! I don't know why I didn't think of it, but it now seems obvious to use the previous July version. I'm relieved; I'll be able to resume my work.
But I can't switch it over to column mode, and if I remove the breaks, everything falls apart.
The problem is that ldm is not available in Forge when you install it. I solved it by installing ldm manually in the Forge environment: lllyasviel/stable-diffusion-webui-forge#1407 (comment). Edit: Forge has altered a class that regional-prompter uses to loop through the LoRAs, and Forge has completely disabled the
Got the same error; just commenting to promote a solution to this. It would be awesome to have the possibility to use it with Flux, but we'll see. I couldn't make any of the solutions mentioned before work (I didn't try going back a version, but I honestly prefer having Flux over Regional Prompter for now, as I can use it in A1111 I guess):

```
*** Error loading script: attention.py
*** Error loading script: latent.py
*** Error loading script: rp.py
```
I can confirm it works (and installing it lets the main Regional Prompter show up, though not work), but only with SDXL (and presumably SD1.5). I couldn't get it to do any regional functionality with a Flux model, unfortunately.
Can also confirm this is working on Forge with PonyXL models / LoRAs. Nice work.
Does the masking feature work too? I'm still getting an error.
No, it doesn't work; it has an error (pt) as written elsewhere. The fork is useless with the latest Forge, which is a shame.
Is this recent? Because I got it working a while back, but I haven't tried it in the last few days.
Checked, and it works. It randomly breaks for me after messing with prompting too much, but overall at least column mode with common positive and negative prompts works.
In the recent WebUI Forge updates, Regional Prompter is unable to even load, since there were many fundamental changes to both backend and frontend, including the switch to Gradio 4. Currently, Regional Prompter is the last plugin that holds me back from switching completely to Forge, since Forge seems to be overall better optimized and supports features like Flux that A1111 does not.
It would be great to adapt it, if not to Flux, then at least to keep the existing SD1.5/SDXL capabilities in Forge. Thank you in advance!
Below is the log of WebUI Forge when trying to start with Regional Prompter.