In the help for parameters it says this:
I don't see the "min length" option in the parameters, even with the "all model loaders" option set. Maybe it isn't shown anymore, or it depends on the AI model, but the help recommends using the "ban the eos_token" option. You could also ask for a long response directly in the prompt, e.g. "Write a long reply to this message, responding to each point...". Also make sure that "max new tokens" in the Parameters tab is set high enough.
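If you drive the UI through its local API instead of the web interface, the same settings can be sent as request fields. This is only a sketch of such a payload; the exact field names and accepted values depend on your version and loader, so check the API docs for your install (the prompt text and token limit here are illustrative):

```python
import json

# Hypothetical generation request for a local text-generation API.
# Field names mirror the UI settings discussed above; verify them
# against your version's API documentation before relying on them.
payload = {
    "prompt": "Write a long reply to this message, responding to each point...",
    "max_new_tokens": 1000,   # raise this so long replies are not cut off
    "ban_eos_token": True,    # keep the model from ending the reply early
    "min_length": 200,        # may only apply to some model loaders
}

# Serialize to JSON, as the request body would be sent over HTTP.
body = json.dumps(payload)
```

The two settings that most often cause short replies are a low "max new tokens" limit and the model emitting its end-of-sequence token early, which is what banning the eos_token works around.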
Also curious.
I am still very new to this, and I am playing around with different models, but I have noticed that my responses are always extremely short (no longer than 200 words).
Does the length depend on the model I use, and if so, which models would you recommend that write long responses?