How to manually interrupt when outputting answers? #748
Unanswered
NeverOccurs asked this question in Q&A
Replies: 1 comment
-
There should be a stop sequence option in Ollama that you can configure directly from the Cat, let me check.
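As a sketch of what that stop-sequence option amounts to against Ollama's HTTP API (assuming a local server at the default `http://localhost:11434`; the model name `llama3` is only an example), a minimal client can pass `stop` sequences and a `num_predict` token cap in the request `options`, and abort a runaway stream simply by breaking out of the response loop:

```python
import json
import urllib.request


def build_generate_request(model, prompt, stop=None, num_predict=None):
    """Build a payload for Ollama's /api/generate endpoint.

    `stop` sequences make generation halt when the model emits one of
    them; `num_predict` caps the number of generated tokens, so even
    gibberish output ends on its own without a stop match.
    """
    payload = {"model": model, "prompt": prompt, "options": {}}
    if stop:
        payload["options"]["stop"] = list(stop)
    if num_predict is not None:
        payload["options"]["num_predict"] = num_predict
    return payload


def stream_generate(payload, url="http://localhost:11434/api/generate"):
    """Stream a generation token by token.

    Leaving the loop (break, return, or an exception) closes the HTTP
    connection, which cancels generation on the Ollama side -- that is
    the manual interrupt, no container restart needed.
    """
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        for line in resp:
            chunk = json.loads(line)
            print(chunk.get("response", ""), end="", flush=True)
            if chunk.get("done"):
                break
```

For example, `stream_generate(build_generate_request("llama3", "Hello", stop=["\n\n"], num_predict=256))` streams at most 256 tokens and stops early at a blank line; hitting Ctrl-C mid-stream drops the connection and interrupts generation immediately.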
-
Hi, I really love your work. Just a quick question: when I use local Ollama models, they sometimes produce gibberish and never stop. I have to restart the container, which is quite annoying. How can I manually interrupt generation in progress when I don't want it to continue? Many thanks in advance!