Selected OpenRouter model not persisted #78
Currently, model selection for custom endpoints is not saved. In many backends, the model list is fetched from external parties and is subject to frequent change.
It's a consideration for the future. For now, the model will need to be manually reselected.
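Persisting the selection could be as simple as storing the chosen model id next to the API key and validating it against the freshly fetched model list on reload, falling back to the default when the saved model has disappeared. A minimal sketch, assuming hypothetical function and storage-key names (this is not Lite's actual code):

```javascript
// In the browser this would use window.localStorage; the Map-backed
// stand-in below just lets the sketch run anywhere.
const storage = typeof localStorage !== "undefined" ? localStorage : (() => {
  const m = new Map();
  return {
    getItem: (k) => (m.has(k) ? m.get(k) : null),
    setItem: (k, v) => m.set(k, String(v)),
  };
})();

const SETTINGS_KEY = "custom_endpoint_settings"; // hypothetical key name

// Save the API key and model id together whenever either changes.
function saveEndpointSettings(apiKey, modelId) {
  storage.setItem(SETTINGS_KEY, JSON.stringify({ apiKey, modelId }));
}

// Restore on reload. Because the remote model list changes often,
// only reuse the saved model if it is still offered.
function restoreEndpointSettings(availableModels, defaultModel) {
  const raw = storage.getItem(SETTINGS_KEY);
  if (!raw) return { apiKey: "", modelId: defaultModel };
  const saved = JSON.parse(raw);
  const modelId = availableModels.includes(saved.modelId)
    ? saved.modelId
    : defaultModel;
  return { apiKey: saved.apiKey, modelId };
}
```

The validation step addresses the maintainer's concern: a stale saved model silently degrades to the default instead of sending requests for a model that no longer exists.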
I found this issue while wanting to share an OpenRouter chat I had with a therapist in a way they could access, and I found lite.koboldai.net while looking for how to do this. It would be helpful if the AI backend and the model selection were persisted with the saved JSON files.
Additionally, openrouter.ai has a "general" model. Also, lite.koboldai.net looks like a better service for chatting than OpenRouter, so it could be helpful to have a little onboarding information, for example to help people migrate to what is normal to use here instead of a system prompt. (Edit: all the new OpenAI-compatible services autoformat system, user, and assistant messages using per-model templates server-side. But the intent of this reply is to support letting users persist the AI backend settings they have selected.)
My two cents on the model thing: I think this would cause issues, especially if it would cause the local KoboldCpp-bundled Lite to suddenly connect to cloud instances. I'm not opposed to it, but it would have to be opt-in.

As for the info: in our UI, the memory field in the context menu is raw text, so you can format things like your system instructions (if you need them; a lot of models don't for default behavior) and some example turns in there exactly as you would like. Ours has placeholders, so {{[INPUT]}}, {{[OUTPUT]}}, or even {{[SYSTEM]}} can be written there and will automatically be replaced with what you have in the settings. Our scenarios menu has a lot of examples. The exact formatting is task- and model-specific, so it's hard to write unified info for.
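The placeholder behavior described above amounts to a simple text substitution over the memory field. A sketch of the idea, with an illustrative function signature and settings shape rather than Lite's actual implementation:

```javascript
// Replace the {{[INPUT]}}, {{[OUTPUT]}}, and {{[SYSTEM]}} placeholders in the
// raw memory text with the instruct tags configured in settings.
// The settings field names here are hypothetical.
function applyPlaceholders(memoryText, settings) {
  return memoryText
    .replaceAll("{{[INPUT]}}", settings.inputTag)
    .replaceAll("{{[OUTPUT]}}", settings.outputTag)
    .replaceAll("{{[SYSTEM]}}", settings.systemTag);
}

// Example: Alpaca-style tags written once in settings are expanded
// everywhere the placeholders appear in the memory text.
const expanded = applyPlaceholders(
  "{{[SYSTEM]}}Be terse.\n{{[INPUT]}}Hello\n{{[OUTPUT]}}",
  { inputTag: "### Instruction:\n", outputTag: "### Response:\n", systemTag: "" }
);
```

Because the substitution is purely textual, the same memory text keeps working when the user switches instruct formats; only the tag values in settings change.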
When selecting OpenRouter as the AI provider, the selected model isn't persisted across reloads. After clicking "Yes" on the Custom Endpoint Reconnect modal, it always falls back to the default "mistralai/mistral-7b-instruct" model. The OpenRouter key is persisted, however; I'd expect the same for the model choice.