Describe the bug
I can't select a different local model (e.g. "LM Studio Community/Meta-Llama-3-8B-Instruct-GGUF") when I choose LM Studio on the CLI at start time; it defaults to "gpt-4".
I tried passing the model via a CLI flag, but the flag didn't seem to be respected.
I think an option needs to be added here:
https://github.com/OpenInterpreter/01/blob/main/software/source/server/utils/local_mode.py
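A minimal sketch of what that option could look like, assuming local_mode.py keeps using inquirer for its interactive prompts. The helper name select_lm_studio_model, the attributes interpreter.llm.model and interpreter.llm.api_base, and the default api_base URL are my assumptions based on Open Interpreter conventions, not a verified patch against the current code:

```python
import inquirer  # the local-mode setup already prompts interactively

def select_lm_studio_model(interpreter):
    """Hypothetical helper: prompt for an LM Studio model instead of
    hard-coding "gpt-4". Attribute names below (interpreter.llm.model,
    interpreter.llm.api_base) follow Open Interpreter conventions and
    may differ from what local_mode.py actually uses."""
    questions = [
        inquirer.Text(
            "model",
            message="LM Studio model to use (e.g. a GGUF model identifier)",
            default="gpt-4",  # current hard-coded behavior, kept as the fallback
        )
    ]
    answers = inquirer.prompt(questions)
    if answers:  # prompt returns None if the user aborts
        interpreter.llm.api_base = "http://localhost:1234/v1"  # LM Studio's default local server
        interpreter.llm.model = answers["model"]
```

An inquirer.List over models reported by the LM Studio server would be even nicer, but a free-text prompt (or a CLI flag that feeds the same attribute) would already cover this use case.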
To Reproduce
Steps to reproduce the behavior:
1. Start 01 in local mode and choose LM Studio when prompted.
2. Observe that the model defaults to "gpt-4", with no prompt to select a different one.
3. Passing a model flag on the CLI does not change this.
Expected behavior
I would like to choose a model from a prompt, or type a model name on the command line.
Desktop (please complete the following information):
Mac M2