openai BadRequest error #751
Comments
Welcome @Ismael! Which LLM would you like to use? You'll want to replace the : with = in your .env. For example, the minimalistic .env is:
Replace _________ with the relevant keys
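For illustration only, a dotenv file of that shape might look like the sketch below; the variable names are placeholders rather than the project's confirmed keys, and the point is the key=value syntax rather than the colon syntax used in docker-compose YAML:

```
# .env sketch: dotenv uses KEY=value, not the "KEY: value" form that
# appears inside a docker-compose "environment:" block.
LLM_PROVIDER=openai        # placeholder provider name
OPENAI_API_KEY=_________   # replace with the relevant key
```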
I'd like to use groq + llama 3
Have a look at the env examples here. Search the codebase for the alternative env variables related to groq embedding models, i.e. what are the alternatives for: OLLAMA_EMBEDDING_MODEL=all-minilm:22m
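As a rough sketch of how to do that search (assuming a local checkout of the repo), a plain grep is usually enough; the paths and flags here are illustrative, not prescribed by the project:

```
# Find every env variable name that mentions an embedding model,
# so the non-Ollama alternatives can be spotted.
grep -rn "EMBEDDING_MODEL" --include="*.py" --include="*.env*" .
```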
I'm using docker-compose with this env.
I did try to use LLM_PROVIDER: groq, but that failed with an error saying the OpenAI key wasn't set.