LLM support 🤖
We're excited to announce yet another HUGE release that adds LLM support for Llama 3, Mistral, Anthropic, Hugging Face, Together AI, Gemini, and more!
We've also updated our docs, including examples:
- Customizing the research assistant: https://docs.gptr.dev/docs/gpt-researcher/config
- Configuring LLMs: https://docs.gptr.dev/docs/gpt-researcher/llms
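As a quick illustration of the new LLM configuration options, here is a minimal sketch of switching providers before running a research task. The environment variable names and model IDs shown (`LLM_PROVIDER`, `FAST_LLM_MODEL`, `SMART_LLM_MODEL`, etc.) are assumptions for illustration only; the Configuring LLMs docs linked above are the authoritative reference for your version.

```python
import asyncio
import os

# Illustrative configuration only: variable names and model IDs may differ
# between versions; see https://docs.gptr.dev/docs/gpt-researcher/llms.
os.environ.setdefault("LLM_PROVIDER", "groq")              # e.g. openai, groq, ollama, google
os.environ.setdefault("FAST_LLM_MODEL", "llama3-8b-8192")
os.environ.setdefault("SMART_LLM_MODEL", "llama3-70b-8192")
os.environ.setdefault("GROQ_API_KEY", "<your-key>")
os.environ.setdefault("TAVILY_API_KEY", "<your-key>")      # default web retriever

from gpt_researcher import GPTResearcher  # import after the env is configured

async def main() -> None:
    # Run a research task end to end and print the generated report.
    researcher = GPTResearcher(
        query="What are the latest open-source LLMs?",
        report_type="research_report",
    )
    await researcher.conduct_research()
    report = await researcher.write_report()
    print(report)

if __name__ == "__main__":
    asyncio.run(main())
```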
This release also includes LangGraph deployment by @hwchase17 and additional stability improvements. Thank you for all the amazing contributions!
What's Changed
- remove DOC_Path from docker-compose - rely on .env instead by @ElishaKay in #509
- Adding one-click deploy button for RepoCloud.io to README.md by @cosark in #515
- Fix Pydantic validation of base_url assignment for ChatOpenAI model by @mmashnev in #512
- Added Instructions for Groq by @dphiggs01 in #536
- Implemented GroqProvider by @dphiggs01 in #526
- add support for custom openai api embeddings by @sebaxzero in #528
- langgraph deploy by @hwchase17 in #537
- Add logo to README by @assafelovic in #540
- added pandas dependency for reading csv, commented testing dependenci… by @ElishaKay in #541
- Adding support for Ollama (both LLM and embeddings) by @gschmutz in #527
New Contributors
- @cosark made their first contribution in #515
- @mmashnev made their first contribution in #512
- @sebaxzero made their first contribution in #528
- @gschmutz made their first contribution in #527
Full Changelog: v0.2.3...v0.2.4