This repository has been archived by the owner on Oct 11, 2024. It is now read-only.

feat(llm): add integration for Mistral, Gemini and Claude #169

Closed
wants to merge 1 commit into from

Conversation

@sarthakkapila commented May 28, 2024

Closes #129

Hey! First time contributing to this repo.
Added support for Mistral, Claude and Gemini.
Not really sure about the test file, might need some help.

Thanks

@frgfm (Member) left a comment:

Thanks for the PR @sarthakkapila 🙏

Quite ambitious and kind of you to jump into this integration! Two high-level comments:

  1. I think it would be easier to split this and have 1 PR / integration
  2. Not sure whether you tested the code or used a coding assistant, but I've spotted a few parts (see the comments below) that, I think, cannot run. Would you mind fixing those?

Let me know what you think!

CLAUDE_API_KEY=
CLAUDE_MODEL='claude-3-opus-20240229'
GEMINI_API_KEY=
GEMINI_MODEL='GeminiPro_1.5_Pro'
@frgfm (Member):

isn't it "gemini-1.5-pro-latest"? Where did you find this model reference?
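For illustration, a minimal sketch of what the corrected `.env` entries could look like, using the "gemini-1.5-pro-latest" identifier suggested above (the API keys are intentionally left blank):

```
# Sketch: corrected model identifiers, keys left blank as placeholders
CLAUDE_API_KEY=
CLAUDE_MODEL='claude-3-opus-20240229'
GEMINI_API_KEY=
GEMINI_MODEL='gemini-1.5-pro-latest'
```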

anthropic = "^0.26.1"
google-generativeai = "^0.5.4"
mistralai = "^0.2.0"
pytest = "^8.2.1"
@frgfm (Member):

pytest is only needed for the tests, and it's already included in the "test" group. Would you mind removing it from here please? :)
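As a sketch, assuming the project uses Poetry dependency groups (the caret constraints and the "test" group mentioned above suggest it), pytest would then only appear under the test group rather than in the main dependencies:

```toml
# Sketch: pytest lives only in the "test" group, not in the main dependencies
[tool.poetry.group.test.dependencies]
pytest = "^8.2.1"
```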

google-generativeai = "^0.5.4"
mistralai = "^0.2.0"
pytest = "^8.2.1"
mistral = "^18.0.1"
@frgfm (Member):

Why the duplicated dep? We only need https://github.com/mistralai/client-python, right?
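A sketch of what the main dependency block could look like once the extra `mistral` entry (and pytest, per the previous comment) is dropped, keeping only the official `mistralai` client:

```toml
# Sketch: a single Mistral dependency via the official client, no pytest here
anthropic = "^0.26.1"
google-generativeai = "^0.5.4"
mistralai = "^0.2.0"
```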


MISTRAL_MODEL: str = os.environ.get("MISTRAL_MODEL", "open-mixtral-8x7b")
CLAUDE_MODEL: str = os.environ.get("CLAUDE_MODEL", "claude-3-opus-20240229")
GEMINI_MODEL: str = os.environ.get("GEMINI_MODEL", "GeminiPro_1.5_Pro")
@frgfm (Member):

same here, I can't find that model reference in the API documentation
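A minimal sketch of the corrected defaults, assuming "gemini-1.5-pro-latest" is the identifier the Gemini API actually expects (as suggested above); the other defaults are unchanged from the diff:

```python
import os

# Sketch: default to documented model identifiers
MISTRAL_MODEL: str = os.environ.get("MISTRAL_MODEL", "open-mixtral-8x7b")
CLAUDE_MODEL: str = os.environ.get("CLAUDE_MODEL", "claude-3-opus-20240229")
GEMINI_MODEL: str = os.environ.get("GEMINI_MODEL", "gemini-1.5-pro-latest")
```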

Comment on lines +9 to +11
Opus: str = "claude-3-opus-20240229"
Sonnet: str = "claude-3-sonnet-20240229"
Haiku: str = "claude-3-haiku-20240307"
@frgfm (Member):

For the sake of conventions, could you use UPPER_CASE for the class attributes of enums please?
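A minimal sketch of the requested convention; the class name `ClaudeModel` and the `str, Enum` base are assumptions, while the values come from the diff above:

```python
from enum import Enum


class ClaudeModel(str, Enum):
    # Enum members in UPPER_CASE, values unchanged
    OPUS = "claude-3-opus-20240229"
    SONNET = "claude-3-sonnet-20240229"
    HAIKU = "claude-3-haiku-20240307"
```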

Comment on lines +40 to +53
stream = cast(
    self._client.messages.create(
        messages=[{"role": "system", "content": _system}],
        model=self.model,
        temperature=self.temperature,
        max_tokens=2048,
        stream=True,
        # top_p=1,
        # stop=None,
        # stream=True,
        # stream_options={"include_usage": True},
    )
)
@frgfm (Member):

I might be missing something, but this cannot work: cast requires two positional arguments, with the first one being the type the object will be cast to.
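A minimal sketch of a corrected call, wrapped in a hypothetical standalone helper. The `Stream`/`MessageStreamEvent` names and the top-level `system=` parameter are assumptions about the anthropic SDK (~0.26); simply dropping `cast` may be even cleaner, since the SDK call is already typed when `stream=True`:

```python
from typing import cast

from anthropic import Anthropic, Stream
from anthropic.types import MessageStreamEvent


def stream_chat(client: Anthropic, model: str, system: str, prompt: str, temperature: float = 0.7):
    """Hypothetical helper: cast() takes the target type first, then the value."""
    stream = cast(
        Stream[MessageStreamEvent],  # 1st positional arg: the target type
        client.messages.create(      # 2nd positional arg: the value being cast
            system=system,  # assumption: Anthropic takes the system prompt as a kwarg, not a message role
            messages=[{"role": "user", "content": prompt}],
            model=model,
            temperature=temperature,
            max_tokens=2048,
            stream=True,
        ),
    )
    for event in stream:
        yield event
```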

Comment on lines +9 to +11
GeminiPro_1_5_Pro: str = "gemini-1.5-pro-latest"
Gemini_1_5_Flash: str = "gemini-1.5-flash-latest"
Gemini_1_0_Pro: str = "gemini-1.0-pro"
@frgfm (Member):

Same here, let's use upper case

# Prepare the request
_system = CHAT_PROMPT if not system else f"{CHAT_PROMPT} {system}"

stream = cast(
@frgfm (Member):

same here, I don't think that can work 🤔

@@ -0,0 +1 @@
OpenAI
@frgfm (Member):

what's this?

Comment on lines -4 to -5
from groq import AuthenticationError as GAuthError
from groq import NotFoundError as GNotFoundError
@frgfm (Member):

why remove that import? this breaks the groq tests

@frgfm changed the title from "Fixes #129" to "feat(llm): add integration for Mistral, Gemini and Claude" on May 29, 2024
@sarthakkapila (Author) commented May 30, 2024

@frgfm

  • First of all, sorry for the terrible PR. Before this I hadn't worked on or contributed to a project with so many interconnected parts and dependencies, and I had no idea about testing 😅 as a result I used a coding assistant to verify.
  • Decided to go with 1 PR / integration as you recommended.
  • For now I've added Claude in feat(llm): add integration for Claude #171 and will add Gemini and Mistral soon :)

P.S. Thank you so much 🙏 for pointing out the mistakes in detail, as this helped me a lot.

Successfully merging this pull request may close these issues.

Let users choose between local/hosted inference & cloud APIs