Sweep: Add options like temperature to LangChain CLI OpenAI provider command #240
Description
This pull request introduces the ability to specify a sampling temperature when authenticating with the OpenAI provider using the LangChain CLI. This feature allows users to control the randomness of the responses generated by the model, providing a more customizable experience.
Summary

- Added a new `--temperature` option to the `OpenAiCommand` class, allowing users to specify the sampling temperature for the OpenAI model. The `temperature` option accepts values between 0 and 2, with a default value of 0.7.
- Updated the `HandleAsync` method in `OpenAiCommand.cs` to accept the `temperature` parameter.
- Updated the `AuthenticateWithApiKeyAsync` method in `Helpers.cs` to write the `temperature` value to the settings folder, enabling its use in API requests.

Files modified:

- `src/Cli/src/Commands/Auth/OpenAiCommand.cs`
- `src/Cli/src/Helpers.cs`
Fixes #239.
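The behavior described above (a `--temperature` flag accepting values between 0 and 2 with a default of 0.7, written to a settings folder for later API requests) can be sketched as follows. The actual CLI is written in C#; this is a hypothetical Python sketch of the parse-validate-persist flow only, and the `settings.json` filename and key names are assumptions, not the repo's real ones.

```python
import argparse
import json
from pathlib import Path


def build_parser() -> argparse.ArgumentParser:
    # Hypothetical stand-in for the real OpenAiCommand: an auth command
    # that takes an API key plus the new sampling-temperature option.
    parser = argparse.ArgumentParser(prog="langchain auth openai")
    parser.add_argument("api_key")
    parser.add_argument(
        "--temperature",
        type=float,
        default=0.7,  # default value stated in the PR summary
        help="Sampling temperature for the OpenAI model (0 to 2).",
    )
    return parser


def save_settings(api_key: str, temperature: float, settings_dir: Path) -> Path:
    # Mirrors the idea of AuthenticateWithApiKeyAsync writing the
    # temperature to the settings folder so later API requests can use it.
    if not 0.0 <= temperature <= 2.0:
        raise ValueError("temperature must be between 0 and 2")
    settings_dir.mkdir(parents=True, exist_ok=True)
    path = settings_dir / "settings.json"  # hypothetical filename
    path.write_text(json.dumps({"apiKey": api_key, "temperature": temperature}))
    return path
```

Validating the range at save time keeps an out-of-bounds value from ever reaching the persisted settings, which is the point at which the real change makes the temperature available to API requests.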