
[docs] openai ChatCompletion Wrapper {WIP} #135

Closed
@Ankush-lastmile commented Nov 10, 2023

Ankush Pala ([email protected]) added 2 commits on November 11, 2023
## What

1. Save outputs
2. Allow passing a file path
3. Allow passing an AIConfig object
4. Persist output with streaming enabled (pass through streaming)
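
Points 2 and 3 can be sketched as a single entry point that accepts either form of config. This is an illustrative stand-in, not the PR's actual code: `AIConfigStub` and `resolve_config` are hypothetical names, and the real AIConfig class comes from the aiconfig library.

```python
import json


class AIConfigStub:
    """Minimal stand-in for an AIConfig object (illustration only)."""

    def __init__(self, data=None):
        self.data = data if data is not None else {"prompts": []}


def resolve_config(source):
    """Accept either a file path or an already-loaded config object."""
    if isinstance(source, AIConfigStub):
        return source  # in-memory AIConfig object: pass it through unchanged
    with open(source) as f:  # otherwise treat it as a path to a JSON config
        return AIConfigStub(json.load(f))
```

Dispatching on the argument type keeps the wrapper's call site identical whether the caller holds a path or a live config object.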

Notes:
- With streaming, we can either capture the response without returning it to the user, or use pass-through streaming. With pass-through streaming, if the stream is not fully iterated (i.e. the user never consumes the completion), the completion never gets serialized. We can't have both behaviors, given what each entails.
  - Chose to go with pass-through streaming.

- If the completion has a prompt that is already in the config (same input, settings, etc.) but different outputs, the existing outputs are overridden with the new ones.

- If one completion is streamed and another is not, they are considered different prompts and are serialized as such.

- Added a dependency on nest_asyncio to make this work in ipynb.
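
The pass-through streaming behavior described above can be sketched as a generator that yields each chunk to the caller unchanged and persists the accumulated output only after the stream is exhausted. This is an illustrative sketch, not the PR's implementation; `passthrough_stream` and the `save_output` callback are hypothetical names.

```python
def passthrough_stream(chunks, save_output):
    """Yield each chunk to the caller while accumulating the full text.

    `chunks` stands in for the iterator returned by the streaming
    ChatCompletion API; `save_output` is a hypothetical callback that
    serializes the finished completion into the config.
    """
    accumulated = []
    for chunk in chunks:
        accumulated.append(chunk)
        yield chunk  # pass the chunk through to the user untouched
    # This line runs only after the caller has fully iterated the stream,
    # which is why an untouched completion never gets serialized.
    save_output("".join(accumulated))


# Usage: exhaust the stream, and only then is the output saved.
saved = []
for piece in passthrough_stream(["Hel", "lo"], saved.append):
    pass
```

Placing the save after the loop makes the trade-off explicit: the user sees chunks with no added latency, at the cost of serialization depending on full iteration.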

## Why

The wrapper needs to be customizable and flexible.
@Ankush-lastmile changed the title [docs] openai ChatCompletion Wrapper to [docs] openai ChatCompletion Wrapper {WIP} on Nov 11, 2023
@Ankush-lastmile marked this pull request as ready for review on November 11, 2023 16:43