
Support chat conversation for StaticLLMPipeline #580

Conversation

@TolyaTalamanov TolyaTalamanov (Collaborator) commented Jul 5, 2024

Overview

Adding chat mode support for StaticLLMPipeline.

The current implementation is naive: it aggregates the entire chat conversation and passes it as a new prompt on every `generate` call.
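The aggregation strategy described above can be sketched roughly as follows. This is a minimal standalone illustration of the idea, not the pipeline's actual code; `ChatTurn` and `build_prompt` are hypothetical names:

```cpp
#include <string>
#include <vector>

// Hypothetical representation of one chat turn (role + text).
struct ChatTurn {
    std::string role;     // e.g. "user" or "assistant"
    std::string content;
};

// Naive chat support: rebuild the full prompt from the entire history
// on every generate call, as the PR description outlines.
std::string build_prompt(const std::vector<ChatTurn>& history) {
    std::string prompt;
    for (const auto& turn : history) {
        prompt += turn.role + ": " + turn.content + "\n";
    }
    prompt += "assistant: ";  // cue the model to produce the next reply
    return prompt;
}
```

The trade-off is simplicity versus cost: no KV-cache state has to survive between calls, but each turn re-processes the whole conversation.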

@TolyaTalamanov TolyaTalamanov force-pushed the at/support-chat-model-for-static-llm-pipeline branch from 7ba018a to 670b1d6 on July 5, 2024 08:15
…genai into at/support-chat-model-for-static-llm-pipeline
* Adapt for new start_chat(const string&) signature
* Handle "ignore_eos" config option
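The `"ignore_eos"` option mentioned above controls whether generation stops when the model emits the end-of-sequence token. A rough standalone sketch of such a stop check (hypothetical names, not the pipeline's actual implementation):

```cpp
#include <cstddef>
#include <cstdint>

// Hypothetical stop condition for a greedy decoding loop: generation
// normally ends on the EOS token, unless ignore_eos is set in the
// generation config; a hard length limit always applies.
bool should_stop(int64_t last_token, int64_t eos_token_id,
                 std::size_t generated, std::size_t max_new_tokens,
                 bool ignore_eos) {
    if (generated >= max_new_tokens) {
        return true;  // length limit reached
    }
    if (!ignore_eos && last_token == eos_token_id) {
        return true;  // stop on EOS only when it is not ignored
    }
    return false;
}
```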
@TolyaTalamanov TolyaTalamanov force-pushed the at/support-chat-model-for-static-llm-pipeline branch 2 times, most recently from 91a0871 to 20142e2 on July 15, 2024 14:21, and then from 20142e2 to 7d826a3 on July 15, 2024 14:26
@pavel-esir pavel-esir (Contributor) left a comment


I think there was a typo, but overall looks good. Approved

@Wovchena Wovchena added this pull request to the merge queue Jul 17, 2024
@github-merge-queue github-merge-queue bot removed this pull request from the merge queue due to failed status checks Jul 17, 2024
@Wovchena Wovchena added this pull request to the merge queue Jul 17, 2024
Merged via the queue into openvinotoolkit:master with commit 7f5e8d2 Jul 17, 2024
27 checks passed
TolyaTalamanov added a commit to TolyaTalamanov/openvino.genai that referenced this pull request Jul 22, 2024

Co-authored-by: Pavel Esir <[email protected]>
@ilya-lavrenov ilya-lavrenov self-assigned this Jul 31, 2024
6 participants