
StaticLLMPipeline: Enable chat test #1117

Open: wants to merge 21 commits into master from at/static-llm-pipeline-enable-chat-test

Commits (21):
f7a63e6  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.… (TolyaTalamanov, Oct 17, 2024)
f87b049  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.… (TolyaTalamanov, Oct 17, 2024)
d584e5d  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.… (TolyaTalamanov, Oct 18, 2024)
66e384c  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.… (TolyaTalamanov, Oct 18, 2024)
614da55  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.… (TolyaTalamanov, Oct 20, 2024)
3acec5b  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.… (TolyaTalamanov, Oct 21, 2024)
2470613  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.… (TolyaTalamanov, Oct 22, 2024)
e640af3  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.… (TolyaTalamanov, Oct 25, 2024)
13ce329  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.… (TolyaTalamanov, Oct 30, 2024)
3f318be  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.… (TolyaTalamanov, Oct 30, 2024)
fbd14c3  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.… (TolyaTalamanov, Oct 31, 2024)
d4fd072  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.… (TolyaTalamanov, Oct 31, 2024)
b3e737c  Enable chat test (TolyaTalamanov, Oct 31, 2024)
81adab0  Update tests/python_tests/test_llm_pipeline_static.py (andrei-kochin, Nov 1, 2024)
fb21060  Update test_llm_pipeline_static.py (TolyaTalamanov, Nov 1, 2024)
2a8a541  Merge branch 'master' into at/static-llm-pipeline-enable-chat-test (ilya-lavrenov, Nov 5, 2024)
1eccfae  Update test_llm_pipeline_static.py (TolyaTalamanov, Dec 24, 2024)
e22945c  Merge branch 'master' into at/static-llm-pipeline-enable-chat-test (TolyaTalamanov, Dec 24, 2024)
f11c96f  Update test_llm_pipeline_static.py (TolyaTalamanov, Dec 24, 2024)
cc68e28  Update test_llm_pipeline_static.py (TolyaTalamanov, Dec 24, 2024)
5ed704a  Merge branch 'master' into at/static-llm-pipeline-enable-chat-test (TolyaTalamanov, Dec 27, 2024)
14 changes: 7 additions & 7 deletions tests/python_tests/test_llm_pipeline_static.py
@@ -6,6 +6,7 @@
 import pytest
 import sys
 from ov_genai_test_utils import (
+    read_model,
     get_models_list,
     get_chat_models_list,
 )
@@ -24,7 +25,7 @@
 def generate_chat_history(model_path, device, pipeline_config, questions):
     pipe = ov_genai.LLMPipeline(model_path, device, **pipeline_config)
     pipe.start_chat()
-    chat_history = [ pipe.generate(question, max_new_tokens=50) for question in questions ]
+    chat_history = [ pipe.generate(question, max_new_tokens=50, do_sample=False) for question in questions ]
     pipe.finish_chat()
     return chat_history

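The `do_sample=False` added above pins generation to greedy decoding; without it, the stateful (CPU) and static (NPU) chat histories could differ simply because of sampling randomness. A minimal sketch of the effect, not part of the PR (the model path below is a hypothetical local OpenVINO export):

```python
# Greedy decoding (do_sample=False) is deterministic: repeated runs produce
# identical text, which is what makes a string-for-string comparison of two
# pipelines meaningful.
# Assumption: "TinyLlama-1.1B-Chat-v1.0" is a locally exported OpenVINO model.
import openvino_genai as ov_genai

pipe = ov_genai.LLMPipeline("TinyLlama-1.1B-Chat-v1.0", "CPU")
first = pipe.generate("1+1=", max_new_tokens=50, do_sample=False)
second = pipe.generate("1+1=", max_new_tokens=50, do_sample=False)
assert first == second  # greedy output is reproducible run-to-run
```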
@@ -132,23 +133,22 @@ def test_max_number_of_tokens():
     assert len(encoded_results.tokens[0]) == num_tokens
 
 
-# FIXME: Known problem, output differs from stateful pipeline starting from 3rd prompt!
-@pytest.mark.skipif(sys.platform in ["darwin", "linux"], reason="Not supposed to work on mac. Segfault on linux CI")
+@pytest.mark.skip(reason="JIRA-144780: Output differs from stateful pipeline")
 @pytest.mark.precommit
 @pytest.mark.nightly
-def test_chat_generation(model_descr):
+def test_chat_generation():
     questions = [
         '1+1=',
         'What is the previous answer?',
         'Why is the Sun yellow?',
         'What was my first question?'
     ]
 
-    model_path = get_chat_models_list()[0][1]
+    model_descr = get_chat_models_list()[0]
+    model_info = read_model((model_descr[0], model_descr[1] / '_test_chat'), add_special_tokens=False)
 
-    chat_history_stateful = generate_chat_history(model_path, "CPU", { }, questions)
-    chat_history_static = generate_chat_history(model_path, "NPU", common_config, questions)
+    chat_history_stateful = generate_chat_history(model_info[1], "CPU", { }, questions)
+    chat_history_static = generate_chat_history(model_info[1], "NPU", common_config, questions)
 
     print('npu chat: \n{chat_history_static}\n')
     print('cpu chat: \n{chat_history_stateful}')
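For context, this is roughly what the reworked test exercises end to end. A self-contained sketch under stated assumptions: `read_model` and `common_config` are repo test helpers whose definitions are not shown in this diff, so `MODEL_PATH` and `NPU_CONFIG` below are placeholders standing in for `read_model(...)[1]` and the repo's actual NPU static-pipeline configuration:

```python
# Sketch of the reworked test flow (not the repo's exact code).
# Assumptions: MODEL_PATH stands in for read_model(...)[1]; NPU_CONFIG
# stands in for the repo's common_config. Both are placeholders.
import openvino_genai as ov_genai

MODEL_PATH = "TinyLlama-1.1B-Chat-v1.0"  # hypothetical local OpenVINO export
NPU_CONFIG = {}                          # placeholder for common_config

QUESTIONS = [
    '1+1=',
    'What is the previous answer?',
    'Why is the Sun yellow?',
    'What was my first question?',
]

def generate_chat_history(model_path, device, pipeline_config, questions):
    # Same helper as in the diff: run one chat session and collect answers.
    pipe = ov_genai.LLMPipeline(model_path, device, **pipeline_config)
    pipe.start_chat()
    history = [pipe.generate(q, max_new_tokens=50, do_sample=False) for q in questions]
    pipe.finish_chat()
    return history

cpu_history = generate_chat_history(MODEL_PATH, "CPU", {}, QUESTIONS)
npu_history = generate_chat_history(MODEL_PATH, "NPU", NPU_CONFIG, QUESTIONS)

# The comparison the skip marker currently defers (JIRA-144780): with greedy
# decoding, the static NPU pipeline should reproduce the stateful CPU chat.
assert cpu_history == npu_history
```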