StaticLLMPipeline: Enable chat test #1117

Status: Open. Wants to merge 21 commits into base: master; the diff below shows changes from 16 of the 21 commits.

Commits:
f7a63e6  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.…  (TolyaTalamanov, Oct 17, 2024)
f87b049  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.…  (TolyaTalamanov, Oct 17, 2024)
d584e5d  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.…  (TolyaTalamanov, Oct 18, 2024)
66e384c  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.…  (TolyaTalamanov, Oct 18, 2024)
614da55  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.…  (TolyaTalamanov, Oct 20, 2024)
3acec5b  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.…  (TolyaTalamanov, Oct 21, 2024)
2470613  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.…  (TolyaTalamanov, Oct 22, 2024)
e640af3  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.…  (TolyaTalamanov, Oct 25, 2024)
13ce329  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.…  (TolyaTalamanov, Oct 30, 2024)
3f318be  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.…  (TolyaTalamanov, Oct 30, 2024)
fbd14c3  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.…  (TolyaTalamanov, Oct 31, 2024)
d4fd072  Merge branch 'master' of https://github.com/openvinotoolkit/openvino.…  (TolyaTalamanov, Oct 31, 2024)
b3e737c  Enable chat test  (TolyaTalamanov, Oct 31, 2024)
81adab0  Update tests/python_tests/test_llm_pipeline_static.py  (andrei-kochin, Nov 1, 2024)
fb21060  Update test_llm_pipeline_static.py  (TolyaTalamanov, Nov 1, 2024)
2a8a541  Merge branch 'master' into at/static-llm-pipeline-enable-chat-test  (ilya-lavrenov, Nov 5, 2024)
1eccfae  Update test_llm_pipeline_static.py  (TolyaTalamanov, Dec 24, 2024)
e22945c  Merge branch 'master' into at/static-llm-pipeline-enable-chat-test  (TolyaTalamanov, Dec 24, 2024)
f11c96f  Update test_llm_pipeline_static.py  (TolyaTalamanov, Dec 24, 2024)
cc68e28  Update test_llm_pipeline_static.py  (TolyaTalamanov, Dec 24, 2024)
5ed704a  Merge branch 'master' into at/static-llm-pipeline-enable-chat-test  (TolyaTalamanov, Dec 27, 2024)
6 changes: 2 additions & 4 deletions in tests/python_tests/test_llm_pipeline_static.py

@@ -132,20 +132,18 @@ def test_max_number_of_tokens():
     assert len(encoded_results.tokens[0]) == num_tokens


-# FIXME: Known problem, output differs from stateful pipeline starting from 3rd prompt!
-@pytest.mark.skipif(sys.platform in ["darwin", "linux"], reason="Not supposed to work on mac. Segfault on linux CI")
+@pytest.mark.skip(reason="JIRA-144780: Output differs from stateful pipeline")
 @pytest.mark.precommit
 @pytest.mark.nightly
-def test_chat_generation(model_descr):
Review comment from TolyaTalamanov (Collaborator, Author):

Thanks!

+def test_chat_generation():
     questions = [
         '1+1=',
         'What is the previous answer?',
         'Why is the Sun yellow?',
         'What was my first question?'
     ]

-    model_path = get_chat_models_lists()[0][1]
+    model_path = get_chat_models_list()[0][1]
Review comment from ilya-lavrenov (Contributor), Nov 19, 2024:
It's not a model path, it's a model_id. E.g., from the CI error we can see:

model_path = WindowsPath('Qwen2-0.5B-Instruct'), device = 'CPU'

which means the model is not even converted by Optimum.

In other places it's used like:

    pipe = read_model(get_models_list()[0])[4]

where read_model converts the model and creates a pipeline on top of it.

So, the question is: have you run the tests locally? Do they even magically pass?
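For context, a minimal sketch of the pattern this comment describes, assuming the helpers in tests/python_tests/ov_genai_test_utils.py behave as stated; the tuple index and the smoke-check prompt below are illustrative, not confirmed:

    from ov_genai_test_utils import get_models_list, read_model

    # read_model() downloads and converts the model (via Optimum) when needed,
    # then builds a pipeline on top of the converted artifacts; per the usage
    # above, index 4 of the returned tuple is assumed to be that pipeline.
    pipe = read_model(get_models_list()[0])[4]
    print(pipe.generate('1+1='))  # hypothetical smoke check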

Reply from TolyaTalamanov (Collaborator, Author):

get_chat_models_list() returns List[tuple(model_id, model_path)]:
https://github.com/openvinotoolkit/openvino.genai/blob/master/tests/python_tests/ov_genai_test_utils.py#L106

I'd assume get_chat_models_list()[0][1] is the path of the first model, but perhaps I'm wrong... At least for get_models_list() it works fine: https://github.com/openvinotoolkit/openvino.genai/blob/master/tests/python_tests/test_llm_pipeline_static.py#L37

Perhaps the problem is that I need to call read_model explicitly at least once; let me try that.

> So, the question is: have you run the tests locally? Do they even magically pass?

I ran them with my own list of models available on my local setup, so yes, they pass there, but I didn't check whether this machinery works with the default models.
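The fix being floated here might look something like the following one-liner. This is hypothetical: it assumes read_model accepts the (model_id, model_path) tuple, as in the existing read_model(get_models_list()[0]) usage, and that index 1 of its return value holds the converted model's path:

    # Hypothetical fix: convert the model via read_model() first, then take
    # the converted model's path (index 1 assumed, mirroring the
    # (model_id, model_path, ...) layout of the returned tuple).
    model_path = read_model(get_chat_models_list()[0])[1]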


     chat_history_stateful = generate_chat_history(model_path, "CPU", { }, questions)
     chat_history_static = generate_chat_history(model_path, "NPU", common_config, questions)
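generate_chat_history itself sits outside this hunk; as a rough illustration, a helper with this signature could look like the sketch below. It assumes the openvino_genai chat API (start_chat/generate/finish_chat) and that the config dict is passed through as keyword arguments; max_new_tokens and the loop shape are illustrative:

    import openvino_genai as ov_genai

    def generate_chat_history(model_path, device, pipeline_config, questions):
        # Build a pipeline on the requested device: "CPU" exercises the
        # stateful pipeline, "NPU" the static one.
        pipe = ov_genai.LLMPipeline(model_path, device, **pipeline_config)
        pipe.start_chat()
        # Ask each question in the same chat session and record the answers.
        history = [(q, pipe.generate(q, max_new_tokens=100)) for q in questions]
        pipe.finish_chat()
        return history

The test would then compare the two histories; per the removed FIXME, the divergence tracked as JIRA-144780 appears from the third prompt onward.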