
Update vllm_worker.py fix bug #2491 vllm 0.2.0 version from vllm.engine.async_llm_engine import AsyncLLMEngine #2493

Closed
wants to merge 5 commits

Conversation


@exceedzhang exceedzhang commented Sep 29, 2023

Why are these changes needed?

Update vllm_worker.py to fix bug #2491: with vllm 0.2.0, the worker imports the engine via `from vllm.engine.async_llm_engine import AsyncLLMEngine`.

Related issue number (if applicable)

Checks

  • I've run format.sh to lint the changes in this PR.
  • I've included any doc changes needed.
  • I've made sure the relevant tests are passing (if applicable).

@exceedzhang
Author

I fixed the bug in #2493. It works well!

Review comment on fastchat/serve/vllm_worker.py (outdated, resolved)
exceedzhang and others added 4 commits October 4, 2023 19:56
Use try/except so the import works with both the old version and the new version (0.2.0)
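The commit above describes a try/except import fallback. A minimal sketch of that pattern follows; the vllm import paths are taken from the PR discussion, and the final stub fallback is an addition here so the snippet runs even where vllm is not installed (the real vllm_worker.py would simply let the ImportError propagate):

```python
# Compatibility shim sketch: try the top-level export first, then the
# explicit module path introduced in the PR, then fall back to a stub
# so this example is runnable without vllm installed.
try:
    from vllm import AsyncLLMEngine  # top-level export in most vllm releases
except ImportError:
    try:
        # explicit path per this PR, for layouts without the top-level export
        from vllm.engine.async_llm_engine import AsyncLLMEngine
    except ImportError:
        AsyncLLMEngine = None  # vllm absent entirely (example-only fallback)
```

Either way the worker code below can reference a single `AsyncLLMEngine` name regardless of which vllm layout is present.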
@merrymercy
Member

I saw this in the latest vllm code: https://github.com/vllm-project/vllm/blob/acbed3ef40f015fcf64460e629813922fab90380/vllm/__init__.py#L4
It seems this PR is not necessary.
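The observation above is that vllm re-exports AsyncLLMEngine from its package `__init__.py`, so the plain `from vllm import AsyncLLMEngine` keeps working. A small probe (an illustration added here, not part of the PR) can confirm this against whatever vllm copy is installed, degrading gracefully when none is:

```python
import importlib.util

# Probe whether vllm is importable and, if so, whether AsyncLLMEngine is
# re-exported at the package top level (as its __init__.py does).
spec = importlib.util.find_spec("vllm")
if spec is None:
    top_level_export = None  # vllm not installed in this environment
else:
    import vllm
    top_level_export = hasattr(vllm, "AsyncLLMEngine")
print(top_level_export)
```

If this prints `True`, the fallback import in this PR is redundant, which matches the reasoning for closing it.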

@exceedzhang exceedzhang closed this Oct 5, 2023