Really great project!
Would there be any difference compared with directly using the following vLLM OpenAI-compatible inference command?
Answered by hiyouga, Apr 24, 2024
We save the training-time template into the tokenizer, so in theory it can be migrated to the vLLM engine without loss. Still, it is best to verify whether the two produce any differences.
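A minimal sketch for that verification (the export path `./exported_model` and the prompt are placeholders, not from this thread): since the chat template is stored in `tokenizer_config.json`, and vLLM's OpenAI-compatible server uses the tokenizer's `chat_template` by default, rendering a prompt with `transformers` shows the exact format both inference paths should apply.

```python
# Sketch: inspect the chat template saved by LLaMA-Factory during export.
# "./exported_model" is a placeholder path for the exported checkpoint.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./exported_model")

messages = [{"role": "user", "content": "Hello!"}]

# apply_chat_template renders with the template stored in tokenizer_config.json;
# vLLM's OpenAI-compatible server reads the same field, so this string is what
# the server should feed to the model for an identical request.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```

If the printed prompt matches the format used during training, the two inference paths should behave the same; a mismatch suggests the template was not saved or not picked up by the engine.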
Answer selected by zuxin666