[bug] embedding microservice almost openai compatible but not fully #926
I think run-llama/llama_index#16666 is also blocked by this. You cannot send a list of texts to an embedding endpoint.
@logan-markewich I retested run-llama/llama_index#16666 and it is working with list-of-texts input. It was fixed in PR #892.
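For reference, a minimal sketch of the list-input behavior being tested above. The helper name is hypothetical; the only assumption is the OpenAI embeddings request shape, where `input` may be a single string or a list of strings:

```python
def build_embedding_request(model, texts):
    """Build an OpenAI-style embeddings request body.

    The OpenAI embeddings API accepts `input` as either a single
    string or a list of strings, so a list must pass through
    unchanged rather than being rejected or flattened.
    """
    if isinstance(texts, str):
        texts = [texts]  # normalize single-string input to a list
    return {"model": model, "input": texts}

# A list of texts should survive as a list in the request body.
req = build_embedding_request("BAAI/bge-large-en-v1.5", ["hello", "world"])
```

An OpenAI-compatible server should accept such a body and return one embedding per list element.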
hi, we follow this doc https://platform.openai.com/docs/api-reference/embeddings to define the input/output format https://github.com/opea-project/GenAIComps/blob/main/comps/cores/proto/api_protocol.py#L83C1-L107C1. It's true that some values are null because we don't need them. |
@lkk12014402 , by "we don't need them" do you mean the OPEA reference examples don't make use of those values at the moment? Since we are integrating OPEA microservices into llama-index, and llama-index users expect OpenAI compatibility for diverse use cases and for easily swapping with other cloud/on-prem alternatives in their deployments, it would still be good to address the gaps.
Rita's and Logan's concerns are valid; we will fix this asap. @lkk12014402 @logan-markewich @rbrugaro
hi @rbrugaro, sorry, we used the wrong TEI endpoint.
@logan-markewich FYI, please be aware of this fix made by Kaokao.
We claim OpenAI compatibility, but OPEA still returns some fields as null that should be populated:
OPEA TEI: "usage":null, "model":null,
Bare TEI: "model":"BAAI/bge-large-en-v1.5","usage":{"prompt_tokens":12,"total_tokens":12}
This issue of missing fields applies to other microservices too. It would be good to fix it systematically across all of them.
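A compatibility check along these lines (a hypothetical sketch, not part of any OPEA test suite) would catch the null `model` and `usage` fields shown above across microservices:

```python
def is_openai_compatible_embedding_response(resp):
    """Check that fields an OpenAI client relies on are populated."""
    if resp.get("object") != "list" or not resp.get("data"):
        return False
    if not resp.get("model"):  # OPEA TEI returned "model": null
        return False
    usage = resp.get("usage")  # OPEA TEI returned "usage": null
    if not usage or usage.get("prompt_tokens") is None:
        return False
    return True

# Shapes taken from the responses quoted in this issue.
opea_style = {
    "object": "list",
    "data": [{"object": "embedding", "embedding": [0.1], "index": 0}],
    "model": None,
    "usage": None,
}
tei_style = {
    "object": "list",
    "data": [{"object": "embedding", "embedding": [0.1], "index": 0}],
    "model": "BAAI/bge-large-en-v1.5",
    "usage": {"prompt_tokens": 12, "total_tokens": 12},
}
```

With this check, the OPEA-style response fails while the bare-TEI response passes, which is the gap this issue reports.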