
[bug] embedding microservice almost openai compatible but not fully #926

Open
rbrugaro opened this issue Nov 20, 2024 · 8 comments · Fixed by #1035
Assignees
Labels
bug Something isn't working
Milestone

Comments

@rbrugaro
Copy link
Collaborator

We claim OpenAI compatibility, but OPEA still returns some fields as null that should be populated.
OPEA TEI: `"usage": null, "model": null`
Bare TEI: `"model": "BAAI/bge-large-en-v1.5", "usage": {"prompt_tokens": 12, "total_tokens": 12}`

This issue of missing fields applies to other microservices too. It would be good to correct it systematically across all of them.
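To make the gap concrete, here is a minimal sketch (the helper name and the example payloads are illustrative, not from the OPEA codebase) that checks which top-level fields of an embedding response are missing relative to the OpenAI format:

```python
# Hypothetical helper: fields an OpenAI-compatible /v1/embeddings
# response is expected to populate, per the issue above.
def check_openai_compat(response: dict) -> list:
    """Return the names of top-level fields that are missing or null."""
    required = ["object", "data", "model", "usage"]
    return [f for f in required if response.get(f) is None]

# Bare TEI populates everything (values abbreviated):
bare_tei = {
    "object": "list",
    "data": [{"object": "embedding", "index": 0, "embedding": [0.1, 0.2]}],
    "model": "BAAI/bge-large-en-v1.5",
    "usage": {"prompt_tokens": 12, "total_tokens": 12},
}

# The OPEA wrapper currently returns null for model and usage:
opea_tei = {**bare_tei, "model": None, "usage": None}

print(check_openai_compat(bare_tei))  # []
print(check_openai_compat(opea_tei))  # ['model', 'usage']
```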

@logan-markewich
Copy link

I think run-llama/llama_index#16666 is also blocked by this: you cannot send a list of texts to the embedding endpoint.
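For context, the OpenAI embeddings API accepts either a single string or a list of strings as `input`. A minimal sketch of building such a payload (the helper and default model name are assumptions for illustration):

```python
# Hypothetical sketch: the OpenAI embeddings API accepts "input"
# as either a single string or a list of strings.
def build_embedding_request(texts, model="BAAI/bge-large-en-v1.5"):
    """Normalize input to a list and build the request body."""
    if isinstance(texts, str):
        texts = [texts]
    return {"model": model, "input": texts}

single = build_embedding_request("hello world")
batch = build_embedding_request(["doc one", "doc two", "doc three"])
print(len(batch["input"]))  # 3
```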

@rbrugaro
Copy link
Collaborator Author

rbrugaro commented Dec 5, 2024

@logan-markewich I retested run-llama/llama_index#16666 and it works with list-of-text input; it was fixed in PR #892.
I think the only pieces missing for full OpenAI compatibility are the usage and model values.

@joshuayao joshuayao added the bug Something isn't working label Dec 6, 2024
@lkk12014402
Copy link
Collaborator

> @logan-markewich I retested run-llama/llama_index#16666 and it works with list-of-text input; it was fixed in PR #892. I think the only pieces missing for full OpenAI compatibility are the usage and model values.

Hi, we follow this doc https://platform.openai.com/docs/api-reference/embeddings to define the input/output format: https://github.com/opea-project/GenAIComps/blob/main/comps/cores/proto/api_protocol.py#L83C1-L107C1. It's true that some values are null, because we don't need them.
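A simplified sketch of the response schema described in the OpenAI reference (field names follow that doc; the actual classes in `api_protocol.py` are pydantic models and may differ in detail):

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Simplified sketch of the schema at
# https://platform.openai.com/docs/api-reference/embeddings;
# the real definitions live in comps/cores/proto/api_protocol.py.

@dataclass
class UsageInfo:
    prompt_tokens: int = 0
    total_tokens: int = 0

@dataclass
class EmbeddingResponseData:
    index: int
    embedding: List[float]
    object: str = "embedding"

@dataclass
class EmbeddingResponse:
    data: List[EmbeddingResponseData] = field(default_factory=list)
    object: str = "list"
    # Optional fields defaulting to None are what produce the
    # null values this issue reports.
    model: Optional[str] = None
    usage: Optional[UsageInfo] = None

resp = EmbeddingResponse(data=[EmbeddingResponseData(0, [0.1, 0.2])])
print(resp.model, resp.usage)  # None None
```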

@rbrugaro
Copy link
Collaborator Author

@lkk12014402, by "we don't need them" do you mean that the OPEA reference examples don't use those values at the moment? Since we are integrating OPEA microservices into llama-index, and llama-index users expect OpenAI compatibility for diverse use cases and for easily swapping in other cloud/on-prem alternatives in their deployments, it would still be good to address the gaps.

@ftian1
Copy link
Collaborator

ftian1 commented Dec 13, 2024

Rita's and Logan's concerns are valid; we will fix this ASAP. @lkk12014402 @logan-markewich @rbrugaro

@joshuayao joshuayao added this to OPEA Dec 13, 2024
@joshuayao joshuayao added this to the v1.2 milestone Dec 13, 2024
@joshuayao joshuayao moved this to In progress in OPEA Dec 13, 2024
@lkk12014402
Copy link
Collaborator

> @lkk12014402, by "we don't need them" do you mean that the OPEA reference examples don't use those values at the moment? Since we are integrating OPEA microservices into llama-index, and llama-index users expect OpenAI compatibility for diverse use cases and for easily swapping in other cloud/on-prem alternatives in their deployments, it would still be good to address the gaps.

Hi @rbrugaro, sorry, we used the wrong TEI endpoint, /embed, which doesn't return the model and usage information. I will fix it by using /v1/embeddings.
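For readers following along: TEI exposes both a native `/embed` route, which returns only the raw vectors, and an OpenAI-style `/v1/embeddings` route, which also returns `model` and `usage`. A hedged sketch of the two request shapes (host and port are assumptions; no network call is made here):

```python
# Hypothetical sketch contrasting the two TEI routes discussed above.
# /embed is TEI's native route (payload key "inputs", response is a
# bare list of vectors); /v1/embeddings is the OpenAI-compatible
# route (payload key "input", response includes model and usage).
TEI_BASE = "http://localhost:8080"  # assumption: local TEI server

def native_request(texts):
    """TEI native route: response is a bare list of vectors."""
    return {"url": f"{TEI_BASE}/embed", "json": {"inputs": texts}}

def openai_request(texts):
    """OpenAI-compatible route: response also has model and usage."""
    return {"url": f"{TEI_BASE}/v1/embeddings", "json": {"input": texts}}

print(native_request(["hi"])["url"])   # http://localhost:8080/embed
print(openai_request(["hi"])["url"])   # http://localhost:8080/v1/embeddings
```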

@lkk12014402
Copy link
Collaborator

> We claim OpenAI compatibility, but OPEA still returns some fields as null that should be populated. OPEA TEI: `"usage": null, "model": null` Bare TEI: `"model": "BAAI/bge-large-en-v1.5", "usage": {"prompt_tokens": 12, "total_tokens": 12}`
>
> This issue of missing fields applies to other microservices too. It would be good to correct it systematically across all of them.

Hi @rbrugaro, I fixed the issue in PR #1035.

@joshuayao joshuayao linked a pull request Dec 17, 2024 that will close this issue
@ftian1
Copy link
Collaborator

ftian1 commented Dec 18, 2024

@logan-markewich FYI, please be aware of this fix made by Kaokao.
