[Good First Issue]: Verify red-pajama-3b-chat with GenAI text_generation #263
Comments
.take
Thank you for looking into this issue! Please let us know if you have any questions or require any help.
Hello @tranchung163, are you still working on this? Is there anything we could help you with?
Hi @p-wysocki, yes, I am still working on this. I converted the red-pajama model to OpenVINO IR and ran the benchmark. This is the output I got; it seems fine to me. But I ran into some issues with the CMake files under [Text_generation]. To build and run `greedy_causal_lm <MODEL_DIR> ""`, I tried `cmake -S .\ -B .\build\ && cmake --build .\build\ --config Release -j`. I am trying to install all the required packages and fix the issues. Sorry for the late response, I will try to figure it out. Thanks.
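For reference, a minimal sketch of the two steps discussed here: exporting the model to OpenVINO IR and building the text_generation samples. The optimum-cli route, the ikala/redpajama-3b-chat checkpoint (mentioned later in the thread), and the output directory name are assumptions, not the exact commands used above.

```sh
# Export the chat checkpoint to OpenVINO IR (assumes optimum-intel is installed;
# the output directory name is illustrative).
optimum-cli export openvino --model ikala/redpajama-3b-chat redpajama-3b-chat

# Configure and build the samples in Release mode, written for a Unix-like shell
# (the Windows-style invocation quoted above is the equivalent).
cmake -S . -B build
cmake --build build --config Release -j
```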
@pavel-esir, @Wovchena could you please take a look?
@tranchung163 thanks a lot for your analysis! According to the logs, it looks like … Could you please try to cd to the OPENVINO_INSTALL_DIR path and call …
For some reason …
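The truncated suggestion most likely points at OpenVINO's environment script; here is a minimal sketch, assuming a Linux/macOS archive install (on Windows the equivalent is setupvars.bat).

```sh
# Make the OpenVINO runtime visible to CMake and to the built samples
# (<OPENVINO_INSTALL_DIR> is a placeholder for the actual install path).
cd <OPENVINO_INSTALL_DIR>
source setupvars.sh

# Then reconfigure and rebuild the samples so CMake picks up the environment.
```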
Hi @pavel-esir, thank you for your help. I set up …
@tranchung163 thanks for the update. The outputs do not look meaningful. Could you please play around with different questions and see what the results are?
Hi @pavel-esir, there is an issue while running … Then I ran these commands without any issue: …
But I built my model again with … and I got the same result.
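One way to judge whether the generations are meaningful, as suggested above, is to run the built sample with a few concrete questions instead of an empty prompt; the binary location, model directory, and prompts below are illustrative assumptions.

```sh
# Greedy decoding on a couple of simple questions; the answers should read
# as coherent English even if they are not perfectly accurate.
./build/greedy_causal_lm ./redpajama-3b-chat/ "Why is the Sun yellow?"
./build/greedy_causal_lm ./redpajama-3b-chat/ "What is OpenVINO?"
```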
Thanks for reporting the issue with installing the requirements. It could be because of the M1. I'll check the auto-gptq versions on x86 as well a bit later. Hmm, the outputs again look strange. But I noticed that you are using …
Hi @pavel-esir, thank you for your feedback. I think the new versions of auto-gptq do not support M1 macOS, so the output did not make sense. After I changed my laptop, I was able to run the test and got the expected result. I also switched to the model ikala/redpajama-3b-chat. I am going to add tests to …
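As a rough local stand-in for the tests being added (the actual CI commands are not shown in this thread), one could check that greedy decoding on the exported model directory yields a readable, on-topic answer; the prompt and the grep-based check are illustrative assumptions.

```sh
# Run the sample on a factual prompt and apply a crude sanity check on the output.
./build/greedy_causal_lm ./redpajama-3b-chat/ "What is the capital of France?" | tee out.txt
grep -qi "paris" out.txt && echo "output looks meaningful"
```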
.take
Thank you for your interest in this issue. It looks like this ticket is already assigned to a contributor. Please communicate with the assigned contributor to confirm the status of the issue.
This pull request expands the test coverage for the [redpajama-3b-chat](https://huggingface.co/ikala/redpajama-3b-chat) Large Language Model (LLM) within OpenVINO GenAI. The redpajama-3b model is a significant addition to the supported models list, and this PR verifies its functionality. (Issue #263)
Co-authored-by: Ilya Lavrenov <[email protected]>
Co-authored-by: Zlobin Vladimir <[email protected]>
Context
This task concerns enabling tests for red-pajama-3b-chat. You can find more details in the openvino_notebooks LLM chatbot README.md.
Please ask general questions in the main issue at #259
What needs to be done?
Described in the main Discussion issue at: #259
Example Pull Requests
Described in the main Discussion issue at: #259
Resources
Contact points
Described in the main Discussion issue at: #259
Ticket