
[Good First Issue]: Verify youri-7b-chat with GenAI text_generation #272

Open
p-wysocki opened this issue Mar 1, 2024 · 22 comments
Labels: good first issue (Good for newcomers)

@p-wysocki
Collaborator
Context

This task concerns enabling tests for youri-7b-chat. You can find more details in the openvino_notebooks LLM chatbot README.md.

Please ask general questions in the main issue at #259

What needs to be done?

Described in the main Discussion issue at: #259
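
For reference, the usual verification flow for these "Verify <model> with GenAI text_generation" issues looks roughly like the sketch below. The HuggingFace model id rinna/youri-7b-chat, the requirements-file path, and the sample name are assumptions based on similar issues; the authoritative steps are in #259.

# install export dependencies (the path may differ depending on the repo layout)
pip install --upgrade-strategy eager -r ./samples/export-requirements.txt
# export the model to OpenVINO IR
optimum-cli export openvino --trust-remote-code --model rinna/youri-7b-chat youri-7b-chat
# run one of the text generation samples against the exported model
./build/samples/cpp/greedy_causal_lm/greedy_causal_lm ./youri-7b-chat/ "こんにちは"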

Example Pull Requests

Described in the main Discussion issue at: #259

Resources

Contact points

Described in the main Discussion issue at: #259

Ticket

No response

@RitikaxShakya

.take


github-actions bot commented Mar 7, 2024

Thank you for looking into this issue! Please let us know if you have any questions or require any help.

@p-wysocki p-wysocki moved this from Contributors Needed to Assigned in Good first issues Mar 7, 2024
@RitikaxShakya

Hello @p-wysocki! I am getting these errors while downloading and converting the model and tokenizers. During the process, warnings related to TracerWarning appear; they indicate potential issues with converting tensors to Python boolean values, which might cause inaccuracies in tracing.
The script also tries to send some events, presumably for logging or monitoring, but it runs into SSL connection and HTTP errors, which seem to be related to network connectivity or server-side issues. I made sure that all dependencies are up to date. Is there anything I am missing? Please guide me on this. Thank you!
[two screenshots of the error output]

@p-wysocki
Collaborator Author

cc @pavel-esir

@Wovchena
Collaborator

I never encountered this, but it feels like an SSL setup issue on your side. If searching doesn't help, I'd suggest trying a different machine. Maybe it's your firewall or something similar.
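
A couple of generic checks that can help narrow this down (nothing OpenVINO-specific; the proxy address and CA-bundle path below are placeholders):

curl -I https://huggingface.co                        # should complete without certificate errors
python3 -c "import certifi; print(certifi.where())"   # shows which CA bundle Python/requests will use
# if you are behind a corporate proxy, export it and, if needed, its CA bundle
export HTTPS_PROXY=http://proxy.example.com:8080
export REQUESTS_CA_BUNDLE=/path/to/corporate-ca.pem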

@mlukasze mlukasze moved this from Assigned to Contributors Needed in Good first issues Sep 18, 2024
@jgyasu

jgyasu commented Dec 7, 2024

.take


github-actions bot commented Dec 7, 2024

Thank you for looking into this issue! Please let us know if you have any questions or require any help.

@jgyasu

jgyasu commented Dec 10, 2024

Hi @p-wysocki, I was looking at #259 and I am trying to understand what needs to be done here. The gist is that I have to verify this LLM and add it to a list of supported models in text_generation/causal_lm/cpp/README.md? But I can't find any text_generation directory in the openvino.genai repo.

@Wovchena
Collaborator

Wovchena commented Dec 11, 2024

The samples were regrouped. They are here now: https://github.com/openvinotoolkit/openvino.genai/tree/master/samples/cpp

  1. beam_search_causal_lm
  2. benchmark_genai
  3. chat_sample
  4. greedy_causal_lm
  5. multinomial_causal_lm
  6. prompt_lookup_decoding_lm
  7. speculative_decoding_lm
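
Once a model has been exported to OpenVINO IR, each of these samples takes the model directory (and, for most of them, a prompt) on the command line. A minimal sketch, assuming the samples were built under ./build/ and the model was exported to ./youri-7b-chat/:

source ov/setupvars.sh
./build/samples/cpp/greedy_causal_lm/greedy_causal_lm ./youri-7b-chat/ "こんにちは"
./build/samples/cpp/chat_sample/chat_sample ./youri-7b-chat/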

@ShubhamSachdeva311205

.take

github-actions bot

Thanks for being interested in this issue. It looks like this ticket is already assigned to a contributor. Please communicate with the assigned contributor to confirm the status of the issue.

@ShubhamSachdeva311205

@mentors I am new to open source, and this is my first issue, so please bear with me if my question is silly.

I built OpenVINO successfully, but when I tried to install the dependencies from here

I get this error.
I have narrowed the cause down to transformers-stream-generator. I searched around a bit and also asked ChatGPT, which suggested installing jstyleson, but that gives the same error. I'm not sure what is causing it or how to fix it, so any help would be really appreciated.

I'm on macOS (Sequoia 15.2).
Some posts suggested it might be because I'm using an unsupported Python version; mine is 3.11.2.
I haven't found anything else related to this online.

error:

[screenshot of the installation error]

@jgyasu jgyasu removed their assignment Dec 16, 2024
@ShubhamSachdeva311205

.take

github-actions bot

Thank you for looking into this issue! Please let us know if you have any questions or require any help.

@Wovchena
Collaborator

The error says

Can not execute `setup.py` since setuptools is not available in the build environment.

Do you have setuptools installed? Try `pip install setuptools`.

@ShubhamSachdeva311205

[screenshot] Yep, I have the requirements satisfied.

@Wovchena
Collaborator

I don't have a good solution.

  1. Find whose dependency transformers-stream-generator and jstyleson are and try removing them; maybe they are not required for your task (one way to check is sketched after this list). You can also contact the authors.

  2. Try downgrading Python.

  3. "i tried to install dependencies from here"

    From where?

  4. Is the original error message the same as in your screenshot?
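
A generic way to find where transformers-stream-generator and jstyleson come from (standard tooling, nothing repository-specific):

# search the requirements files in the repo for the two packages
grep -rn --include="*requirements*.txt" -e "stream.generator" -e "jstyleson" .
# if they did get installed, show which packages depend on them
pip install pipdeptree
pipdeptree --reverse --packages transformers-stream-generator,jstyleson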

@ShubhamSachdeva311205

ShubhamSachdeva311205 commented Dec 19, 2024

I'm sorry if I come across as annoying right now, but I'm stuck on an issue and don't really know how to describe it, so I'll type out every step that I followed.

As you suggested, I tried downgrading my Python version and ended up installing 3 different Python versions on my Mac, which created a whole lot of other issues, so I decided to delete everything and start over again.

I will list all the steps I followed, so if I have made any mistakes you can correct me.

I started off by following this link

brew install coreutils scons
xcode-select --install

Then I took a look at @rk119's work on:
[Good First Issue]: Verify chatglm3-6b with GenAI text_generation

This step in particular.

After that I tried doing the same thing on my Mac:

git clone --recursive https://github.com/openvinotoolkit/openvino.genai.git
cd openvino.genai
git submodule update --remote --init
cd ../
curl --output ov.tgz https://storage.openvinotoolkit.org/repositories/openvino/packages/nightly/2025.0.0-17646-691385e7dde/m_openvino_toolkit_macos_12_6_2025.0.0.dev20241218_arm64.tgz
tar -xzf ov.tgz
mv m_openvino_toolkit_macos_12_6_2025.0.0.dev20241218_arm64 openvino.genai/ov
cd openvino.genai
source ov/setupvars.sh

Then I proceeded with:

cmake -DCMAKE_BUILD_TYPE=Release -S ./ -B ./build/
cmake --install ./build/ --config Release --prefix ov
cmake --build ./build/ --config Release --target package -- -j10

(At this point I'm assuming everything is built.)

Then I went to the README file and followed the steps given:

pip install --upgrade-strategy eager -r ../../export-requirements.txt

optimum-cli export openvino --trust-remote-code --model TinyLlama/TinyLlama-1.1B-Chat-v1.0 TinyLlama-1.1B-Chat-v1.0

Then I followed the steps given here to build the sample:

cd /Users/shubhamsachdeva/openvino.genai/ov/
source setupvars.sh
build_samples.sh
mkdir build
cd build
cmake -DCMAKE_BUILD_TYPE=Release /Users/shubhamsachdeva/openvino.genai/ov/samples/cpp

[screenshot of the cmake error]

@Wovchena
Collaborator

You've built the samples two times and the third attempt failed. My guess is that you switched terminals between source setupvars.sh and cmake -DCMAKE_BUILD_TYPE=Release /Users/shubhamsachdeva/openvino.genai/ov/samples/cpp, which resulted in the error. But you can use the samples from the two previous successful builds.

  1. The first one was built as part of the GenAI compilation:
     cmake -DCMAKE_BUILD_TYPE=Release -S ./ -B ./build/
     cmake --install ./build/ --config Release --prefix ov
     cmake --build ./build/ --config Release --target package -- -j10
    You should be able to find the built samples in /Users/shubhamsachdeva/openvino.genai/build/samples/cpp/. For example, greedy_causal_lm should be at /Users/shubhamsachdeva/openvino.genai/build/samples/cpp/greedy_causal_lm/greedy_causal_lm (a sample invocation is sketched below).
  2. The second samples build was triggered by build_samples.sh, but let's stick to the artefacts produced by the previous point.
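
For illustration, running the sample built in step 1 against the TinyLlama model exported earlier would look roughly like this (paths taken from this thread; the prompt is arbitrary):

source /Users/shubhamsachdeva/openvino.genai/ov/setupvars.sh
/Users/shubhamsachdeva/openvino.genai/build/samples/cpp/greedy_causal_lm/greedy_causal_lm ./TinyLlama-1.1B-Chat-v1.0/ "Why is the Sun yellow?"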

@FReakYdiVi

.take

github-actions bot

Thanks for being interested in this issue. It looks like this ticket is already assigned to a contributor. Please communicate with the assigned contributor to confirm the status of the issue.

@ShubhamSachdeva311205

Hey, I tried doing it and it worked for TinyLlama, but when I try to do the same for youri it terminates the task for some reason.

[screenshot of the terminated run]
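
One possible explanation (an assumption, not confirmed in this thread): a 7B-parameter model in fp16 needs roughly 14 GB for the weights alone, so the export or the sample may simply be killed for lack of memory. optimum-cli can export with compressed weights via --weight-format, which reduces the footprint considerably, for example:

optimum-cli export openvino --trust-remote-code --model rinna/youri-7b-chat --weight-format int4 youri-7b-chat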
