LLaVA support #478
Comments
+1, would absolutely love running LLaVA on trn1 and inf2
Bump
+1
It would be great to add a VLM to the supported models.
Please check this for LLaVA 1.6
This issue is stale because it has been open 30 days with no activity. Remove the stale label or comment, or this will be closed in 5 days.
This issue was closed because it has been stalled for 5 days with no activity.
Feature request
LLaVA support is already present in huggingface/transformers.
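For reference, loading LLaVA 1.6 through transformers works roughly as follows. This is a minimal sketch; the llava-hf/llava-v1.6-vicuna-7b-hf checkpoint and the prompt format are assumptions based on the HF-converted releases (the liuhaotian checkpoints use the original LLaVA format and are not directly loadable by transformers):

from PIL import Image
from transformers import LlavaNextProcessor, LlavaNextForConditionalGeneration

# Assumed HF-converted checkpoint, not the original liuhaotian one.
model_id = "llava-hf/llava-v1.6-vicuna-7b-hf"
processor = LlavaNextProcessor.from_pretrained(model_id)
model = LlavaNextForConditionalGeneration.from_pretrained(model_id)

image = Image.open("example.jpg")
# Vicuna-style prompt format used by the HF LLaVA 1.6 vicuna variants.
prompt = "USER: <image>\nWhat is shown in this image? ASSISTANT:"
inputs = processor(images=image, text=prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=100)
print(processor.decode(output[0], skip_special_tokens=True))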
But when I try to export the LLaVA model to Neuron format with the following command, it throws an error:
optimum-cli export neuron --model liuhaotian/llava-v1.6-vicuna-7b --disable-validation /llava/
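(The actual error output was not captured in the report.) Until optimum-neuron handles the architecture end to end, one conceivable interim approach is to compile individual submodules with torch_neuronx directly, for example the CLIP vision tower. A minimal sketch, assuming the HF-format checkpoint above and that model.vision_tower is the attribute path in your transformers version:

import torch
import torch_neuronx
from transformers import LlavaNextForConditionalGeneration

model_id = "llava-hf/llava-v1.6-vicuna-7b-hf"  # assumed HF-format checkpoint
model = LlavaNextForConditionalGeneration.from_pretrained(model_id)

class VisionTowerWrapper(torch.nn.Module):
    """Wraps the vision tower so tracing sees plain tensors in and out."""
    def __init__(self, tower):
        super().__init__()
        self.tower = tower

    def forward(self, pixel_values):
        return self.tower(pixel_values).last_hidden_state

# Attribute path assumed; it may differ across transformers versions.
wrapper = VisionTowerWrapper(model.vision_tower).eval()
example = torch.rand(1, 3, 336, 336)  # CLIP ViT-L/14 @ 336px input resolution
traced = torch_neuronx.trace(wrapper, example)
traced.save("llava_vision_tower_neuron.pt")

This compiles only the image encoder; the language model and the multimodal projector would still need their own compilation path, which is presumably what proper optimum-neuron support would provide.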
Motivation
I'd like to run LLaVA on AWS Inferentia.
Your contribution
I can help with testing the eventual implementation.