Error "Failed to buffer the request body: length limit exceeded" when supplying base64 encoded images greater than 1MB in prompt #1802
Comments
Likely related to this: #1777
I've also encountered this problem, but the "length limit exceeded" error also occurs with the idefics-9b-instruct model. That model works with images of varying dimensions, but still fails when the image is large (over 1MB).
This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.
I will revalidate on the latest TGI version shortly.
I tried this again with the latest version and the idefics-8b-chatty model instead of the llava model, and the issue persists.
model info
This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.
I tried to replicate this on the latest TGI version (2.2) and ended up with a different error:
This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.
Still experiencing the issue.
Also experiencing this issue when running with this model.
The addition of the
System Info
text-generation-launcher --env
model info
Information
Tasks
Reproduction
Use an image that is greater than 1MB and set IMAGE_PATH and API_ENDPOINT appropriately:
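A minimal sketch of such a request (the original script is not shown here), assuming the TGI `/generate` endpoint, a vision model that accepts a markdown data-URI image in the prompt, and `IMAGE_PATH`/`API_ENDPOINT` read from the environment:

```python
# Minimal reproduction sketch (assumptions: TGI /generate endpoint, a vision
# model that accepts a markdown data-URI image in the prompt, PNG input).
import base64
import os

import requests

image_path = os.environ["IMAGE_PATH"]
api_endpoint = os.environ["API_ENDPOINT"]

# Read the image and base64-encode it; encoding inflates the size by roughly a third.
with open(image_path, "rb") as f:
    encoded = base64.b64encode(f.read()).decode("utf-8")

prompt = f"![](data:image/png;base64,{encoded})What is shown in this image?"

response = requests.post(
    f"{api_endpoint}/generate",
    json={"inputs": prompt, "parameters": {"max_new_tokens": 64}},
)
print(response.status_code, response.text)
```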
This will print:
Failed to buffer the request body: length limit exceeded
If using an image less than 1MB, it generates correctly.
Expected behavior
It should generate text for the image as long as it fits within the model's context. Based on the error text, this looks related to Axum's default request body size limit, given the similarity to tokio-rs/axum#1652.
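As a rough illustration (not taken from the TGI or Axum source), base64 encodes every 3 input bytes as 4 output characters, so the request body is roughly a third larger than the image on disk before the JSON envelope is added, which is why an image near a fixed body-size limit can trip it even when the file itself looks small enough:

```python
# Back-of-the-envelope estimate of how large the request body becomes once an
# image is base64-encoded (illustrative only; the router's actual limit is not
# shown in this issue).
def base64_len(raw_bytes: int) -> int:
    # base64 maps each 3-byte group to 4 characters, padding the final group.
    return 4 * ((raw_bytes + 2) // 3)

for mib in (0.5, 1.0, 1.5, 2.0):
    raw = int(mib * 1024 * 1024)
    print(f"{mib:.1f} MiB image -> {base64_len(raw) / (1024 * 1024):.2f} MiB of base64 text")
```

If Axum's default body limit is indeed the cause, raising it on the affected routes via Axum's DefaultBodyLimit layer would be the corresponding server-side change.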