Support the Mistral model
```
- error Error: Unsupported model type: mistral
    at AutoModelForQuestionAnswering.from_pretrained (webpack-internal:///(rsc)/./node_modules/@xenova/transformers/src/models.js:3239:19)
```
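For reference, here is a minimal sketch of the kind of call that triggers this (the model id below is only an illustration, not necessarily the checkpoint I used):

```js
import { pipeline } from '@xenova/transformers';

// Any mistral-architecture checkpoint currently fails the same way;
// 'mistralai/Mistral-7B-v0.1' is used here purely as an example.
const qa = await pipeline('question-answering', 'mistralai/Mistral-7B-v0.1');
// => Error: Unsupported model type: mistral
```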
Are there any plans to add support for it soon?
Hi there! 👋 Good news: @echarlaix recently added support for exporting mistral models with Optimum, so this is now possible (PR: huggingface/optimum#1425)! On that note, do you know of any smaller variants of these models? At the moment we don't yet support >=7B models, so I won't be able to test without a smaller one.
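For anyone following along, once the transformers.js side lands, usage should presumably mirror the existing pipelines. This is only a sketch: the model id below is a placeholder, not a real checkpoint, and it assumes the ONNX weights have already been exported with Optimum and uploaded to the Hub.

```js
// Assumes the weights were first exported to ONNX with Optimum
// (roughly: optimum-cli export onnx --model <model_id> <output_dir>)
// and then uploaded to the Hugging Face Hub.
import { pipeline } from '@xenova/transformers';

// NOTE: 'Xenova/mistral-small-onnx' is a placeholder id, not a real checkpoint.
const generator = await pipeline('text-generation', 'Xenova/mistral-small-onnx');
const output = await generator('Once upon a time,', { max_new_tokens: 40 });
console.log(output);
```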
Cool! Thanks for the update @xenova!
While I couldn't find a smaller example, it looks like Mistral-7B can be pruned down to a smaller size (1.3B–3B) for local usage instead of pre-training from scratch, using Sheared-LLaMA: https://xiamengzhou.github.io/sheared-llama/
I don't know whether @xiamengzhou or @gaotianyu1350 (the authors) have released the code, or whether they already plan to apply it to Mistral.