Question
Hi,
After training a text classifier with Flair in Colab, I followed this guide and successfully created ONNX embeddings and quantized my model. The final step saves this model, but it isn't an actual ONNX model; it remains a PyTorch model.
Is there a way to correctly export or save the model in the ONNX format? When I attempt to, the resulting file cannot be loaded by ONNX.
The new model works in Flair, but the aim here is to run it in production via ONNX Runtime; that's why we use ONNX, right? ;)
Following the guide, I ran:
For example, I attempt to load the saved file with the onnx package:
It errors with:
Environment: Google Colab, Python 3.10.12
Versions: