I have recently started seeing pretrained models on the Hugging Face Hub with model.safetensors files in addition to the (less secure) pickled pytorch_model.bin. The transformers PreTrainedModel.from_pretrained method has a use_safetensors argument that controls whether these files should be used.
Does sentence-transformers have something equivalent, and/or does it default to using them where available?
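For context, this is roughly what I mean on the transformers side (the model name here is just an illustration):

```python
from transformers import AutoModel

# Prefer the model.safetensors weights over the pickled pytorch_model.bin
# when the checkpoint on the Hub provides them.
model = AutoModel.from_pretrained("bert-base-uncased", use_safetensors=True)
```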
Yes, model.safetensors files are automatically prioritized over pytorch_model.bin. However, the full HF Hub repository is currently downloaded, which often means both model.safetensors and pytorch_model.bin are fetched; #2345 will resolve this inefficiency.
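In other words, a plain load like the sketch below (model name is only an example) should already pick up model.safetensors when the repository has one; no extra argument is needed:

```python
from sentence_transformers import SentenceTransformer

# No safetensors-specific flag is required: the underlying transformers loader
# prefers model.safetensors over pytorch_model.bin when both are present.
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
embeddings = model.encode(["Safetensors checkpoints load without pickle."])
```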