Introduce 6-bit quantization for Llama in torchchat #3992
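The PR body is not shown here, so the exact scheme torchchat adopts is unknown. As a rough illustration only, a minimal sketch of symmetric per-group 6-bit weight quantization (all names hypothetical, not torchchat's API, and the group size is an assumed default):

```python
import torch

def quantize_6bit(weight: torch.Tensor, group_size: int = 32):
    """Symmetric per-group 6-bit quantization; codes lie in [-32, 31].

    Assumes weight.numel() is divisible by group_size (sketch only).
    """
    orig_shape = weight.shape
    w = weight.reshape(-1, group_size)
    # Per-group scale maps the largest magnitude onto the int6 range.
    scale = w.abs().amax(dim=1, keepdim=True).clamp(min=1e-8) / 31.0
    q = torch.clamp(torch.round(w / scale), -32, 31).to(torch.int8)
    return q.reshape(orig_shape), scale

def dequantize_6bit(q: torch.Tensor, scale: torch.Tensor, group_size: int = 32):
    """Reconstruct an approximation of the original weights."""
    w = q.reshape(-1, group_size).to(scale.dtype) * scale
    return w.reshape(q.shape)

# Example round trip on a random weight matrix.
w = torch.randn(4096, 4096)
q, s = quantize_6bit(w)
w_hat = dequantize_6bit(q, s)
print((w - w_hat).abs().max())
```

The actual PR may pack the 6-bit codes more tightly, use asymmetric quantization, or integrate with torchao kernels; the sketch above only shows the basic quantize/dequantize math.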


Annotations: 1 warning. This job succeeded.