Introduce 6-bit quantization for Llama in torchchat #635
Annotations: 3 warnings. This job succeeded.