Introduce 6-bit quantization for Llama in torchchat #635
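
The PR body and diff are not shown here, so the actual implementation is unknown. As a rough illustration only, the sketch below shows one common way 6-bit groupwise weight quantization can be expressed in PyTorch; the function names, the `group_size` parameter, and the int8 storage (without bit-packing) are all assumptions for illustration, not torchchat's code.

```python
# Hypothetical sketch of 6-bit symmetric groupwise quantization.
# NOT the implementation from PR #635; an illustration of the general idea.
import torch

def quantize_6bit_groupwise(weight: torch.Tensor, group_size: int = 32):
    """Quantize a 2-D weight matrix to signed 6-bit values in [-32, 31],
    with one scale per group of `group_size` input elements. Values are
    stored in int8 for simplicity; a real kernel would bit-pack them."""
    out_features, in_features = weight.shape
    assert in_features % group_size == 0
    w = weight.reshape(out_features, in_features // group_size, group_size)

    # Per-group scale so the largest magnitude maps to 31.
    max_abs = w.abs().amax(dim=-1, keepdim=True)
    scales = max_abs.clamp(min=1e-9) / 31.0

    q = torch.clamp(torch.round(w / scales), -32, 31).to(torch.int8)
    return q.reshape(out_features, in_features), scales.squeeze(-1)

def dequantize_6bit_groupwise(q: torch.Tensor, scales: torch.Tensor, group_size: int = 32):
    """Reconstruct an approximate float weight from 6-bit values and scales."""
    out_features, in_features = q.shape
    qg = q.reshape(out_features, in_features // group_size, group_size).to(torch.float32)
    return (qg * scales.unsqueeze(-1)).reshape(out_features, in_features)
```

The trade-off such a scheme targets is the usual one for sub-8-bit weights: roughly 25% less memory than 8-bit storage at the cost of a coarser quantization grid, mitigated by keeping groups small so each scale tracks local weight magnitudes.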

Annotations

2 warnings

This job succeeded