
Introduce 6-bit quantization for Llama in torchchat #116

Annotations: 2 warnings

This job succeeded