
Refactor quantization: support torchao quant and vLLM w8a8 (int/fp), support mixed quantization. #305

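The PR title bundles several quantization paths. As a minimal, hypothetical illustration of the arithmetic behind the int variant of w8a8 (symmetric per-tensor int8, applied to both weights and activations), and not this PR's actual implementation:

```python
# Hypothetical sketch of symmetric per-tensor int8 quantization, the
# core arithmetic in int w8a8 schemes; not code from this PR.

def quantize_int8(values):
    """Map floats to int8 codes using a shared per-tensor scale."""
    amax = max(abs(v) for v in values)
    scale = amax / 127.0 if amax > 0 else 1.0
    # Round to nearest integer and clamp to the int8 range.
    q = [max(-128, min(127, round(v / scale))) for v in values]
    return q, scale

def dequantize_int8(q, scale):
    """Recover approximate float values from int8 codes."""
    return [code * scale for code in q]

weights = [0.5, -1.27, 0.02, 1.0]
q, scale = quantize_int8(weights)
recovered = dequantize_int8(q, scale)
```

In a real w8a8 kernel the int8 weight and activation matrices are multiplied with an int32 accumulator and the two scales are folded into a single rescale of the output; the fp variant replaces int8 with fp8 codes but keeps the same scale bookkeeping.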

Annotations: 2 warnings.

pre-commit: succeeded Nov 20, 2024 in 20s.