Refactor quantization: support torchao quant and vLLM W8A8 (int/FP); support mixed quantization. #306
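For readers unfamiliar with the torchao path named in the title, a minimal sketch of weight-only int8 quantization via torchao looks roughly like the following. This is illustrative only and not the code added by this PR; the toy model and layer sizes are hypothetical, and a recent torchao release is assumed.

```python
# Illustrative torchao weight-only quantization sketch (not this PR's code).
# Shows the general quantize_ + int8_weight_only API the title refers to.
import torch
import torch.nn as nn
from torchao.quantization import quantize_, int8_weight_only

# Hypothetical toy model standing in for a real LLM.
model = nn.Sequential(
    nn.Linear(1024, 1024),
    nn.ReLU(),
    nn.Linear(1024, 1024),
).to(torch.bfloat16)

# Replace eligible Linear weights with int8 weight-only quantized tensors in place.
quantize_(model, int8_weight_only())

# Quick forward pass to confirm the quantized model still runs.
x = torch.randn(1, 1024, dtype=torch.bfloat16)
with torch.no_grad():
    y = model(x)
print(y.shape)
```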

Annotations

2 warnings

pre-commit: succeeded on Nov 20, 2024 in 19s