Refactor quantization: support torchao quant and vLLM W8A8 (int/fp); support mixed quantization. #324

Annotations: 2 warnings

pre-commit: succeeded Nov 22, 2024 in 22s
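
The PR title above mentions a torchao quantization backend. As a rough illustration of what weight-only quantization via torchao looks like, here is a minimal sketch assuming torchao's `quantize_` / `int8_weight_only` API (>= 0.4); the toy model is hypothetical and this is not the PR's actual integration code:

```python
# Minimal sketch: weight-only int8 quantization with torchao.
# Assumes torchao's quantize_/int8_weight_only API; the model below is a
# hypothetical stand-in, not code from this repository.
import torch
import torch.nn as nn
from torchao.quantization import quantize_, int8_weight_only

# Toy two-layer block standing in for an LLM sub-module.
model = nn.Sequential(
    nn.Linear(1024, 4096),
    nn.ReLU(),
    nn.Linear(4096, 1024),
).to(torch.bfloat16)

# Swap the Linear weights in place for int8 weight-only quantized tensors.
quantize_(model, int8_weight_only())

# The quantized model still runs in eager mode; activations stay in bf16
# (W8A16), whereas the vLLM W8A8 path named in the PR title also quantizes
# activations to 8 bits (int or fp).
x = torch.randn(2, 1024, dtype=torch.bfloat16)
with torch.no_grad():
    out = model(x)
print(out.shape)  # torch.Size([2, 1024])
```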