
Refactor quantization: support torchao quantization and vLLM W8A8 (int/fp), and support mixed quantization. #323

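For context, here is a minimal sketch of what the torchao path and mixed (per-layer) quantization in the title can refer to; it is not taken from this PR. It assumes a torchao release that exposes `quantize_` with a `filter_fn` argument, and the toy model and filter functions are hypothetical. "W8A8" here means 8-bit weights with 8-bit activations, in either integer or floating-point (fp8) form.

```python
# Sketch only: illustrates torchao-style W8A8 quantization and a per-layer
# ("mixed") scheme via filter_fn. Not the code from this PR.
import torch
import torch.nn as nn
from torchao.quantization import (
    quantize_,
    int8_dynamic_activation_int8_weight,  # W8A8 (int): int8 weights + dynamic int8 activations
    int8_weight_only,                      # weight-only int8, activations stay in bf16/fp16
)

# Hypothetical toy model standing in for a stack of transformer linears.
model = nn.Sequential(
    nn.Linear(512, 512),
    nn.ReLU(),
    nn.Linear(512, 128),
).to(torch.bfloat16)

# Filters select disjoint sets of modules so each layer gets exactly one scheme.
def is_final_proj(module: nn.Module, fqn: str) -> bool:
    return isinstance(module, nn.Linear) and fqn == "2"

def is_other_linear(module: nn.Module, fqn: str) -> bool:
    return isinstance(module, nn.Linear) and not is_final_proj(module, fqn)

# Mixed quantization: W8A8 (int) for most linears, weight-only int8 for the
# final projection, applied in place by torchao's quantize_.
quantize_(model, int8_dynamic_activation_int8_weight(), filter_fn=is_other_linear)
quantize_(model, int8_weight_only(), filter_fn=is_final_proj)
```

The fp8 variant of W8A8 and vLLM-servable checkpoints follow the same idea (a different quantization config per layer group); the exact configs and export format depend on the torchao and vLLM versions in use.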

Annotations (2 warnings)

pre-commit: succeeded Nov 22, 2024 in 23s