
Enable float8 attention support (q/k/v) #5641

test (CPU 2.4, linux.4xlarge, torch==2.4.0 --index-url https://download.pytorch.org/whl/cpu, cpu) / linux-job

succeeded Dec 7, 2024 in 8m 33s
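
For context, the PR title indicates that the attention inputs (q, k, v) are quantized to float8. The sketch below illustrates one plausible shape of that idea in PyTorch: per-tensor symmetric quantization to `float8_e4m3fn` followed by dequantization and a standard scaled-dot-product attention call. The scale computation, the dequantize-then-SDPA fallback, and every name here are assumptions for illustration, not the implementation from this PR.

```python
# Hypothetical sketch of float8 (e4m3) q/k/v quantization around attention.
# NOT the PR's actual implementation; scales and the dequant path are assumed.
import torch
import torch.nn.functional as F


def quantize_fp8(t: torch.Tensor) -> tuple[torch.Tensor, torch.Tensor]:
    """Per-tensor symmetric quantization to float8_e4m3fn (assumed scheme)."""
    fp8_max = torch.finfo(torch.float8_e4m3fn).max
    scale = t.abs().amax().clamp(min=1e-12) / fp8_max
    return (t / scale).to(torch.float8_e4m3fn), scale


def fp8_attention(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
    # Quantize each operand, dequantize back to the compute dtype, and
    # fall back to standard scaled-dot-product attention.
    (q8, qs), (k8, ks), (v8, vs) = map(quantize_fp8, (q, k, v))
    qd = q8.to(q.dtype) * qs
    kd = k8.to(k.dtype) * ks
    vd = v8.to(v.dtype) * vs
    return F.scaled_dot_product_attention(qd, kd, vd)


if __name__ == "__main__":
    # Shapes: (batch, heads, seq_len, head_dim)
    q = torch.randn(1, 8, 128, 64)
    k = torch.randn(1, 8, 128, 64)
    v = torch.randn(1, 8, 128, 64)
    out = fp8_attention(q, k, v)
    print(out.shape)  # torch.Size([1, 8, 128, 64])
```

Note that `torch.float8_e4m3fn` is available in the torch==2.4.0 build exercised by the CI job above; a fused float8 attention kernel would avoid the explicit dequantize step shown here, which exists only to keep the sketch runnable on CPU.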