
Bumping flash attention version to 2.6.3 and adding an option for softcap in attention and lm_head logits. #7828
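For context, logit soft-capping (popularized by models such as Gemma 2) bounds attention scores and output logits smoothly with a scaled tanh rather than hard clipping. The sketch below illustrates the general idea in PyTorch; the function and tensor names are illustrative only and are not the exact API introduced by this PR.

```python
import torch

def soft_cap(logits: torch.Tensor, cap: float) -> torch.Tensor:
    """Soft-cap logits to the open interval (-cap, cap) via a scaled tanh.

    Standard formulation: cap * tanh(logits / cap). Values much smaller than
    `cap` pass through nearly unchanged; large values are smoothly bounded
    instead of being clipped.
    """
    return cap * torch.tanh(logits / cap)

# Illustrative usage (shapes and cap values are hypothetical, not this PR's API):
# attention scores before the softmax
scores = torch.randn(2, 8, 16, 16)      # (batch, heads, q_len, k_len)
capped_scores = soft_cap(scores, cap=50.0)

# lm_head logits before sampling
lm_logits = torch.randn(2, 16, 32000)   # (batch, seq_len, vocab_size)
capped_logits = soft_cap(lm_logits, cap=30.0)
```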
