Bumping flash attention version to 2.6.3 and adding option for softcap in attention and lm_head logits. #7295
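The "softcap" option here most likely refers to tanh-based logit soft-capping, where scores are smoothly bounded before the softmax by scaling through a tanh. A minimal sketch of that idea, assuming the `cap * tanh(x / cap)` formulation (the function name and cap value below are illustrative, not the PR's actual API):

```python
import numpy as np

def softcap(logits, cap):
    # Soft-cap: smoothly squash values into the open interval (-cap, cap).
    # Near zero the function is approximately the identity, so small
    # logits pass through almost unchanged while extreme values saturate.
    return cap * np.tanh(logits / cap)

scores = np.array([-100.0, 0.0, 5.0, 100.0])
capped = softcap(scores, cap=50.0)  # cap value chosen for illustration
```

Applied to attention scores or lm_head logits, this bounds the values fed into the softmax, which can improve numerical stability without hard clipping.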

Annotations

This job succeeded with 1 warning.