Bumping flash attention version to 2.6.3 and adding an option for softcap in attention and lm_head logits. #7822

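The PR text here does not spell out how the softcap is applied, but tanh-based softcapping (as popularized by Gemma 2) is the usual technique for bounding attention scores and lm_head logits. Below is a minimal, hypothetical sketch of that math in PyTorch; the function name `soft_cap` and the cap values are illustrative assumptions, not taken from the PR or the flash-attention API.

```python
import torch

def soft_cap(logits: torch.Tensor, cap: float) -> torch.Tensor:
    """Squash values smoothly into (-cap, cap) via tanh-based softcapping.

    Hypothetical helper for illustration only; the PR wires a softcap
    option through the attention kernel and lm_head, but this shows the
    underlying transformation: cap * tanh(x / cap).
    """
    return cap * torch.tanh(logits / cap)

# Illustrative usage: cap attention scores and lm_head logits
# (shapes and cap values are made up for the example).
attn_scores = torch.randn(2, 8, 16, 16) * 100.0
lm_logits = torch.randn(2, 16, 32000) * 60.0

capped_scores = soft_cap(attn_scores, cap=50.0)
capped_logits = soft_cap(lm_logits, cap=30.0)

assert capped_scores.abs().max() < 50.0
assert capped_logits.abs().max() < 30.0
```

Because tanh is smooth and monotonic, this bounds extreme values without the hard clipping that would zero out gradients at the boundary.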
Annotations: 2 warnings. This job succeeded.