Bump flash-attn from 2.6.3 to 2.7.2.post1
Bumps [flash-attn](https://github.com/Dao-AILab/flash-attention) from 2.6.3 to 2.7.2.post1.
- [Release notes](https://github.com/Dao-AILab/flash-attention/releases)
- [Commits](https://github.com/Dao-AILab/flash-attention/compare/v2.6.3...v2.7.2.post1)

---
updated-dependencies:
- dependency-name: flash-attn
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <[email protected]>
dependabot[bot] authored Dec 9, 2024
1 parent 7b8bf5f commit 4dbe9a2
Showing 1 changed file with 1 addition and 1 deletion.
setup.py (1 addition, 1 deletion)

@@ -104,7 +104,7 @@
 
 # Flash 2 group kept for backwards compatibility
 extra_deps['gpu-flash2'] = [
-    'flash-attn==2.6.3',
+    'flash-attn==2.7.2.post1',
 ]
 
 extra_deps['gpu'] = copy.deepcopy(extra_deps['gpu-flash2'])
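For context, the changed line sits inside a standard setuptools extras table. Below is a minimal, self-contained sketch of that pattern: the `gpu-flash2` and `gpu` groups and the flash-attn pin come from the diff above, while the package name and version are hypothetical placeholders.

```python
# Sketch of the setup.py extras pattern shown in the diff. Only the
# 'gpu-flash2'/'gpu' groups and the flash-attn pin come from the diff;
# the package name and version are illustrative.
import copy

from setuptools import setup

extra_deps = {}

# Flash 2 group kept for backwards compatibility
extra_deps['gpu-flash2'] = [
    'flash-attn==2.7.2.post1',
]

# 'gpu' is an alias for the flash2 group; deepcopy keeps the two lists
# independent, so a later edit to one group cannot mutate the other.
extra_deps['gpu'] = copy.deepcopy(extra_deps['gpu-flash2'])

setup(
    name='example-package',   # hypothetical; the real name is not in the diff
    version='0.0.0',          # hypothetical
    extras_require=extra_deps,
)
```

With extras wired this way, `pip install 'example-package[gpu]'` (or `[gpu-flash2]`) installs the pinned flash-attn release alongside the package.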
