Precompute flash attention padding info #4080
Triggered via pull request January 18, 2024 00:08
@ShashankMosaicML synchronize #880
Status: Success
Total duration: 22m 32s
pr-gpu.yaml

on: pull_request_target
Matrix: pytest-gpu
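
The run metadata above shows a `pr-gpu.yaml` workflow triggered on `pull_request_target` with a `pytest-gpu` matrix. As a rough sketch of what such a workflow file might look like, here is a minimal hypothetical example; the job names, runner label, matrix axes, and test command are assumptions, not taken from the actual file:

```yaml
# Hypothetical sketch of a pr-gpu.yaml workflow; names and values below
# are illustrative assumptions, not the real file's contents.
name: pr-gpu
on:
  pull_request_target:
    branches: [main]
jobs:
  pytest-gpu:
    strategy:
      matrix:
        # assumed matrix axis; the real matrix definition is not shown here
        gpu_num: [1, 2]
    runs-on: gpu-runner  # assumed self-hosted GPU runner label
    steps:
      - uses: actions/checkout@v4
        with:
          # pull_request_target runs in the base repository's context, so
          # the PR head must be checked out explicitly to test its code
          ref: ${{ github.event.pull_request.head.sha }}
      - name: Run GPU tests
        run: pytest -m gpu
```

Note that `pull_request_target` grants the workflow the base repository's secrets, which is why such workflows typically gate execution (for example via required approvals) before checking out untrusted PR code.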