Precompute flash attention padding info #4065

Triggered via pull request (synchronize) by @ShashankMosaicML on January 17, 2024 18:20 (PR #880)
Status: Cancelled
Total duration: 12m 15s

Workflow: pr-gpu.yaml
on: pull_request_target
Matrix: pytest-gpu
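The workflow file itself is not reproduced on this run page. As a minimal sketch only, assuming a conventional GitHub Actions layout, pr-gpu.yaml with this trigger and matrix could look roughly like the following; the matrix entry names are taken from the error annotations below, while the runner label and test command are placeholder assumptions.

```yaml
# Hypothetical sketch of pr-gpu.yaml; only the trigger, the pytest-gpu job
# name, and the matrix entry names are taken from this run page.
name: pr-gpu

on: pull_request_target

jobs:
  pytest-gpu:
    strategy:
      matrix:
        # Entry names inferred from the error annotations below.
        name: [gpu-2.1.0, gpu-2.1.0-flash2]
    runs-on: ubuntu-latest        # placeholder; the real GPU runner label is not shown
    steps:
      - uses: actions/checkout@v4
      - name: Run GPU tests
        run: pytest -m gpu        # assumed test invocation
```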

Annotations

3 errors

gpu-2.1.0-flash2 / pytest-gpu: Process completed with exit code 1.
gpu-2.1.0 / pytest-gpu: FailFast: cancelling since parallel instance has failed
gpu-2.1.0 / pytest-gpu: The operation was canceled.
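The cancellation cascade above is standard fail-fast matrix behaviour: gpu-2.1.0-flash2 / pytest-gpu exited with code 1, so the parallel gpu-2.1.0 / pytest-gpu matrix job was cancelled before it could finish. Assuming the matrix sketched earlier, this corresponds to GitHub Actions' default fail-fast setting, shown explicitly below.

```yaml
# fail-fast defaults to true for matrix jobs: one failing entry cancels the rest.
jobs:
  pytest-gpu:
    strategy:
      fail-fast: true   # explicit form of the default behaviour
```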