
Precompute flash attention padding info #4064

Triggered via pull request January 17, 2024 18:18
@ShashankMosaicML
synchronize #880
Status Cancelled
Total duration 2m 5s
Artifacts

pr-gpu.yaml

on: pull_request_target
Matrix: pytest-gpu
Waiting for pending jobs

Annotations

1 error
PR GPU tests: Canceling since a higher priority waiting request for 'PR GPU tests-880' exists
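The cancellation message above is the standard behavior of a GitHub Actions `concurrency` group: when a newer run for the same pull request enters the group, the older waiting or in-progress run is canceled. A minimal sketch of how such a group is typically declared is below; the group expression and job layout are assumptions for illustration, not copied from the actual pr-gpu.yaml.

```yaml
# Hypothetical sketch of a concurrency group for a workflow like pr-gpu.yaml.
name: PR GPU tests

on: pull_request_target

concurrency:
  # Runs sharing a group name queue behind one another; with
  # cancel-in-progress enabled, a newer run for the same PR cancels
  # any older run in the group that is still waiting or running.
  group: PR GPU tests-${{ github.event.pull_request.number }}
  cancel-in-progress: true

jobs:
  pytest-gpu:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
```

With a group keyed on the PR number (here producing 'PR GPU tests-880' for PR #880), each push to the PR supersedes the previous CI run rather than queuing a redundant one.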