
transformer: add flash attention layer #656

Triggered via: push (October 16, 2023, 12:58)
Status: Failure
Total duration: 1m 37s
Artifacts: none listed

Workflow: ci.yml (on: push)
Jobs:
Build documentation (1m 26s)
Simulate SW on Snitch Cluster w/ Verilator (1m 17s)
Simulate SW on Snitch Cluster w/ Banshee (1m 14s)
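
For orientation, here is a minimal sketch of how a ci.yml with this trigger and these three jobs could be laid out. Only the workflow file name (ci.yml), the push trigger, the three job display names, and the use of actions/checkout@v2 (named in the warnings below) come from this page; the job IDs, runner label, and run commands are placeholder assumptions, not the repository's actual configuration.

```yaml
# Hedged sketch of ci.yml; job IDs, runner, and commands are assumptions.
name: ci
on: [push]

jobs:
  docs:
    name: Build documentation
    runs-on: ubuntu-latest            # assumed runner
    steps:
      - uses: actions/checkout@v2     # the pin that triggers the node12 warnings below
      - run: make docs                # assumed documentation build command

  sim-verilator:
    name: Simulate SW on Snitch Cluster w/ Verilator
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - run: ./run_sim.sh verilator   # assumed simulation entry point

  sim-banshee:
    name: Simulate SW on Snitch Cluster w/ Banshee
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - run: ./run_sim.sh banshee     # assumed simulation entry point
```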

Annotations

2 errors and 3 warnings
Simulate SW on Snitch Cluster w/ Banshee: Process completed with exit code 128.
Simulate SW on Snitch Cluster w/ Verilator: Process completed with exit code 128.
Simulate SW on Snitch Cluster w/ Banshee: The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
Simulate SW on Snitch Cluster w/ Verilator: The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
Build documentation: The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
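
The three warnings are GitHub's standard deprecation notice for actions pinned to a node12 runtime. A minimal sketch of the fix, assuming the workflow can simply bump the pinned version: replacing actions/checkout@v2 with a newer major release (v3 runs on node16, v4 on node20) clears the warning in all three jobs. The surrounding step layout is an assumption, not copied from the actual ci.yml.

```yaml
# Hedged sketch: the only change is the version pin on the checkout step.
    steps:
      - uses: actions/checkout@v4   # previously actions/checkout@v2 (node12 runtime)
      # ... remaining steps unchanged (assumption)
```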