This repository has been archived by the owner on Aug 30, 2024. It is now read-only.

Add Fused-Attention Layer for AVX2 Platforms #66

Triggered via pull request #137 (synchronize) by @DDEle, February 22, 2024 10:00
Branch: DDEle:mha-fp32
Status: Failure
Total duration: 8m 9s
Artifacts: none

Workflow: unit-test-bestla.yml
on: pull_request
Job: unit-test (20s)

Annotations

1 error and 2 warnings
unit-test (error)
Process completed with exit code 2.

unit-test (warning)
Node.js 16 actions are deprecated. Please update the following actions to use Node.js 20: actions/checkout@v3, actions/upload-artifact@v3. For more information see: https://github.blog/changelog/2023-09-22-github-actions-transitioning-from-node-16-to-node-20/.

unit-test (warning)
No files were found with the provided path: /home/tensorflow/actions-runner2/_work/neural-speed/neural-speed/bestla/build/unit_test*.*. No artifacts will be uploaded.
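Both warnings concern the workflow configuration rather than the test failure itself. Below is a minimal sketch of how the relevant steps in unit-test-bestla.yml could be updated; the step layout, runner label, artifact name, and relative artifact path are assumptions inferred from the messages above, not the actual workflow contents.

```yaml
# Hypothetical excerpt of unit-test-bestla.yml; only the action version bumps
# and the artifact-upload options are the point, the surrounding structure is assumed.
name: unit-test-bestla
on: pull_request

jobs:
  unit-test:
    runs-on: self-hosted            # assumption: the runner path above suggests a self-hosted runner
    steps:
      - uses: actions/checkout@v4   # was actions/checkout@v3 (Node.js 16, deprecated)

      # ... configure, build, and run the BesTLA unit tests here ...

      - uses: actions/upload-artifact@v4   # was actions/upload-artifact@v3 (Node.js 16, deprecated)
        if: always()                       # still attempt the upload when the test step fails
        with:
          name: unit-test-results
          path: bestla/build/unit_test*.*  # assumption: workflow-relative form of the reported path
          if-no-files-found: warn          # keep the "no files found" case a warning rather than an error
```

Bumping to the @v4 releases addresses the Node.js 16 deprecation; whether the path glob itself needs changing depends on why no unit_test files were produced, which in this run is most likely the upstream failure that exited with code 2.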