
Commit

ShashankMosaicML committed Aug 15, 2024
Commit de76124 (parent 0f44dae)
Showing 1 changed file with 0 additions and 1 deletion.
tests/models/layers/test_attention.py
@@ -8,7 +8,6 @@

 from llmfoundry.models.layers.attention import (
     attention_implementations,
-    flash_attn_fn,
     scaled_multihead_dot_product_attention,
 )
 from llmfoundry.models.layers.layer_builders import build_attention_layer
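With `flash_attn_fn` no longer imported directly, the test presumably reaches the flash attention path through the `attention_implementations` registry that remains imported from the same module. A minimal sketch of that lookup, assuming the registry exposes a catalogue-style `get()` accessor and registers the function under the key `'flash'` (both are assumptions, not confirmed by this diff):

```python
# Hypothetical sketch: look up the flash attention function via the
# attention_implementations registry instead of importing flash_attn_fn
# directly. The 'flash' key and the .get() accessor are assumptions about
# llmfoundry's registry API, not something this commit confirms.
from llmfoundry.models.layers.attention import attention_implementations

flash_attn_fn = attention_implementations.get('flash')
```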
