Commit 409ed15
ShashankMosaicML committed on Aug 15, 2024 (1 parent: 83cbd98)
Showing 1 changed file with 0 additions and 1 deletion.
llmfoundry/models/layers/attention.py (1 change: 0 additions, 1 deletion)
```diff
@@ -202,7 +202,6 @@ def scaled_multihead_dot_product_attention(
             window_mask,
             diagonal=-sliding_window_size,
         )
-        window_mask = window_mask[-s_q:, -s_k:]
         window_mask = ~window_mask
         attn_weight = attn_weight.masked_fill(
             window_mask.view(1, 1, s_q, s_k),
```
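
For context, the deleted line re-sliced the window mask to its last `s_q` rows and `s_k` columns; if the mask is already built at shape `(s_q, s_k)`, that slice is presumably a no-op, which would explain its removal here. Below is a minimal, hedged sketch of the banded sliding-window masking pattern this hunk belongs to. It is not the llmfoundry implementation: the helper name `apply_sliding_window_mask` and the exact shapes are assumptions, with `attn_weight` taken to be the pre-softmax scores of shape `(batch, heads, s_q, s_k)`.

```python
import torch


def apply_sliding_window_mask(
    attn_weight: torch.Tensor,  # assumed shape: (batch, heads, s_q, s_k)
    sliding_window_size: int,
) -> torch.Tensor:
    """Sketch of banded sliding-window masking; not the llmfoundry code."""
    s_q, s_k = attn_weight.shape[-2], attn_weight.shape[-1]
    # Start from an all-True (s_q, s_k) mask, then keep only the band
    # within sliding_window_size of the main diagonal.
    window_mask = torch.ones(s_q, s_k, dtype=torch.bool, device=attn_weight.device)
    window_mask = torch.tril(window_mask, diagonal=sliding_window_size)
    window_mask = torch.triu(window_mask, diagonal=-sliding_window_size)
    # Note: on a mask already built at (s_q, s_k), the deleted slice
    # window_mask[-s_q:, -s_k:] would change nothing.
    window_mask = ~window_mask  # True now marks positions to suppress
    min_val = torch.finfo(attn_weight.dtype).min
    return attn_weight.masked_fill(window_mask.view(1, 1, s_q, s_k), min_val)


# Usage example with small hypothetical dimensions:
scores = torch.randn(2, 4, 16, 16)
masked = apply_sliding_window_mask(scores, sliding_window_size=3)
```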
