Commit 87d034d
remove more and more in modular_diffllama.py
weak-kajuma committed Dec 6, 2024
1 parent 3f85c22 commit 87d034d
Showing 1 changed file with 0 additions and 7 deletions.
7 changes: 0 additions & 7 deletions src/transformers/models/diffllama/modular_diffllama.py
@@ -429,13 +429,6 @@ def forward(
     return attn_output, None, past_key_value
 
 
-# DIFFLLAMA_ATTENTION_CLASSES = {
-#     "eager": DiffLlamaAttention,
-#     "flash_attention_2": DiffLlamaFlashAttention2,
-#     "sdpa": DiffLlamaSdpaAttention,
-# }
-
-
 class DiffLlamaDecoderLayer(LlamaDecoderLayer):
     pass
 
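For context, the deleted block mirrors the `*_ATTENTION_CLASSES` dispatch dictionaries used in other transformers model files to select an attention implementation from the model config. Below is a minimal, self-contained sketch of that pattern; the dictionary keys and class names come from the deleted lines, while the placeholder class bodies and the `build_attention` helper are illustrative assumptions, not the actual DiffLlama source.

import torch.nn as nn

# Stand-ins for the real attention classes defined in modular_diffllama.py.
class DiffLlamaAttention(nn.Module): ...
class DiffLlamaFlashAttention2(DiffLlamaAttention): ...
class DiffLlamaSdpaAttention(DiffLlamaAttention): ...

# Same mapping as the commented-out block removed by this commit.
DIFFLLAMA_ATTENTION_CLASSES = {
    "eager": DiffLlamaAttention,
    "flash_attention_2": DiffLlamaFlashAttention2,
    "sdpa": DiffLlamaSdpaAttention,
}

def build_attention(attn_implementation: str) -> nn.Module:
    # Dispatch on the implementation name, conventionally taken from
    # config._attn_implementation ("eager", "sdpa", ...).
    return DIFFLLAMA_ATTENTION_CLASSES[attn_implementation]()

Since DiffLlamaDecoderLayer now simply inherits from LlamaDecoderLayer (as the remaining diff context shows), a per-model dispatch dict like this is dead code, which is presumably why the commented-out block is being deleted.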
