Update src/transformers/models/llama/modeling_llama.py
Co-authored-by: Arthur <[email protected]>
nbroad1881 and ArthurZucker authored Sep 23, 2024
1 parent 633f436 commit dd94152
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions src/transformers/models/llama/modeling_llama.py
@@ -490,8 +490,8 @@ def forward(
         value_states = self.v_proj(hidden_states)

         # Flash attention requires the input to have the shape
-        # batch_size x seq_length x num_heads x head_dim
-        # but rotary embeddings require batch_size x num_heads x seq_length x head_dim
+        # batch_size, seq_length, num_heads, head_dim
+        # but rotary embeddings require batch_size, num_heads, seq_length, head_dim
         query_states = query_states.view(bsz, q_len, self.num_heads, self.head_dim).transpose(1, 2)
         key_states = key_states.view(bsz, q_len, self.num_key_value_heads, self.head_dim).transpose(1, 2)
         value_states = value_states.view(bsz, q_len, self.num_key_value_heads, self.head_dim).transpose(1, 2)
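The comment being edited describes the shape juggling in this attention forward pass: each q/k/v projection produces (batch_size, seq_length, num_heads * head_dim), view() splits the last dimension into heads, and transpose(1, 2) moves the head axis ahead of the sequence axis for the rotary embeddings. A minimal sketch of that reshaping follows; the sizes and the proj_out tensor are hypothetical stand-ins for illustration, not taken from the commit itself.

import torch

# Hypothetical sizes, chosen only for illustration.
bsz, q_len, num_heads, head_dim = 2, 16, 8, 64

# Stand-in for a q/k/v projection output: (batch_size, seq_length, num_heads * head_dim).
proj_out = torch.randn(bsz, q_len, num_heads * head_dim)

# view() splits the hidden dimension into heads, giving the layout
# flash attention consumes: (batch_size, seq_length, num_heads, head_dim).
states = proj_out.view(bsz, q_len, num_heads, head_dim)
assert states.shape == (bsz, q_len, num_heads, head_dim)

# transpose(1, 2) swaps the sequence and head axes, giving the layout
# rotary embeddings expect: (batch_size, num_heads, seq_length, head_dim).
states = states.transpose(1, 2)
assert states.shape == (bsz, num_heads, q_len, head_dim)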
