
add docstrings for extra args
Signed-off-by: Sukriti-Sharma4 <[email protected]>
Ssukriti committed Dec 20, 2024
1 parent 46d7b89 commit abe82db
Showing 2 changed files with 12 additions and 0 deletions.
6 changes: 6 additions & 0 deletions src/transformers/models/bamba/modeling_bamba.py
@@ -1206,6 +1206,12 @@ def forward(
cache_position: Optional[torch.LongTensor] = None,
**kwargs,
) -> Union[Tuple, BaseModelOutputWithPast]:
"""
Args:
kwargs (`dict`, *optional*):
Arbitrary kwargs that are passed through and currently ignored. Support for
FlashAttentionKwargs will be added in the future.
"""
output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions
output_hidden_states = (
output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states
6 changes: 6 additions & 0 deletions src/transformers/models/bamba/modular_bamba.py
@@ -949,6 +949,12 @@ def forward(
cache_position: Optional[torch.LongTensor] = None,
**kwargs,
) -> Union[Tuple, BaseModelOutputWithPast]:
"""
Args:
kwargs (`dict`, *optional*):
Arbitrary kwargs that are passed through and currently ignored. Support for
FlashAttentionKwargs will be added in the future.
"""
output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions
output_hidden_states = (
output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states
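
The docstrings above describe a common transformers pattern: forward accepts arbitrary **kwargs for interface compatibility and, for now, drops them. Below is a minimal, self-contained sketch of that pattern, assuming a future where the extra arguments are typed with a TypedDict in the style of FlashAttentionKwargs. The names ToyFlashAttentionKwargs and toy_forward, and the field names inside the TypedDict, are illustrative only and not part of the transformers API.

from typing import TypedDict

from typing_extensions import Unpack


class ToyFlashAttentionKwargs(TypedDict, total=False):
    # Hypothetical stand-in for transformers' FlashAttentionKwargs; the field
    # names below are illustrative, not the real schema.
    cu_seq_lens_q: object
    max_length_q: int


def toy_forward(hidden_states: list, **kwargs: Unpack[ToyFlashAttentionKwargs]) -> list:
    """
    Args:
        kwargs (`dict`, *optional*):
            Accepted for interface compatibility and currently ignored,
            mirroring the docstring added in this commit.
    """
    # Today the extra arguments are simply dropped; once flash-attention
    # support lands they would be forwarded to the attention layers instead.
    return hidden_states


# Callers can already pass the extra arguments without breaking:
print(toy_forward([1, 2, 3], max_length_q=128))

Typing **kwargs this way keeps callers free to pass the extra arguments today, while giving static checkers a concrete schema once the model actually consumes them.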
