Commit

Merge branch 'add_sliding_window_attn_to_torch_attn' of github.com:ShashankMosaicML/llm-foundry into add_sliding_window_attn_to_torch_attn
ShashankMosaicML committed Aug 15, 2024
2 parents a90f249 + 1ac8e31 commit 67d23f2
Showing 1 changed file with 1 addition and 1 deletion: llmfoundry/models/mpt/configuration_mpt.py
@@ -333,7 +333,7 @@ def _validate_config(self) -> None:
                    'attn_impl'
                ] == 'flash' and not is_flash_v2_installed(v2_version='v2.3.0',):
                    raise NotImplementedError(
-                       'sliding window attention only implemented with for torch attention or flash attention (v2.3.0 or higher).',
+                       'sliding window attention only implemented for torch attention and flash attention (v2.3.0 or higher).',
                    )
if self.embedding_fraction > 1 or self.embedding_fraction <= 0:
raise ValueError(
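The diff above only touches the wording of an error message, but the surrounding check is the interesting part: `_validate_config` rejects sliding window attention unless the attention implementation supports it. Below is a minimal standalone sketch of that validation pattern, not the actual llm-foundry code; the `is_flash_v2_installed` helper is stubbed here, and the config keys (`attn_impl`, `sliding_window_size`) follow the names visible in the diff.

```python
def is_flash_v2_installed(v2_version: str = 'v2.0.0') -> bool:
    # Stub: in llm-foundry this checks the installed flash-attn version.
    # Hard-coded False here so the sketch runs without flash-attn.
    return False


def validate_sliding_window(attn_config: dict) -> None:
    # Sliding window attention is only supported by the 'torch'
    # implementation, or by 'flash' when flash-attn >= v2.3.0 is installed.
    if attn_config.get('sliding_window_size', -1) != -1:
        if attn_config[
            'attn_impl'
        ] == 'flash' and not is_flash_v2_installed(v2_version='v2.3.0'):
            raise NotImplementedError(
                'sliding window attention only implemented for torch '
                'attention and flash attention (v2.3.0 or higher).',
            )


# torch attention with a sliding window passes validation:
validate_sliding_window({'attn_impl': 'torch', 'sliding_window_size': 128})
```

With the stub returning `False`, passing `attn_impl='flash'` together with a sliding window raises `NotImplementedError`, mirroring the guard in the diff.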
