Update FA2 exception msg to point to hub discussions (#28161)
* Update FA2 exception msg to point to hub discussions

* Use path for hub url
amyeroberts authored Dec 20, 2023
1 parent 9924df9 commit 224ab70
Showing 1 changed file with 3 additions and 2 deletions.
src/transformers/modeling_utils.py (3 additions & 2 deletions)
@@ -1380,8 +1380,9 @@ def _check_and_enable_flash_attn_2(
         """
         if not cls._supports_flash_attn_2:
             raise ValueError(
-                f"{cls.__name__} does not support Flash Attention 2.0 yet. Please open an issue on GitHub to "
-                "request support for this architecture: https://github.com/huggingface/transformers/issues/new"
+                f"{cls.__name__} does not support Flash Attention 2.0 yet. Please request to add support where"
+                f" the model is hosted, on its model hub page: https://huggingface.co/{config._name_or_path}/discussions/new"
+                " or in the Transformers GitHub repo: https://github.com/huggingface/transformers/issues/new"
             )
 
         if not is_flash_attn_2_available():
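For context, a minimal sketch of how a user reaches this error path at load time. It assumes an architecture that has not set _supports_flash_attn_2 = True; the checkpoint name below is hypothetical, and attn_implementation="flash_attention_2" is the from_pretrained argument that triggers the check.

from transformers import AutoModelForCausalLM

try:
    # "some-org/some-model" is a hypothetical checkpoint whose architecture
    # does not declare _supports_flash_attn_2 = True.
    model = AutoModelForCausalLM.from_pretrained(
        "some-org/some-model",
        attn_implementation="flash_attention_2",
    )
except ValueError as err:
    # With this change the message names two places to ask for support:
    # the model's hub discussions page, built from config._name_or_path
    # (here https://huggingface.co/some-org/some-model/discussions/new),
    # and the Transformers issue tracker at
    # https://github.com/huggingface/transformers/issues/new.
    print(err)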
