'_flash_supports_window_size' is not defined #32
Comments
any update on this one?

same issue :)))

me too
When I run `pip install flash-attn --no-build-isolation`, it fails with `Preparing metadata (setup.py) ... error × python setup.py egg_info did not run successfully.` Also, I can't find '/usr/local/cuda/bin/nvcc'.
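The egg_info failure usually means flash-attn's setup.py could not locate a CUDA toolkit. A minimal diagnostic sketch (editorial addition, not from the thread) to confirm whether nvcc and the flash_attn package are visible before retrying the install:

```python
import importlib.util
import os
import shutil

# flash-attn's build step needs nvcc; pip's egg_info phase fails without it.
print("nvcc on PATH:", shutil.which("nvcc"))
print("CUDA_HOME:", os.environ.get("CUDA_HOME"))

# If flash_attn cannot be imported, transformers never defines
# _flash_supports_window_size, which triggers the NameError reported here.
print("flash_attn installed:", importlib.util.find_spec("flash_attn") is not None)
```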
```
2024-09-27 14:47:10 | ERROR | stderr | File "anaconda3/envs/llama-omni/lib/python3.10/site-packages/transformers/modeling_flash_attention_utils.py", line 180, in _flash_attention_forward
2024-09-27 14:47:10 | ERROR | stderr |     _flash_supports_window_size and sliding_window is not None and key_states.shape[1] > sliding_window
2024-09-27 14:47:10 | ERROR | stderr | NameError: name '_flash_supports_window_size' is not defined
```
transformers 4.43.4
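In transformers 4.43.x, `_flash_supports_window_size` is only defined when flash-attn imports successfully (the assignment sits behind an `is_flash_attn_2_available()` guard in `modeling_flash_attention_utils.py`), so this NameError indicates flash-attn is missing or failed to import. A hedged workaround sketch that sidesteps the flash-attention code path entirely; the model id is a placeholder, and `attn_implementation` is a standard `from_pretrained` kwarg:

```python
from transformers import AutoModelForCausalLM

# Falling back to the eager attention implementation avoids
# _flash_attention_forward until flash-attn installs cleanly.
model = AutoModelForCausalLM.from_pretrained(
    "your-model-id",  # placeholder: substitute the actual checkpoint
    attn_implementation="eager",
)
```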