I-JEPA - contributing to lib #1260
In this issue, I have opened an initial PR to implement I-JEPA. If there are any concerns or suggestions regarding potential issues or best practices, please share them.

Comments
@guarin Hello! Could I implement an additional module with Attention and Causal Attention (like in x-formers) for use in this model (Causal Attention for the Predictor)?
Sorry for the interruption; it looks like I figured out how to do it with an attention mask in PyTorch's multi-head attention (MHA), so there is no need to add a new module.
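For reference, a minimal sketch of causal attention done with an attention mask and `torch.nn.MultiheadAttention`; the dimensions and tensor names below are illustrative only, not taken from the lightly/I-JEPA code:

```python
import torch
import torch.nn as nn

# Illustrative sizes, not from the actual predictor.
embed_dim, num_heads, seq_len, batch_size = 64, 4, 16, 2

mha = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)

# Boolean causal mask: True marks positions that may NOT be attended to,
# i.e. everything strictly above the diagonal (future tokens).
# torch.nn.Transformer.generate_square_subsequent_mask(seq_len) is an
# equivalent float-mask alternative.
causal_mask = torch.triu(
    torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=1
)

x = torch.randn(batch_size, seq_len, embed_dim)
out, attn_weights = mha(x, x, x, attn_mask=causal_mask)
print(out.shape)  # torch.Size([2, 16, 64])
```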
#1273 is the starting point for this issue.
Completed with the experimental #1273.