🚨🚨🚨 fix(Mask2Former): torch export 🚨🚨🚨 #34393
Conversation
Looks good to me, thanks for working on this! I just have a question regarding one of your changes
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Hey @yonigozlan if you have a chance, I would love another review of this PR after addressing your comments!
Looks great to me, thanks again for fixing this! I think we should add a short comment just above the definition of level_start_index_list.
Force-pushed from cfaef32 to 7a460d4.
Thanks Yoni! Pinging @amyeroberts and @qubvel for a core maintainer review!
Thanks for working on this!
It would also be nice to have a test that export works, maybe something similar to this.
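A hedged sketch of what such an export test could look like; the checkpoint, input size, and compared output are illustrative assumptions, not necessarily the exact test added in this PR:

```python
import torch
from transformers import Mask2FormerForUniversalSegmentation

def test_mask2former_torch_export():
    # Illustrative checkpoint; a real test may use a smaller test-only config.
    model = Mask2FormerForUniversalSegmentation.from_pretrained(
        "facebook/mask2former-swin-tiny-coco-instance"
    ).eval()
    pixel_values = torch.randn(1, 3, 384, 384)

    # Trace the eager model into an ExportedProgram.
    exported = torch.export.export(model, args=(pixel_values,))

    with torch.no_grad():
        eager_out = model(pixel_values)
        exported_out = exported.module()(pixel_values)

    # The exported graph should reproduce the eager outputs.
    torch.testing.assert_close(
        eager_out.masks_queries_logits, exported_out.masks_queries_logits
    )
```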
Force-pushed from 4c78eca to 78f3848.
Force-pushed from 3b3562f to 104b16d.
Added an export test and addressed your other comments. Please let me know if you would like me to modify anything else!
Thanks for a quick iteration! A few more comments.
Please push an empty commit with the message [run_slow] mask2former at the end to trigger the full slow test run for the mask2former model.
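For reference, pushing an empty commit with that message from the PR branch looks like this:

```bash
git commit --allow-empty -m "[run_slow] mask2former"
git push
```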
You got it! I think I need maintainer approval to run the slow workflows.
Force-pushed from 38feb0d to 704b79f.
LGTM, thanks!
I am not 100% sure we have to break this; do you think we can keep supporting the old behaviour, or at least have a deprecation cycle?
@@ -926,7 +926,7 @@ def forward(
     encoder_attention_mask=None,
     position_embeddings: Optional[torch.Tensor] = None,
     reference_points=None,
-    spatial_shapes=None,
+    spatial_shapes_list=None,
Changing the name is a breaking change; we should probably have a deprecation cycle, no?
If not, we can add 🔴, as I think the motivation is strong enough.
Hmm, I suppose it is an internal module of the model; I'm not sure it is intended to be used elsewhere. Let me know if I'm wrong.
Isn't changing the logic, but keeping both spatial_shapes and spatial_shapes_list, already a breaking change? Sure, there is some question of whether users can rely on internal modules of transformers models, but even so, if a user doesn't pass a value for spatial_shapes_list, this code will fail because L939 will try to iterate over a None object.
BTW, it seems like the changes in #33600 already violate this contract (see this line)? I followed that PR as a guide on what I should change here.
It seems like the proper way forward is to add a 🔴 here.
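For context, here is a minimal, hypothetical illustration (not the actual Mask2Former code) of why a plain Python list of spatial shapes is friendlier to torch.export than a shapes tensor: turning tensor entries into the integer sizes needed for splitting or reshaping requires data-dependent .item()/.tolist() calls that export cannot trace through, whereas a list of (height, width) ints is static at trace time.

```python
import torch

# Hypothetical multi-scale feature-map sizes as plain Python ints.
spatial_shapes_list = [(32, 32), (16, 16), (8, 8)]

def split_levels(hidden_states: torch.Tensor, spatial_shapes_list):
    # hidden_states: (batch, sum of h * w over all levels, channels).
    sizes = [h * w for h, w in spatial_shapes_list]  # static ints, export-friendly
    return torch.split(hidden_states, sizes, dim=1)

hidden = torch.randn(1, 32 * 32 + 16 * 16 + 8 * 8, 256)
print([level.shape for level in split_levels(hidden, spatial_shapes_list)])
```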
What's the path forward @ArthurZucker ?
Bumping this again after the weekend. I'm not familiar with Hugging Face's policies on:
- What qualifies as a breaking change?
- What the release process is.
Could you provide a recommendation on where we should take this PR?
Usually, even if a module is internal, people still end up using it 😅. This IS a breaking change, but acceptable IMO. Let's use some 🚨 on the PR title to make sure we communicate about it in the release notes!
Soft ping @ArthurZucker: regarding your last comment, I'm not sure it can be done without breaking changes, but since it is an internal module of the model I suppose it is fine to change its signature. 🚨 can be added to the PR title if it is still required.
Hi @ArthurZucker @qubvel, hope you're doing well. Could you find some time today or tomorrow to provide guidance on this PR? Would love to check it off my list!
Hi @ArthurZucker @qubvel, any chance we can move forward with this PR this week? Happy to do whatever you would like, I just need guidance on what you would like to do here.
Hi @philkuz, sorry for the delay, the team was at an offsite this week. I will ping Arthur to get it reviewed and merged. Thanks for the patience.
Thank you, Pavel!
Thanks for this contribution! 🤗 Looks good; I will just update the PR name!
cc @guangy10
@philkuz Awesome! Thanks for expanding the export coverage to more models! 🚀 🚀 🚀
* fix(Mask2Former): torch export
* revert level_start_index and create a level_start_index_list
* Add a comment to explain the level_start_index_list
* Address comment
* add torch.export.export test
* rename arg
* remove spatial_shapes
* Use the version check from pytorch_utils
* [run_slow] mask2former

Signed-off-by: Phillip Kuznetsov <[email protected]>
What does this PR do?
Fixes #34390
Mask2Former modeling had a set of issues that prevented torch.export from working. This PR addresses roughly three individual problems that kept the model from exporting. I took direction from a few PRs that made similar changes.
Who can review?
@amyeroberts, @qubvel, @ylacombe (tagging because you looked at #34023)
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
Testing:
Run this once on main and then once with this branch, and ensure that testing.all_close() passes. I also visually compared the output masks and they look the same.
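A minimal sketch of that parity check, assuming torch.testing.assert_close is the comparison behind testing.all_close() and using an illustrative checkpoint and input:

```python
import torch
from transformers import Mask2FormerForUniversalSegmentation

torch.manual_seed(0)
model = Mask2FormerForUniversalSegmentation.from_pretrained(
    "facebook/mask2former-swin-tiny-coco-instance"
).eval()
pixel_values = torch.randn(1, 3, 384, 384)

with torch.no_grad():
    out = model(pixel_values)

# On main: save the reference outputs.
# torch.save(out.masks_queries_logits, "reference_masks.pt")

# On this branch: compare against the saved reference.
reference = torch.load("reference_masks.pt")
torch.testing.assert_close(out.masks_queries_logits, reference)
```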