Allow accelerator to instantiate the device #5255
Conversation
Will merge #5275 first, that way we can have the newly added unit tests run on this, then we can merge it? |
@loadams Sure, thanks :) |
Done now, so the tests should run on this PR, then we can merge it, thanks! |
Halting this PR until Habana version 1.16.0 is released. The CI issue should be fixed there. |
Thanks @nelyahu - when it is released, go ahead and just update it in this PR? Also curious if you can share what is required in that release that is causing failures here since it appears the tests just hang now? And just tag one of us when it is updated and we can get this merged. |
Sure @loadams, I will update. |
Thanks @nelyahu - feel free to tag us when the new version is released and updated so we can merge this then, thanks! |
@nelyahu - I tried 1.15.1, looks like same errors, so we will wait for 1.16 |
@loadams Yes. |
@nelyahu - now that we have updated the hpu runner to 1.17, should we move ahead with merging this PR? |
Known issue, WIP by the Habana SW team
Yes, this test is disabled locally in our env and fails for the same reason. I removed it from the gaudi2 workflow file. Once fixed, it will be re-added |
When instantiating a torch.device for HPU, it cannot be passed an indexed annotation such as "hpu:1", only the bare "hpu".
Moving this logic into the accelerator allows each backend to decide how the device is constructed, solving the issue with a single-line change.
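The idea above can be sketched in plain Python. This is an illustrative mock, not DeepSpeed's actual accelerator API: the function name `device_name` and the `supports_index` flag are hypothetical, standing in for the accelerator's role of building the string that would be passed to `torch.device()`. The point is that an indexed string like "hpu:1" is never produced for a backend that only accepts the bare device type:

```python
def device_name(device_type: str, index=None, supports_index: bool = True) -> str:
    """Build the device string that would be handed to torch.device().

    Hypothetical sketch: backends that reject indexed device strings
    (as HPU does with "hpu:1") set supports_index=False, so the
    accelerator drops the index instead of every caller special-casing it.
    """
    if index is None or not supports_index:
        return device_type           # e.g. "hpu"
    return f"{device_type}:{index}"  # e.g. "cuda:1"


# A CUDA-style accelerator keeps the index; an HPU-style one drops it.
print(device_name("cuda", 1))                       # -> "cuda:1"
print(device_name("hpu", 1, supports_index=False))  # -> "hpu"
```

Because callers go through the accelerator instead of calling `torch.device(f"{name}:{idx}")` directly, only this one spot needs to know about the backend's restriction.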