The code does not yet work on JetPack 4.4. The current version requires PyTorch 1.4.0 to be installed, although the JetPack 4.4 production release does not support PyTorch < 1.6.0. Anybody using JetPack 4.4 is therefore stuck: the PyTorch build shipped for JetPack 4.4 is too new for this code, and the older PyTorch it expects cannot be installed. The error follows (PyTorch incompatibility).
Does anybody have a solution to this? Will there be an implementation of TRT for PyTorch 1.6?
torch.nn.modules.module.ModuleAttributeError: 'BatchNorm2d' object has no attribute '_non_persistent_buffers_set'
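For context, this particular error usually means a model object that was pickled under PyTorch < 1.6 is being loaded under PyTorch 1.6, whose nn.Module gained a _non_persistent_buffers_set attribute. Below is a minimal sketch of a commonly reported workaround, not an official fix; the checkpoint filenames are placeholders for your own files.

import torch

# Hypothetical checkpoint path; replace with the model file that was
# saved (fully pickled) under PyTorch 1.4.
model = torch.load("model_trained_with_torch_1_4.pth", map_location="cpu")

# PyTorch 1.6 added nn.Module._non_persistent_buffers_set; modules pickled
# under older versions lack it, so patch it in before calling state_dict()
# or converting the model.
for m in model.modules():
    if not hasattr(m, "_non_persistent_buffers_set"):
        m._non_persistent_buffers_set = set()

# Re-saving only the state_dict avoids the version-dependent pickle entirely.
torch.save(model.state_dict(), "model_state_dict.pth")

Saving and reloading the state_dict (rather than the whole pickled module) is generally the more portable option across PyTorch versions.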
Updated with tag v3.0