Unexpected shape issue in Hessian-Vector computation (#8)

Comments
Hi, I have faced the same issue. Have you solved it?
@liujingcs Ah! It has been a long while. I think I upgraded the PyTorch version (probably). If nothing works, you may want to check out this work by the same group: https://github.com/noahgolmant/pytorch-hessian-eigenthings
Hi, I wonder if you have solved this issue. Thanks so much.
Hi guys, I met the same issue and just figured it out. In my case, it was because some layers were defined in the model but did not participate in the forward or backward pass. The issue was fixed after I deleted the unused layers.
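A minimal sketch of the failure mode described in the comment above (the model and layer names here are illustrative, not from PyHessian): a layer that is defined in `__init__` but never called in `forward` keeps `param.grad is None` after back-propagation, which is exactly the case PyHessian's gradient-gathering code has to special-case.

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.used = nn.Linear(4, 1)
        self.unused = nn.Linear(4, 1)  # defined, but never used in forward

    def forward(self, x):
        return self.used(x)  # self.unused does not participate

net = Net()
loss = net(torch.randn(2, 4)).sum()
loss.backward()

# The unused layer's parameters never receive gradients:
assert net.used.weight.grad is not None
assert net.unused.weight.grad is None
```

Deleting (or never declaring) such unused layers makes every parameter carry a real gradient tensor, which avoids the `0.` placeholder path discussed below.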
Hi!

Thank you for making the source code of your work available. I tried to use the library for an application involving a 3D network architecture, and ran into the following issue:

Interestingly, the issue does not occur at the first call to back-propagation via `loss.backward()`; rather, it occurs at the call to `torch.autograd.grad()`. I believe that the `float` object in question is the `0.` manually inserted when `param.grad is None` in the following routine:

PyHessian/pyhessian/utils.py
Lines 61 to 72 in c2e49d2

If I am right, it is even more mind-boggling that a plain `float` is able to pass PyTorch's data-type check (I mistakenly mixed up the `outputs` and `inputs` arguments of `torch.autograd.grad`). Kindly guide me on what I can do here.

P.S. `hessian_analysis.py` is a wrapper I wrote around the library for my use case. I verified the wrapper by running a 2-layer neural network on a regression task.