Hi, thank you for your interest. In your email you said that you got the error `RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed)` when running the code.
May I ask whether you have modified my code? This error often happens if you forget to detach the previous output from the gradient computation graph in this line.
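To illustrate the failure mode: when a model's output is fed back into the next iteration without `detach()`, the second `backward()` tries to traverse the already-freed graph from the previous step. A minimal sketch (hypothetical recurrent-style loop, not the repository's actual code):

```python
import torch

model = torch.nn.Linear(4, 4)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

out = torch.zeros(1, 4)
for step in range(3):
    # detach() cuts the link to the previous iteration's graph;
    # without it, the second backward() raises
    # "Trying to backward through the graph a second time".
    out = model(out.detach())
    loss = out.pow(2).sum()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Removing the `detach()` call reproduces the reported `RuntimeError` on the second iteration, because `out` still carries the `grad_fn` of a graph whose saved tensors were freed by the first `backward()`.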
Do you use PyTorch Lightning for training? If so, `loss.backward()` is handled automatically by the library, and calling it yourself in the training code will cause this error.
Hi, the loss cannot do `backward()`. How can I solve this?