
loss.backward() cannot perform backpropagation #1

Open
feixuedudiao opened this issue Feb 14, 2023 · 3 comments

@feixuedudiao

Hi, the loss cannot do backward(). How can I solve this?

@huent189
Collaborator

Hi, thank you for your interest. In your email, you said that you got the error `RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed).` when running the code.
May I ask whether you have made any modifications to my code? This error often happens if you forget to detach the previous output from the gradient computation graph in this line.
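
For context, a minimal sketch of the detach pattern being described (a generic recurrent-style loop, not this repository's actual code; the model and shapes are made up for illustration):

```python
import torch

# Hypothetical model and optimizer, for illustration only.
model = torch.nn.Linear(4, 4)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

prev_output = torch.zeros(1, 4)  # output carried over from the previous iteration
for step in range(3):
    optimizer.zero_grad()
    output = model(prev_output)
    loss = output.pow(2).mean()
    loss.backward()   # frees the graph built during this iteration
    optimizer.step()
    # Detach before reusing: without .detach(), the next iteration's graph
    # would extend into this (already freed) one, and the next backward()
    # would raise "Trying to backward through the graph a second time".
    prev_output = output.detach()
```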

@feixuedudiao
Author

Thank you for answering me. I moved the loss.backward() call to after the loss computation:
[image]
Is it wrong to do this?

@huent189
Collaborator

huent189 commented Feb 15, 2023

Do you use PyTorch Lightning for training? If yes, loss.backward() will be handled automatically by the library, and putting this line into the training code will cause an error.
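
For example, a minimal sketch of how this looks in pytorch-lightning (a generic LightningModule, not this repository's actual code): training_step only returns the loss, and the framework calls loss.backward() and steps the optimizer itself, so calling loss.backward() there yourself would run backward twice on the same graph.

```python
import torch
import pytorch_lightning as pl

class LitModel(pl.LightningModule):
    """Hypothetical module, for illustration only."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(4, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.mse_loss(self.layer(x), y)
        return loss  # Lightning calls loss.backward() and optimizer.step() itself

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.01)
```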
