Loss goes to 0 #4
Comments
Hi, it works for me with the latest version of PyTorch. Which version of PyTorch do you use? Best regards,
Hi, I use PyTorch 0.4 and Python 3.6. Is the idea correct to first apply some loss_function and then use the max-pooling? Cheers,
This code works fine for me:
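For illustration, here is a minimal, self-contained sketch of the pattern under discussion: compute a per-pixel cross-entropy map, then max-pool over the loss values. The `max_pooled_loss` helper below is a simplified stand-in (hard top-`ratio` selection), not this repository's implementation, and the tensors are random placeholders.

```python
import torch
import torch.nn.functional as F

def max_pooled_loss(pixel_losses, ratio=0.3):
    """Average only the hardest `ratio` fraction of per-pixel loss values."""
    flat = pixel_losses.flatten()
    k = max(1, int(ratio * flat.numel()))
    hardest, _ = torch.topk(flat, k)   # pixels with the largest loss get all the weight
    return hardest.mean()

# Fake segmentation batch: logits (N, C, H, W), targets (N, H, W) with class indices.
logits = torch.randn(2, 5, 64, 64, requires_grad=True)
target = torch.randint(0, 5, (2, 64, 64))

pixel_ce = F.cross_entropy(logits, target, reduction='none')  # keep the per-pixel map
loss = max_pooled_loss(pixel_ce, ratio=0.3)
loss.backward()
print(loss.item())
```

With `ratio=1.0` this reduces to the plain mean cross-entropy, which is a convenient sanity check.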
Hmm, still can't make it work... I tried a few more values for p and ratio, but it didn't change anything. Any ideas on how I could try to debug this? Intermediate values or gradients or something? I only have 5 classes at the moment, and the background is dominating by far (about 86% averaged over the training set); could that be a reason why it doesn't work for me?
You can try to visualize the loss before and after mpl and check that it works correctly.
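For example, a quick way to do that check (the tensors and the top-`ratio` pooling below are illustrative placeholders, not this repository's exact implementation):

```python
import torch
import torch.nn.functional as F

# Fake batch with heavy class imbalance: mostly background (class 0).
logits = torch.randn(2, 5, 64, 64)
target = torch.zeros(2, 64, 64, dtype=torch.long)
target[:, 24:40, 24:40] = torch.randint(1, 5, (2, 16, 16))

pixel_ce = F.cross_entropy(logits, target, reduction='none')   # per-pixel loss map

ratio = 0.3
k = max(1, int(ratio * pixel_ce.numel()))
pooled = torch.topk(pixel_ce.flatten(), k).values.mean()       # hardest 30% of pixels

print('mean CE before pooling:', pixel_ce.mean().item())
print('CE after max-pooling  :', pooled.item())
# The pooled value should be >= the plain mean, because it averages only the
# hardest pixels. A pooled loss collapsing to ~0 while the plain mean stays
# reasonable points at the pooling step (or the tensors fed into it).
```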
Hi,
first of all, thanks for publishing the source code.
I want to try this new method on a segmentation problem I have, to avoid having to weight the classes by hand.
When I try it, the loss approaches 0 very fast, going down to less than 2e-7 within a single epoch. I use it as follows (see the sketch at the end of this post for the general shape of the setup).
When I just use plain cross-entropy, the model trains fine.
Any ideas on what might be causing this?
Cheers,
Johannes
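For reference, a minimal sketch of the kind of setup described above: per-pixel cross-entropy with `reduction='none'`, followed by max-pooling over the resulting loss map. The names (`model`, `images`, `masks`) and the `max_pooled_loss` helper are placeholders; the snippet used in the original report may differ.

```python
import torch
import torch.nn.functional as F

def max_pooled_loss(pixel_losses, ratio=0.3):
    """Keep only the hardest `ratio` fraction of per-pixel losses."""
    k = max(1, int(ratio * pixel_losses.numel()))
    return torch.topk(pixel_losses.flatten(), k).values.mean()

def train_step(model, images, masks, optimizer, ratio=0.3):
    """One update: per-pixel CE (reduction='none'), then max-pooling over the map."""
    optimizer.zero_grad()
    logits = model(images)                                        # (N, C, H, W)
    pixel_ce = F.cross_entropy(logits, masks, reduction='none')   # (N, H, W)
    loss = max_pooled_loss(pixel_ce, ratio=ratio)
    loss.backward()
    optimizer.step()
    return loss.item()
```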