
Loss Function Compute #38

Open

zhangyanide opened this issue May 9, 2022 · 1 comment
Comments

@zhangyanide

In the definition of Gaussian2DLikelihood, you compute the density function. When the density is > 1, result = -torch.log(torch.clamp(result, min=epsilon)) becomes negative, so the loss is < 0. I thought a probability should lie between 0 and 1, so the cross entropy should be > 0. Is that right? I look forward to your reply.
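For illustration, here is a minimal sketch of this kind of loss (the function name and exact signature are my own, following the standard bivariate-Gaussian NLL used in Social-LSTM-style models, not necessarily the repo's exact code):

```python
import math
import torch

# Minimal sketch of a bivariate-Gaussian negative log-likelihood.
# Names and signature are illustrative, not the repo's exact code.
def gaussian_2d_nll(x, y, mux, muy, sx, sy, rho, epsilon=1e-20):
    normx = (x - mux) / sx
    normy = (y - muy) / sy
    z = normx ** 2 + normy ** 2 - 2 * rho * normx * normy
    neg_rho = 1 - rho ** 2
    # Density of the bivariate Gaussian at (x, y); a density is not
    # a probability, so it is not bounded above by 1.
    density = torch.exp(-z / (2 * neg_rho)) / (
        2 * math.pi * sx * sy * torch.sqrt(neg_rho)
    )
    return -torch.log(torch.clamp(density, min=epsilon))

# With small predicted standard deviations, the density at the target
# exceeds 1, so the NLL is negative -- still a valid likelihood loss,
# just not bounded below by 0 the way a discrete cross entropy is.
loss = gaussian_2d_nll(
    torch.tensor(0.0), torch.tensor(0.0),  # target point
    torch.tensor(0.0), torch.tensor(0.0),  # predicted mean
    torch.tensor(0.1), torch.tensor(0.1),  # sigma_x, sigma_y
    torch.tensor(0.0),                     # correlation rho
)
print(loss)  # ~ -2.767, i.e. -log(1 / (2 * pi * 0.01))
```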


llllys commented Jul 28, 2022

Computing a log-loss on a continuous variable is confusing in itself, but the original paper appears to mean exactly this, so I don't quite understand it... Besides, the value of a probability density function at a single point doesn't mean much on its own.
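A quick sketch of that last point: a density evaluated at a single point is not a probability and can exceed 1 (scipy is used here purely for illustration):

```python
from scipy.stats import norm

# The pdf of a narrow Gaussian at its mean is well above 1, even though
# the distribution still integrates to 1 overall.
print(norm(loc=0.0, scale=0.1).pdf(0.0))  # ~3.989
```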
