
Q: Incorrect neural network dimensionality? #1

Open
gil2rok opened this issue Sep 27, 2024 · 0 comments

gil2rok commented Sep 27, 2024

I think the input dimension of the scale and translate neural networks $s$ and $t$ may be incorrect.

The original RealNVP paper states that the networks $s$ and $t$ map $R^d \rightarrow R^{D-d}$ for some $d < D$. In this code, the data (moons and normal) is $D=2$. However, the code defines the input layers of networks $s$ and $t$ with input dimension $2$ instead of $d=1 < D$. I believe this is a mistake that has not been detected despite the popularity of this repo.
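To make the shapes concrete, here is a minimal numpy sketch (not the repo's actual code; the single linear maps standing in for the networks are my own toy stand-ins) of an affine coupling step with the dimensionality the paper describes, $s, t: R^d \rightarrow R^{D-d}$ with $D=2$, $d=1$:

```python
import numpy as np

# Toy setup: D = 2 dimensional data, split at d = 1.
D, d = 2, 1
rng = np.random.default_rng(0)

# Stand-in "networks" s and t: a single linear map each, with input
# dimension d and output dimension D - d, as in the paper.
W_s, b_s = rng.normal(size=(D - d, d)), np.zeros(D - d)
W_t, b_t = rng.normal(size=(D - d, d)), np.zeros(D - d)
s = lambda x1: W_s @ x1 + b_s
t = lambda x1: W_t @ x1 + b_t

def coupling_forward(x):
    """Affine coupling: pass x[:d] through unchanged, transform x[d:]."""
    x1, x2 = x[:d], x[d:]
    y2 = x2 * np.exp(s(x1)) + t(x1)  # s and t only ever see the d-dim part
    return np.concatenate([x1, y2])

x = rng.normal(size=D)
y = coupling_forward(x)
assert y.shape == (D,) and np.allclose(y[:d], x[:d])  # identity on first d dims
```

The point is that nothing in the forward pass ever feeds a $D$-dimensional vector into $s$ or $t$; their input layers only need width $d$.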

Screenshot from this code example: [image omitted]

Screenshot from the paper: [image omitted]

This mistake is easy to make because a later equation in the paper suggests that the input dimension of networks $s$ and $t$ should be $D$ rather than some $d < D$. In the screenshot below, the terms $s(b \cdot x)$ and $t(b \cdot x)$ are misleading: although $b \cdot x \in R^D$, we actually want to pass in only the non-masked elements of $x$, which lie in $R^d$.

Screenshot from the paper: [image omitted]

If I'm making a mistake, please let me know!
