Incorrect formula for "Diachronic Word Embeddings Reveal Statistical Laws of Semantic Change" paper? #8

Open
mapa17 opened this issue Jun 25, 2021 · 0 comments

Comments

mapa17 commented Jun 25, 2021

Could it be that the rotation matrix described in the summary of the paper "Diachronic Word Embeddings Reveal Statistical Laws of Semantic Change", in the embeddings chapter of the NLP course, is incorrect?

The summary states that the rotation maximizes the distance between the two projected vectors. Looking at equation 4 in the paper, I think it should be a minimization.
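For what it's worth, as I read it equation 4 is the orthogonal Procrustes objective, R(t) = argmin_{Q: Q^T Q = I} ||W(t) Q - W(t+1)||_F, i.e. a minimization of the Frobenius distance between the aligned embedding matrices. Below is a minimal numpy sketch of that alignment (the function name `align_embeddings` and the variable names `W_t` / `W_next` are my own, not from the paper or the course):

```python
import numpy as np

def align_embeddings(W_t: np.ndarray, W_next: np.ndarray) -> np.ndarray:
    """Rotate W_t onto W_next with the orthogonal Procrustes solution.

    Finds R = argmin_{Q: Q^T Q = I} ||W_t @ Q - W_next||_F.
    The closed-form minimizer is R = U @ Vt, where
    U, _, Vt = svd(W_t.T @ W_next).
    """
    U, _, Vt = np.linalg.svd(W_t.T @ W_next)
    R = U @ Vt
    return W_t @ R

# Toy check: the aligned matrix is never farther from W_next than the raw one,
# since the identity is itself a feasible rotation in the minimization.
rng = np.random.default_rng(0)
W_t, W_next = rng.normal(size=(5, 3)), rng.normal(size=(5, 3))
assert (np.linalg.norm(align_embeddings(W_t, W_next) - W_next)
        <= np.linalg.norm(W_t - W_next))
```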
