
Multiple Dropouts different from Original Paper and Denny Britz #44

Open
gaganchane opened this issue Apr 4, 2020 · 0 comments
Thank you for sharing your code for the Keras implementation. I had a question about the dropout layers that are added. In the original paper, dropout is applied only once, after the convolution layer, with a rate of 0.5; the same is true in Denny Britz's implementation. In your implementation, a dropout of 0.5 is added after the embedding layer and a dropout of 0.8 is added after the convolution layer.
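To make the comparison concrete, here is a minimal numpy sketch of inverted dropout (the scheme Keras uses, where `rate` is the fraction of units zeroed out). The layer outputs `emb` and `conv` are stand-in arrays, not this repo's actual tensors; the two placements below mirror the description above.

```python
import numpy as np

def dropout(x, rate, rng):
    # Inverted dropout: zero out a `rate` fraction of units at train time
    # and rescale the survivors by 1/(1-rate) so the expected sum is unchanged.
    keep = 1.0 - rate
    mask = rng.random(x.shape) < keep
    return np.where(mask, x / keep, 0.0)

rng = np.random.default_rng(0)
emb = np.ones((2, 8))   # stand-in for the embedding layer's output
conv = np.ones((2, 8))  # stand-in for the convolution layer's output

# Placement described in this issue: two dropouts.
emb_dropped = dropout(emb, 0.5, rng)    # after the embedding layer
conv_dropped = dropout(conv, 0.8, rng)  # after the convolution layer

# Kim (2014) / Denny Britz placement: a single 0.5 dropout after the conv block.
paper_style = dropout(conv, 0.5, rng)
```

One thing worth double-checking: in Keras, `Dropout(rate)` drops `rate` of the units, whereas older TensorFlow's `tf.nn.dropout` took a *keep* probability, so a `0.8` carried over from such code would mean keeping 80%, i.e. dropping only 20%.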

I just want to confirm whether this is a deliberate deviation from the above two sources, and what the reasoning behind it was.

Thanks!
