
Learning cost becomes stagnant after 300 iterations (epochs) #8

Open
dipanjan06 opened this issue Aug 5, 2016 · 0 comments
Comments

@dipanjan06

Hello,

First of all, thank you for publishing this great code base. I have trained the model for 300 epochs, but the learning cost has not decreased much and has stayed around 1.5 to 1.6. Did you run into this kind of situation? I am running the code on my laptop, which has an NVIDIA GPU (920M), so a single epoch takes a long time to complete (100 epochs took about 40 hours). I need your advice on whether I should continue training up to 1000 epochs, or whether some trick is needed to make further progress.
Please note that after 300 epochs the model is still not able to generate proper captions for random images (from Facebook, Instagram, etc.). I believe the cost needs to come down further to get a usable model.
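One common remedy for a plateauing cost is to decay the learning rate once the loss stops improving. This is only a sketch of the general technique, not code from this repository (the training script isn't shown here): the `PlateauLRDecay` class and its parameters are hypothetical, and the returned `lr` would have to be fed into whatever optimizer the training loop actually uses.

```python
class PlateauLRDecay:
    """Halve the learning rate when the loss has not improved for `patience` epochs.

    Generic sketch; all names and defaults here are illustrative assumptions.
    """

    def __init__(self, lr=0.01, factor=0.5, patience=10, min_delta=1e-3):
        self.lr = lr                  # current learning rate
        self.factor = factor          # multiplicative decay (0.5 = halve)
        self.patience = patience      # epochs without improvement before decaying
        self.min_delta = min_delta    # minimum decrease that counts as improvement
        self.best = float("inf")      # best loss seen so far
        self.stalled = 0              # consecutive epochs without improvement

    def step(self, loss):
        # An improvement must beat the best loss by at least min_delta.
        if loss < self.best - self.min_delta:
            self.best = loss
            self.stalled = 0
        else:
            self.stalled += 1
            if self.stalled >= self.patience:
                self.lr *= self.factor   # decay and restart the patience counter
                self.stalled = 0
        return self.lr
```

Calling `step(epoch_loss)` once per epoch would then shrink the learning rate whenever the cost stays flat for `patience` epochs, which often lets training escape the kind of 1.5–1.6 plateau described above.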

Thanks in advance.

Regards,
Dipanjan
