

Config Parameter to Control the number of iterations? #146

Open
2454511550Lin opened this issue Apr 3, 2023 · 1 comment
Comments

@2454511550Lin

Dear Author, I have a naive question: which parameter controls the number of iterations when training the embeddings? I learned from your paper that 5k is generally sufficient, but I would like to do some experimenting myself.

I thought it was model.params.timesteps. I tried changing it to 5, but training did not stop after 5 epochs.

Any help would be greatly appreciated. Thank you!

@2454511550Lin
Author

I think I figured it out. It is more a matter of how to use the PyTorch Lightning trainer. The parameter that controls it is:

trainer:
  max_steps: <# of steps I want>

I am not sure about the difference between a "step" and an "epoch". It seems that a "step" means one batch within an epoch? Anyway, thank you for maintaining the repository, and I will keep learning :)
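For anyone else wondering: in Lightning terminology (my understanding, not stated in this repo's docs), a "step" is one optimizer update on a single batch, and an "epoch" is one full pass over the dataset, so steps per epoch is roughly ceil(dataset_size / batch_size). A quick sketch with made-up numbers:

```python
import math

# Illustrative values, not taken from the repository's config:
dataset_size = 10_000  # number of training examples
batch_size = 16        # examples per step (one optimizer update)

# One epoch = one full pass over the data, split into batches.
steps_per_epoch = math.ceil(dataset_size / batch_size)
print(steps_per_epoch)  # 625

# With trainer max_steps = 5000, training stops after this many epochs:
max_steps = 5000
print(max_steps / steps_per_epoch)  # 8.0
```

So a max_steps budget of 5000 here would stop training after 8 full epochs, regardless of any max_epochs setting, whichever limit is hit first.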
