Dear Author, I have a naive question, which parameter controls the number of iterations when training the embeddings? I learned from your paper that 5k is generally sufficient but I would like to do some experimenting myself.
I thought it was `model.params.timesteps`. I tried changing it to 5, but training did not stop after 5 epochs.
Any help would be greatly appreciated. Thank you!
I think I figured it out. It is more about how to use the PyTorch Lightning trainer. I think the parameter that controls it is:
trainer:
  max_steps: <# of steps I want>
I'm not sure about the difference between a "step" and an "epoch". It seems like a "step" means one batch within an epoch? Anyway, thank you for maintaining the repository, and I will keep learning :)
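For anyone else confused by step vs. epoch: in PyTorch Lightning, a "step" is one optimizer update on a single batch, so the number of steps per epoch depends on dataset size and batch size. A minimal sketch of the relationship (plain Python, no Lightning needed; the dataset and batch sizes below are made-up numbers, not values from this repository):

```python
import math

def steps_per_epoch(num_samples: int, batch_size: int, drop_last: bool = False) -> int:
    """Number of batches (i.e. training steps) in one pass over the dataset."""
    if drop_last:
        return num_samples // batch_size  # partial final batch discarded
    return math.ceil(num_samples / batch_size)  # partial final batch kept

# Hypothetical numbers: 10,000 samples, batch size 32.
per_epoch = steps_per_epoch(10_000, 32)   # 313 batches per epoch
# With max_steps = 5000, training stops partway through the 16th epoch:
epochs_for_5k = 5_000 / per_epoch          # ~15.97 epochs
print(per_epoch, round(epochs_for_5k, 2))
```

With `max_steps` set, Lightning stops after that many optimizer steps regardless of epoch boundaries; `max_epochs` instead counts full passes over the dataset.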