
fix typo in reagent_lightning_module.py #712

Open · wants to merge 1 commit into base: main
2 changes: 1 addition & 1 deletion reagent/training/reagent_lightning_module.py
@@ -204,7 +204,7 @@ class StoppingEpochCallback(pl.Callback):
We use this callback to control the number of training epochs in incremental
training. Epoch & step counts are not reset in the checkpoint. If we were to set
`max_epochs` on the trainer, we would have to keep track of the previous `max_epochs`
-and add to it manually. This keeps the infomation in one place.
+and add to it manually. This keeps the information in one place.

Note that we need to set `_cleanly_stopped` back to True before saving the checkpoint.
This is done in `ModelManager.save_trainer()`.
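The docstring above describes the idea behind `StoppingEpochCallback`: because epoch counts carry over across incremental-training checkpoints, the stopping point is expressed as an absolute epoch target instead of bumping the trainer's `max_epochs` each run. A minimal, framework-agnostic sketch of that pattern (not the actual ReAgent implementation; `DummyTrainer` and its `fit` loop are stand-ins for illustration):

```python
class StoppingEpochCallback:
    """Stop training once an absolute epoch target is reached.

    `stop_epoch` is the total epoch count at which to stop; for
    incremental training the caller computes it as
    (epochs already trained in the checkpoint) + (epochs to add).
    """

    def __init__(self, stop_epoch: int):
        self.stop_epoch = stop_epoch

    def on_train_epoch_end(self, trainer) -> None:
        # current_epoch is 0-based during the epoch, so +1 gives the
        # number of epochs completed once this hook fires.
        if trainer.current_epoch + 1 >= self.stop_epoch:
            trainer.should_stop = True


class DummyTrainer:
    """Hypothetical stand-in for a training loop, for illustration only."""

    def __init__(self, callback):
        self.callback = callback
        self.current_epoch = 0   # carries over when resuming a checkpoint
        self.should_stop = False

    def fit(self, max_loop: int = 100) -> int:
        for _ in range(max_loop):
            # ... one epoch of training would run here ...
            self.callback.on_train_epoch_end(self)
            self.current_epoch += 1
            if self.should_stop:
                break
        return self.current_epoch
```

With this shape, resuming from a checkpoint needs no bookkeeping of the previous `max_epochs`: a fresh callback with `stop_epoch = already_trained + extra` stops at the right place even though `current_epoch` was restored, which is the "keeps the information in one place" point made in the docstring.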