forked from jpata/particleflow
Learning rate schedules and Mamba layer (jpata#282)
* fix: update parameter files
* fix: better comet-ml logging
* update flatiron Ray Train submission scripts
* update sbatch script
* log the overridden config to comet-ml instead of the original
* fix: checkpoint loading; specify the full path to the checkpoint using --load-checkpoint
* feat: implement LR schedules in the PyTorch training code (sketched below)
* update sbatch scripts
* feat: LR schedules support checkpointing and resuming training (sketched below)
* update sbatch scripts
* update Ray Tune search space
* fix: dropout parameter not taking effect on the torch gnn-lsh model
* make more gnn-lsh parameters configurable
* make the activation function configurable (sketched below)
* update Ray Tune search space
* feat: add MambaLayer (sketched below)
* update Ray Tune search space
* update pyg-cms.yaml
* fix loading of the checkpoint when testing with a Ray Train-based run
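A minimal sketch of how LR schedules with checkpoint/resume support can be wired into a PyTorch training setup. The `get_lr_schedule` helper, the schedule names, and the checkpoint keys are illustrative assumptions, not the repository's actual API; only the torch scheduler classes themselves are real.

```python
# Sketch of LR-schedule construction plus checkpoint save/restore; the config
# strings ("constant", "cosinedecay", "onecycle") and checkpoint keys are
# assumptions for illustration.
import torch
from torch.optim.lr_scheduler import ConstantLR, CosineAnnealingLR, OneCycleLR

def get_lr_schedule(schedule, optimizer, lr, total_steps):
    # Map a config string to a torch scheduler; stepped once per optimizer step.
    if schedule == "constant":
        return ConstantLR(optimizer, factor=1.0, total_iters=total_steps)
    if schedule == "cosinedecay":
        return CosineAnnealingLR(optimizer, T_max=total_steps)
    if schedule == "onecycle":
        return OneCycleLR(optimizer, max_lr=lr, total_steps=total_steps)
    raise ValueError(f"unknown lr_schedule: {schedule}")

model = torch.nn.Linear(16, 4)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
scheduler = get_lr_schedule("onecycle", optimizer, lr=1e-3, total_steps=1000)

# For resuming to continue the schedule rather than restart it, the scheduler
# state must be saved and restored alongside the model and optimizer.
torch.save({
    "model": model.state_dict(),
    "optimizer": optimizer.state_dict(),
    "lr_scheduler": scheduler.state_dict(),
}, "checkpoint.pth")

ckpt = torch.load("checkpoint.pth")
model.load_state_dict(ckpt["model"])
optimizer.load_state_dict(ckpt["optimizer"])
scheduler.load_state_dict(ckpt["lr_scheduler"])
```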
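One way the activation function could be made configurable from the YAML config is a small string-to-module mapping; the key names and available activations below are assumptions, not the repository's actual option set.

```python
# Sketch: resolve an activation name from the config into a torch module.
import torch.nn as nn

ACTIVATIONS = {"relu": nn.ReLU, "elu": nn.ELU, "gelu": nn.GELU, "silu": nn.SiLU}

def get_activation(name):
    try:
        return ACTIVATIONS[name.lower()]()
    except KeyError:
        raise ValueError(f"unknown activation: {name}") from None

act = get_activation("gelu")  # e.g. from config["model"]["activation"]
```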
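A minimal sketch of a Mamba-based layer, assuming the `mamba-ssm` package (which requires a CUDA device). The wrapper class, its pre-norm residual structure, and the hyperparameter defaults are assumptions for illustration, not the repository's exact MambaLayer.

```python
# Sketch of a MambaLayer wrapper around the mamba_ssm selective state-space
# block; shapes are (batch, sequence, features).
import torch
import torch.nn as nn
from mamba_ssm import Mamba  # pip install mamba-ssm

class MambaLayer(nn.Module):
    def __init__(self, embedding_dim=128, dropout=0.1, d_state=16, d_conv=4, expand=2):
        super().__init__()
        self.norm = nn.LayerNorm(embedding_dim)
        self.mamba = Mamba(d_model=embedding_dim, d_state=d_state,
                           d_conv=d_conv, expand=expand)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x):
        # Pre-norm residual connection around the Mamba block.
        return x + self.dropout(self.mamba(self.norm(x)))

# Example: batch of 2 events, 100 particle candidates, 128 features each.
layer = MambaLayer(embedding_dim=128).cuda()
out = layer(torch.randn(2, 100, 128).cuda())
```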
Showing 21 changed files with 558 additions and 124 deletions.