# Instructions

Pseudocode for MiOC.

## MiOC

### To Start the Pretraining Process

```bash
python3 lightning_main_pretraining.py
```

### To Continue the Pretraining Process

- To resume pretraining from a checkpoint, set the following fields in the Checkpoint Config:

```python
retrain_saved_path: str = ""  # Path to the checkpoint to be loaded
retrain_from_checkpoint: str = "load_train"
```
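For illustration only, the sketch below shows one plausible way these two fields could be consumed when resuming a run. The `CheckpointConfig` dataclass, the `resolve_ckpt_path` helper, and the commented `trainer.fit(..., ckpt_path=...)` call are assumptions made for the example, not necessarily the code in this repository:

```python
# Minimal sketch, assuming a dataclass-style Hydra config.
# Only retrain_saved_path / retrain_from_checkpoint come from the README;
# the other names are hypothetical.
from dataclasses import dataclass
from typing import Optional


@dataclass
class CheckpointConfig:
    retrain_saved_path: str = ""                  # Path to the checkpoint to be loaded
    retrain_from_checkpoint: str = "load_train"   # "load_train" resumes training


def resolve_ckpt_path(cfg: CheckpointConfig) -> Optional[str]:
    # Resume only when a checkpoint path is given and resuming is requested.
    if cfg.retrain_from_checkpoint == "load_train" and cfg.retrain_saved_path:
        return cfg.retrain_saved_path
    return None


# In the training script this would typically be handed to Lightning, e.g.:
# trainer.fit(model, datamodule=dm, ckpt_path=resolve_ckpt_path(cfg))
```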

### To Start the Linear Classification Process

```bash
python3 lightning_main_lincls.py
```
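For context, a linear classification (linear probe) stage conventionally freezes the pretrained encoder and trains only a linear head on top of its features. The sketch below is a generic illustration of that idea, not the model code used in this repository; the encoder, feature dimension, and class count are placeholders:

```python
import torch
import torch.nn as nn


class LinearProbe(nn.Module):
    """Generic linear probe: frozen pretrained encoder + trainable linear head."""

    def __init__(self, encoder: nn.Module, feat_dim: int, num_classes: int):
        super().__init__()
        self.encoder = encoder
        for p in self.encoder.parameters():
            p.requires_grad = False          # keep pretrained weights fixed
        self.fc = nn.Linear(feat_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        with torch.no_grad():
            feats = self.encoder(x)          # features from the frozen encoder
        return self.fc(feats)                # only the linear head is trained
```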

### Modifications in Code

- We use Hydra to manage the configurations rather than argparse.
- To change the hyperparameters, edit them directly in the Pretrain Config, Linear Classifier Config, Wandb Config, and Trainer Config files.
- We use PyTorch Lightning to manage the training process.
- This PyTorch Lightning implementation should support DDP as well as single-GPU training (see the sketch below).
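As a minimal sketch of how the bullets above usually fit together, the example below shows a Hydra entry point driving a PyTorch Lightning `Trainer`. The config path, config name, and the `cfg.trainer.*` fields are assumptions for illustration and may differ from the actual configs in this repo:

```python
import hydra
from omegaconf import DictConfig
import pytorch_lightning as pl


@hydra.main(config_path="configs", config_name="pretrain_config", version_base=None)
def main(cfg: DictConfig) -> None:
    # Hydra populates cfg from the YAML config files; any field can also be
    # overridden on the command line, e.g.:
    #   python3 lightning_main_pretraining.py trainer.max_epochs=200
    trainer = pl.Trainer(
        max_epochs=cfg.trainer.max_epochs,   # hypothetical Trainer Config fields
        devices=cfg.trainer.devices,
        accelerator="gpu",
        # Lightning 2.x style: DDP across multiple GPUs, single-GPU otherwise.
        strategy="ddp" if cfg.trainer.devices > 1 else "auto",
    )
    # Model / datamodule construction is omitted here; see the actual entry
    # scripts (lightning_main_pretraining.py, lightning_main_lincls.py).
    # trainer.fit(model, datamodule=datamodule)


if __name__ == "__main__":
    main()
```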