
Commit 48b2fc4

Update src/transformers/training_args.py
Co-authored-by: amyeroberts <[email protected]>
helloworld1 and amyeroberts committed Apr 19, 2024
1 parent: 3dddc8c · commit: 48b2fc4
Showing 1 changed file with 1 addition and 1 deletion.
src/transformers/training_args.py (2 changes: 1 addition, 1 deletion)

@@ -515,7 +515,7 @@ class TrainingArguments:
     ensure they are the same across all ranks after initialization
 - cpu_ram_efficient_loading (`bool`, *optional*, defaults to `False`)
     If `True`, only the first process loads the pretrained model checkpoint while all other processes
-    have empty weights. When this setting is `True`, `sync_module_states` also must to be `True`,
+    have empty weights. When this setting as `True`, `sync_module_states` also must to be `True`,
     otherwise all the processes except the main process would have random weights leading to unexpected
     behaviour during training.
 - activation_checkpointing (`bool`, *optional*, defaults to `False`):
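For context, the two options described in this docstring are keys of `fsdp_config` in `TrainingArguments`. Below is a minimal sketch of how they are typically enabled together; the argument values (output directory, FSDP sharding options) are illustrative and not taken from this commit.

```python
from transformers import TrainingArguments

# Minimal sketch: cpu_ram_efficient_loading requires sync_module_states,
# so that ranks other than rank 0 (which start with empty weights)
# receive the real parameters broadcast from rank 0 before training.
args = TrainingArguments(
    output_dir="out",                       # illustrative path
    fsdp="full_shard auto_wrap",            # enable FSDP sharding
    fsdp_config={
        "cpu_ram_efficient_loading": True,  # only the first process loads the checkpoint
        "sync_module_states": True,         # must also be True, per the docstring above
    },
)
```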
