Reduce GPU memory usage when using FSDP+PEFT (#28830)
support FSDP+PEFT
pacman100 authored Feb 2, 2024
1 parent f497795 commit 80d5007
Showing 1 changed file with 2 additions and 0 deletions.
src/transformers/trainer.py — 2 additions, 0 deletions
```diff
@@ -1697,6 +1697,8 @@ def _inner_training_loop(
         use_accelerator_prepare = True if model is self.model else False

         if delay_optimizer_creation:
+            if use_accelerator_prepare:
+                self.model = self.accelerator.prepare(self.model)
             self.create_optimizer_and_scheduler(num_training_steps=max_steps)

         # prepare using `accelerator` prepare
```
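The two added lines change the order of operations when optimizer creation is delayed (as it is under FSDP): the model is now wrapped by `accelerator.prepare` *before* `create_optimizer_and_scheduler` runs, so the optimizer allocates its state over each rank's parameter shard instead of the full, unwrapped model. A pure-Python sketch of the memory arithmetic (no torch/accelerate; all names and numbers below are hypothetical, for illustration only):

```python
# Toy illustration of why wrapping before optimizer creation saves memory.
# Under FSDP, each rank holds roughly 1/WORLD_SIZE of the parameters, so an
# optimizer created AFTER wrapping only allocates state for that shard.
WORLD_SIZE = 4            # number of ranks/GPUs (assumed)
FULL_PARAMS = 1_000_000   # total trainable parameter count (assumed)

def optimizer_state_elems(num_params: int) -> int:
    """Adam-style optimizers keep two moment buffers per parameter."""
    return 2 * num_params

# Old order: optimizer built on the full model -> full-size state per rank.
state_before_fix = optimizer_state_elems(FULL_PARAMS)

# Order after this commit: wrap (shard) first, then build the optimizer,
# so each rank's optimizer sees only its shard of the parameters.
shard_params = FULL_PARAMS // WORLD_SIZE
state_after_fix = optimizer_state_elems(shard_params)

print(state_before_fix)  # 2000000 elements of optimizer state per rank
print(state_after_fix)   # 500000 elements of optimizer state per rank
```

The same ordering principle applies generally to sharded training: any per-parameter state (optimizer moments, gradient accumulators) should be created against the sharded view of the model, which is what deferring `create_optimizer_and_scheduler` until after `accelerator.prepare` achieves.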
