Commit

Fix inplace loss computation
qgallouedec committed Dec 25, 2024
1 parent 24c91f0 commit 992fac6
Showing 1 changed file with 1 addition and 1 deletion: src/transformers/trainer.py
@@ -3700,7 +3700,7 @@ def training_step(
         else:
             # Finally we need to normalize the loss for reporting
             if num_items_in_batch is None:
-                loss /= self.args.gradient_accumulation_steps
+                loss = loss / self.args.gradient_accumulation_steps

         self.accelerator.backward(loss, **kwargs)
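Why the change matters: in Python, `loss /= n` invokes the in-place operator when the object defines `__itruediv__` (as PyTorch tensors do), mutating the original object, so any other reference to that tensor (for example, one kept for logging or reporting) silently sees the rescaled value; with PyTorch, in-place ops can also raise autograd errors in some cases. `loss = loss / n` instead binds a new object and leaves the original untouched. A minimal pure-Python sketch of the aliasing difference, using a toy `Loss` class (a hypothetical stand-in, not PyTorch):

```python
class Loss:
    """Toy stand-in for a tensor supporting in-place and out-of-place division."""

    def __init__(self, value):
        self.value = value

    def __itruediv__(self, n):
        # In-place path: mutates self, like `tensor /= n` in PyTorch.
        self.value /= n
        return self

    def __truediv__(self, n):
        # Out-of-place path: returns a fresh object, original is untouched.
        return Loss(self.value / n)


# In-place division mutates the shared object.
loss = Loss(8.0)
logged = loss        # another reference, e.g. kept for reporting
loss /= 4
print(logged.value)  # 2.0 -- the logged value was silently rescaled

# Out-of-place division rebinds `loss` and preserves the original.
loss = Loss(8.0)
logged = loss
loss = loss / 4
print(logged.value)  # 8.0 -- the logged value is preserved
```

Normalizing out of place costs one extra allocation but avoids corrupting any other handle to the unscaled loss before `backward` is called.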

