Commit

Fix `low_cpu_mem_usage` Flag Conflict with DeepSpeed Zero 3 in `from_pretrained` for Models with `keep_in_fp32_modules` (huggingface#27762)

Fix `from_pretrained` logic for `low_cpu_mem_usage` with DeepSpeed Zero3
kotarotanahashi authored and staghado committed Jan 15, 2024
1 parent 3a9373a commit aa209ad
Showing 1 changed file with 1 addition and 1 deletion.
src/transformers/modeling_utils.py (1 addition, 1 deletion)

@@ -3466,7 +3466,7 @@ def from_pretrained(

         # Check first if we are `from_pt`
         if use_keep_in_fp32_modules:
-            if is_accelerate_available():
+            if is_accelerate_available() and not is_deepspeed_zero3_enabled():
                 low_cpu_mem_usage = True
                 keep_in_fp32_modules = model._keep_in_fp32_modules
             else:
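The effect of the one-line guard can be sketched as a standalone function. This is a hypothetical stand-in for illustration only: the real logic lives inside `transformers`' `from_pretrained` and calls its `is_accelerate_available` and `is_deepspeed_zero3_enabled` helpers, which are replaced here by plain boolean parameters.

```python
def resolve_low_cpu_mem_usage(
    use_keep_in_fp32_modules: bool,
    accelerate_available: bool,
    deepspeed_zero3_enabled: bool,
    low_cpu_mem_usage: bool = False,
) -> bool:
    """Return the effective low_cpu_mem_usage flag (illustrative sketch).

    Before this commit, the flag was forced to True whenever accelerate
    was available. The added check leaves it untouched under DeepSpeed
    ZeRO-3, where forcing low_cpu_mem_usage conflicts with DeepSpeed's
    own partitioned model initialization.
    """
    if use_keep_in_fp32_modules:
        if accelerate_available and not deepspeed_zero3_enabled:
            low_cpu_mem_usage = True
    return low_cpu_mem_usage
```

With ZeRO-3 enabled, the flag is no longer silently flipped to `True`, so models with `keep_in_fp32_modules` can load under DeepSpeed Zero 3 without the conflicting code path.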
