
Commit 9fc85d6: changes for docstrings
gupta-abhay committed Aug 13, 2024 (1 parent: 95f8497)
Showing 1 changed file with 6 additions and 3 deletions.
9 changes: 6 additions & 3 deletions llmfoundry/models/hf/hf_causal_lm.py
@@ -24,6 +24,7 @@
     AutoConfig,
     AutoModelForCausalLM,
     GenerationConfig,
+    PretrainedConfig,
     PreTrainedModel,
     PreTrainedTokenizerBase,
 )
@@ -194,8 +195,8 @@ def build_inner_model(
         config_overrides: Dict[str, Any],
         load_in_8bit: bool,
         pretrained: bool,
-        config_fn: Optional[Callable] = AutoConfig,
-        model_fn: Optional[Callable] = AutoModelForCausalLM,
+        config_fn: Optional[PretrainedConfig] = AutoConfig,
+        model_fn: Optional[PreTrainedModel] = AutoModelForCausalLM,
         prepare_for_fsdp: bool = False,
     ) -> Union[PreTrainedModel, 'PeftModel']:
         """Builds the inner model for the ComposerHFCausalLM.
@@ -210,7 +211,9 @@
             config_overrides (Dict[str, Any]): The configuration overrides.
             load_in_8bit (bool): Whether to load in 8-bit.
             pretrained (bool): Whether the model is pretrained.
-            prepare_for_fsdp (bool, optional): Whether to prepare the model for FSDP wrapping. Default: False.
+            config_fn (PretrainedConfig): HF class for configs. Default: ``AutoConfig``.
+            model_fn (PreTrainedModel): HF class for models. Default: ``AutoModelForCausalLM``.
+            prepare_for_fsdp (bool, optional): Whether to prepare the model for FSDP wrapping. Default: ``False``.
 
         Returns:
             Union[PreTrainedModel, 'PeftModel']: The built inner model.
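For illustration, a minimal sketch of the pattern the new `config_fn`/`model_fn` parameters describe: an HF config class (default ``AutoConfig``) builds the config, and an HF model class (default ``AutoModelForCausalLM``) instantiates the inner model from it. This is not the actual `build_inner_model` implementation from llm-foundry; the helper name and the reduced argument list here are assumptions for the example, and the annotations simply mirror the diff above.

```python
from typing import Any, Dict, Optional

from transformers import (
    AutoConfig,
    AutoModelForCausalLM,
    PretrainedConfig,
    PreTrainedModel,
)


def build_inner_model_sketch(
    pretrained_model_name_or_path: str,
    config_overrides: Dict[str, Any],
    pretrained: bool,
    # Annotations mirror the diff above; the defaults are the HF Auto classes.
    config_fn: Optional[PretrainedConfig] = AutoConfig,
    model_fn: Optional[PreTrainedModel] = AutoModelForCausalLM,
) -> PreTrainedModel:
    """Hypothetical helper showing how config_fn/model_fn could be used."""
    # Build the config with the supplied HF config class, applying overrides.
    config = config_fn.from_pretrained(
        pretrained_model_name_or_path,
        **config_overrides,
    )
    if pretrained:
        # Load pretrained weights; the Auto class resolves the concrete
        # architecture from the checkpoint's config.
        return model_fn.from_pretrained(
            pretrained_model_name_or_path,
            config=config,
        )
    # Otherwise initialize the architecture from the config alone.
    return model_fn.from_config(config)


# Example usage with the defaults: build GPT-2 with a dropout override.
# model = build_inner_model_sketch('gpt2', {'attn_pdrop': 0.0}, pretrained=True)
```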
