nemo_automodel.components.distributed.pipelining.hf_utils#

Module Contents#

Functions#

get_text_module

Return the nested text/LLM module if present, else the model itself.

create_pipeline_forward_inner

create_pipeline_forward_causal_lm

patch_hf_model_for_pp

Patch a HF model/module to produce a pipeline-compatible forward.

init_hf_model_buffers

validate_hf_model_for_pipeline_support

Validate that a model is compatible with torch.distributed.pipelining.

Data#

API#

nemo_automodel.components.distributed.pipelining.hf_utils.logger#

‘getLogger(…)’

nemo_automodel.components.distributed.pipelining.hf_utils.TEXT_MODULE_ATTRS#

(‘language_model’, ‘text_model’, ‘text_decoder’)

nemo_automodel.components.distributed.pipelining.hf_utils.MULTIMODAL_SUFFIXES#

(‘vision_tower’, ‘visual’, ‘image_encoder’, ‘vision_encoder’, ‘audio_tower’, ‘audio_encoder’, ‘audio…

nemo_automodel.components.distributed.pipelining.hf_utils.get_text_module(model: torch.nn.Module) → torch.nn.Module#

Return the nested text/LLM module if present, else the model itself.

nemo_automodel.components.distributed.pipelining.hf_utils.create_pipeline_forward_inner(
model_class_name: str = 'AutoModel',
) → Callable#
nemo_automodel.components.distributed.pipelining.hf_utils.create_pipeline_forward_causal_lm() → Callable#
nemo_automodel.components.distributed.pipelining.hf_utils.patch_hf_model_for_pp(
model,
patch_inner_model: bool = True,
patch_causal_lm_model: bool = True,
) → None#

Patch a HF model/module to produce a pipeline-compatible forward.

  • If the model has a .model attribute (e.g., LlamaForCausalLM), patch both inner and outer.

  • Otherwise, patch the module itself.

nemo_automodel.components.distributed.pipelining.hf_utils.init_hf_model_buffers(
model: torch.nn.Module,
device: torch.device,
) → None#
nemo_automodel.components.distributed.pipelining.hf_utils.validate_hf_model_for_pipeline_support(model: torch.nn.Module) → None#

Validate that a model is compatible with torch.distributed.pipelining.