core.transformer.pipeline_parallel_layer_layout#
Module Contents#
Classes#
PipelineParallelLayerLayout | Configuration of custom pipeline parallel layer partitioning.
Data#
API#
- core.transformer.pipeline_parallel_layer_layout.logger#
'getLogger(…)'
- class core.transformer.pipeline_parallel_layer_layout.PipelineParallelLayerLayout(
- layout: str | list,
- pipeline_model_parallel_size: int,
- )#
Configuration of custom pipeline parallel layer partitioning.
Initialization
Initialize PipelineParallelLayerLayout from a list or a string. Format validation is performed here.
- __repr__() -> str#
- validate_layer_layout(num_layers: int, mtp_num_layers: int)#
Check whether the layout is valid.
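The checks themselves are not documented here, but a minimal sketch of what a layout validation plausibly covers is counting layer codes against the model configuration. The function name `check_layout`, the single-character codes (`E` embedding, `t` decoder, `m` MTP, `L` loss), and the assertion messages are illustrative assumptions, not the library's actual implementation:

```python
def check_layout(layout: list[list[str]], num_layers: int, mtp_num_layers: int) -> None:
    """Hypothetical validation sketch for a parsed layer layout.

    Checks that the total number of decoder ('t') and MTP ('m') entries
    matches the configured layer counts, and that the layout contains
    exactly one embedding ('E') and one loss ('L') entry.
    """
    flat = [code for stage in layout for code in stage]
    assert flat.count("t") == num_layers, "decoder layer count mismatch"
    assert flat.count("m") == mtp_num_layers, "MTP layer count mismatch"
    assert flat.count("E") == 1 and flat.count("L") == 1, (
        "layout needs exactly one embedding and one loss entry"
    )
```

A layout failing any check would raise an `AssertionError` before model construction, which is cheaper than discovering a mismatch mid-build.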
- get_num_layers_to_build(
- layer_type: megatron.core.transformer.enums.LayerType = LayerType.decoder,
- vp_stage: Optional[int] = None,
- pp_rank: Optional[int] = None,
- )#
Get the number of layers to build in the pipeline stage.
- get_layer_offset(
- layer_type: megatron.core.transformer.enums.LayerType = LayerType.decoder,
- vp_stage: Optional[int] = None,
- pp_rank: Optional[int] = None,
- )#
Get the layer offset in the pipeline stage.
- get_layer_id_list(
- layer_type: megatron.core.transformer.enums.LayerType = LayerType.decoder,
- vp_stage: Optional[int] = None,
- pp_rank: Optional[int] = None,
- )#
Get the list of layer_id for each layer in the pipeline stage.
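A hedged sketch of how per-stage layer-id lists (and, implicitly, offsets) can be derived from a parsed layout: each stage receives consecutive global ids for its decoder (`t`) entries, with the offset being the count of decoder entries in all earlier stages. The helper name `decoder_layer_ids` and the layer codes are assumptions for illustration; the real methods additionally account for `layer_type`, virtual pipeline stages (`vp_stage`), and `pp_rank`:

```python
def decoder_layer_ids(layout: list[list[str]]) -> list[list[int]]:
    """Hypothetical sketch: assign consecutive global layer ids to the
    decoder ('t') entries of each stage, in stage order.

    Non-decoder codes (E, m, L) receive no id. The first id of each
    stage equals that stage's layer offset.
    """
    ids: list[list[int]] = []
    next_id = 0
    for stage in layout:
        count = stage.count("t")  # decoder layers held by this stage
        ids.append(list(range(next_id, next_id + count)))
        next_id += count
    return ids
```

For example, a stage list `[["E","t","t"], ["t","t"], ["m"], ["L"]]` yields ids `[[0, 1], [2, 3], [], []]`: the embedding, MTP, and loss stages build no decoder layers.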
- pretty_repr()#
Pretty representation of the custom layout, showing the layers held by each stage.
Example:
             VPP rank 0           VPP rank 1
PP rank 0    embedding,decoder*2  decoder*2
PP rank 1-13 decoder*2            decoder*2
PP rank 14   decoder*2            mtp
PP rank 15   decoder*2            loss
- static from_str(layout, pipeline_model_parallel_size)#
Parse the pipeline model parallel layout from a string.
- static get_num_stages_from_str(layout: str)#
Get the number of PP * VPP stages from a layout string.
- static parse_str_to_list(layout_str: str)#
Parse a layout string to a list of lists. Example: "Ettt|(tt|)*29,m|L" will be parsed to [["E","t","t","t"]]+[["t","t"]]*29+[["m"],["L"]]
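The example above can be reproduced with a minimal re-implementation sketch, assuming that `(...)​*N` denotes N-fold repetition of the parenthesized fragment and that both `|` and `,` delimit stages (empty segments left by a trailing `|` inside a repeat group are dropped). The function name `parse_layout_str` is hypothetical:

```python
import re


def parse_layout_str(layout_str: str) -> list[list[str]]:
    """Hypothetical sketch of the layout-string parser.

    Expands repeat groups like "(tt|)*29" into 29 copies of "tt|",
    then splits the result into stages on '|' and ',' and each stage
    into single-character layer codes. Nested groups are not handled.
    """
    def expand(m: re.Match) -> str:
        # m.group(1) is the fragment inside the parentheses,
        # m.group(2) is the repeat count after '*'.
        return m.group(1) * int(m.group(2))

    expanded = re.sub(r"\(([^()]*)\)\*(\d+)", expand, layout_str)
    # Both separators delimit stages; drop empties from trailing '|'.
    stages = [seg for seg in re.split(r"[|,]", expanded) if seg]
    return [list(stage) for stage in stages]
```

Running it on the documented example, `parse_layout_str("Ettt|(tt|)*29,m|L")` yields `[["E","t","t","t"]] + [["t","t"]]*29 + [["m"],["L"]]`, matching the docstring.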