bridge.training.flex_dispatcher_backend#
Module Contents#
Functions#
| Function | Description |
|---|---|
| `apply_flex_dispatcher_backend` | Apply DeepEP or HybridEP optimizations to the model config. |
| `validate_flex_dispatcher_backend` | Validate that DeepEP or HybridEP is supported for the current GPU architecture. |
Data#
API#
- bridge.training.flex_dispatcher_backend.logger: logging.Logger#
'getLogger(…)'
- bridge.training.flex_dispatcher_backend.apply_flex_dispatcher_backend(model_config: megatron.core.transformer.TransformerConfig, moe_flex_dispatcher_backend: str | None = None)#
Apply DeepEP or HybridEP optimizations to the model config.
DeepEP is applicable only to MoE models on Ampere and Hopper GPUs. HybridEP is applicable only to MoE models on GB200 GPUs with NVL72.
- bridge.training.flex_dispatcher_backend.validate_flex_dispatcher_backend(model_config: megatron.core.transformer.TransformerConfig)#
Validate that DeepEP or HybridEP is supported for the current GPU architecture.