core.models.gpt.moe_module_specs#
Module Contents#
Functions#
| Function | Description |
| --- | --- |
| `get_moe_module_spec` | Helper function to get module spec for MoE |
| `get_moe_module_spec_for_backend` | Helper function to get module spec for MoE |
API#
- core.models.gpt.moe_module_specs.get_moe_module_spec(
- use_te: Optional[bool] = True,
- num_experts: Optional[int] = None,
- moe_grouped_gemm: Optional[bool] = False,
- moe_use_legacy_grouped_gemm: Optional[bool] = False,
- )
Helper function to get module spec for MoE
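A minimal usage sketch, assuming a standard Megatron-Core installation; the expert count and flag values below are illustrative examples, not the function's defaults.

```python
# Illustrative sketch: build an MoE module spec with Transformer Engine
# kernels and grouped GEMM enabled. The expert count (8) is an example
# value chosen for the sketch, not a default of this function.
from megatron.core.models.gpt.moe_module_specs import get_moe_module_spec

moe_mlp_spec = get_moe_module_spec(
    use_te=True,                        # prefer Transformer Engine submodules
    num_experts=8,                      # number of routed experts (example value)
    moe_grouped_gemm=True,              # batch per-expert GEMMs into a grouped GEMM
    moe_use_legacy_grouped_gemm=False,  # stay on the current grouped-GEMM path
)
# The returned spec is plugged into a GPT layer spec wherever the MLP
# submodule spec is expected.
```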
- core.models.gpt.moe_module_specs.get_moe_module_spec_for_backend(
- backend: megatron.core.models.backends.BackendSpecProvider,
- num_experts: Optional[int] = None,
- moe_grouped_gemm: Optional[bool] = False,
- moe_use_legacy_grouped_gemm: Optional[bool] = False,
- use_te_activation_func: bool = False,
- )
Helper function to get module spec for MoE
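A companion sketch for the backend-based variant. `TESpecProvider` is assumed here to be one of the concrete `BackendSpecProvider` implementations in `megatron.core.models.backends`; check your installed Megatron-Core version for the providers it actually ships.

```python
# Illustrative sketch: the same MoE spec built through an explicit backend
# provider instead of the use_te flag. TESpecProvider is an assumed
# BackendSpecProvider implementation; argument values are example choices.
from megatron.core.models.backends import TESpecProvider
from megatron.core.models.gpt.moe_module_specs import get_moe_module_spec_for_backend

moe_mlp_spec = get_moe_module_spec_for_backend(
    backend=TESpecProvider(),           # Transformer Engine backed submodules
    num_experts=8,                      # number of routed experts (example value)
    moe_grouped_gemm=True,              # batch per-expert GEMMs into a grouped GEMM
    moe_use_legacy_grouped_gemm=False,  # stay on the current grouped-GEMM path
    use_te_activation_func=False,       # keep the default activation function path
)
```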