bridge.models.bailing.bailing_moe2_bridge#

Megatron Bridge for Ling MoE2 Model.

This module provides the bridge implementation for converting between HuggingFace Bailing MoE2 models and Megatron-Core format.

Supported models:

  • inclusionAI/Ling-mini-base-2.0-5T

  • inclusionAI/Ling-mini-base-2.0-10T

  • inclusionAI/Ling-mini-base-2.0-15T

  • inclusionAI/Ling-mini-base-2.0-20T

  • inclusionAI/Ling-mini-base-2.0

  • inclusionAI/Ling-mini-2.0

  • inclusionAI/Ling-flash-base-2.0

  • inclusionAI/Ling-flash-2.0

  • inclusionAI/Ling-1T

Module Contents#

Classes#

BailingMoeV2Bridge

Megatron Bridge for Ling MoE V2 Model

Data#

API#

bridge.models.bailing.bailing_moe2_bridge.logger#

'getLogger(…)'

class bridge.models.bailing.bailing_moe2_bridge.BailingMoeV2Bridge#

Bases: megatron.bridge.models.conversion.model_bridge.MegatronModelBridge

Megatron Bridge for Ling MoE V2 Model

Example#

    from megatron.bridge import AutoBridge

    bridge = AutoBridge.from_hf_pretrained("inclusionAI/Ling-mini-2.0")
    provider = bridge.to_megatron_provider()

provider_bridge(
    hf_pretrained: megatron.bridge.models.hf_pretrained.causal_lm.PreTrainedCausalLM,
) → megatron.bridge.models.gpt_provider.GPTModelProvider#

mapping_registry() → megatron.bridge.models.conversion.mapping_registry.MegatronMappingRegistry#
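To illustrate the idea behind `mapping_registry()` — translating HuggingFace parameter names into their Megatron-Core counterparts — here is a minimal, self-contained sketch. All class, rule, and parameter names below are hypothetical stand-ins for illustration; the real `MegatronMappingRegistry` and the actual Ling MoE2 weight names differ.

```python
import re


class SimpleMappingRegistry:
    """Hypothetical sketch of a name-mapping registry (not the real API).

    Each rule pairs an HF-side name pattern with a Megatron-side template;
    "*" in a pattern captures one dotted segment, e.g. a layer index.
    """

    def __init__(self, rules):
        self._rules = [
            # Escape the literal parts, then turn "*" into a capture group.
            (re.compile("^" + re.escape(pat).replace(r"\*", r"([^.]+)") + "$"), tmpl)
            for pat, tmpl in rules
        ]

    def to_megatron(self, hf_name):
        for pattern, template in self._rules:
            m = pattern.match(hf_name)
            if m:
                # Substitute each captured segment into the template in order.
                out = template
                for group in m.groups():
                    out = out.replace("*", group, 1)
                return out
        raise KeyError(f"no mapping rule for {hf_name!r}")


# Illustrative rules only — not the actual Ling MoE2 mapping.
registry = SimpleMappingRegistry([
    ("model.embed_tokens.weight", "embedding.word_embeddings.weight"),
    ("model.layers.*.mlp.gate.weight", "decoder.layers.*.mlp.router.weight"),
])

print(registry.to_megatron("model.layers.3.mlp.gate.weight"))
# decoder.layers.3.mlp.router.weight
```

In the real bridge, `mapping_registry()` returns a registry of such per-parameter mappings, and `provider_bridge()` uses the HF config to build a matching `GPTModelProvider`; this sketch only shows the pattern-matching shape of the first half.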