bridge.models.glm.glm45_bridge#

Module Contents#

Classes#

GLM45Bridge

Megatron Bridge for GLM 4.5 Models.

Data#

API#

bridge.models.glm.glm45_bridge.logger#

'getLogger(…)'

class bridge.models.glm.glm45_bridge.GLM45Bridge#

Bases: megatron.bridge.models.conversion.model_bridge.MegatronModelBridge

Megatron Bridge for GLM 4.5 Models.

This bridge handles the conversion between HuggingFace Glm4MoeForCausalLM (used for GLM 4.5 models) and Megatron-Core GPTModel formats.

Example

```python
from megatron.bridge import AutoBridge

bridge = AutoBridge.from_hf_pretrained("zai-org/GLM-4.5")
provider = bridge.to_megatron_provider()
```

provider_bridge(
hf_pretrained: megatron.bridge.models.hf_pretrained.causal_lm.PreTrainedCausalLM,
) → megatron.bridge.models.gpt_provider.GPTModelProvider#

Convert HuggingFace config to GPTModelProvider.
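The translation performed by provider_bridge can be pictured as mapping fields from a HuggingFace-style config onto a Megatron-style provider config. The toy below sketches that pattern only; FakeHFConfig and FakeProviderConfig are invented stand-ins, not the real megatron.bridge types, and the real field mapping is not reproduced here.

```python
from dataclasses import dataclass


@dataclass
class FakeHFConfig:
    # Stand-in for a HuggingFace config (invented for illustration).
    hidden_size: int
    num_hidden_layers: int
    num_attention_heads: int


@dataclass
class FakeProviderConfig:
    # Stand-in for a GPTModelProvider-style config (invented for illustration).
    hidden_size: int
    num_layers: int
    num_attention_heads: int


def provider_bridge(hf_config: FakeHFConfig) -> FakeProviderConfig:
    # The bridge's job is translating between the two naming schemes,
    # e.g. HF's num_hidden_layers becomes Megatron's num_layers.
    return FakeProviderConfig(
        hidden_size=hf_config.hidden_size,
        num_layers=hf_config.num_hidden_layers,
        num_attention_heads=hf_config.num_attention_heads,
    )


cfg = provider_bridge(FakeHFConfig(hidden_size=4096, num_hidden_layers=46, num_attention_heads=96))
print(cfg.num_layers)
```

In the real bridge this conversion is driven internally (e.g. via AutoBridge.to_megatron_provider, as in the Example above) rather than called by hand.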

build_conversion_tasks(hf_pretrained, megatron_model)#

Override to store the configuration before mapping_registry is called.

mapping_registry() → megatron.bridge.models.conversion.mapping_registry.MegatronMappingRegistry#
_uses_fused_experts() → bool#
_hf_expert_suffix(base_name: str) → str#
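The page gives only the signature of _hf_expert_suffix, not its rules. As a toy illustration of what an expert-name helper might produce for a MoE model, the sketch below assumes the common HuggingFace `.experts.{i}.` naming convention; that convention and the `down_proj` projection name are assumptions for illustration, not taken from this page.

```python
def expert_names(base_name: str, num_experts: int) -> list[str]:
    # Expand a base parameter name into per-expert HF-style names,
    # assuming the ".experts.{i}." MoE layout (an assumption here).
    return [f"{base_name}.experts.{i}.down_proj.weight" for i in range(num_experts)]


print(expert_names("model.layers.3.mlp", 2))
```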
maybe_modify_converted_hf_weight(
task,
converted_weights_dict: dict[str, torch.Tensor],
hf_state_dict,
) → dict[str, torch.Tensor]#
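Going by its signature, maybe_modify_converted_hf_weight acts as a post-conversion hook: it receives the dict of converted weights and returns a (possibly modified) dict. The toy below sketches only that hook shape; the renaming rule is invented for illustration, and plain float lists stand in for torch.Tensor values.

```python
def maybe_modify(converted: dict[str, list[float]]) -> dict[str, list[float]]:
    # Hook pattern: walk the converted weights, optionally rewrite
    # entries, and return the resulting dict.
    out: dict[str, list[float]] = {}
    for name, weight in converted.items():
        # Invented rule, purely for illustration: collapse a duplicated
        # ".weight.weight" segment in key names.
        out[name.replace(".weight.weight", ".weight")] = weight
    return out


weights = {"model.layers.0.mlp.weight.weight": [0.1, 0.2]}
print(sorted(maybe_modify(weights)))
```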