bridge.models.gemma.gemma_bridge#
Module Contents#
Classes#
GemmaBridge: Megatron Bridge for Gemma Causal LM.
API#
- class bridge.models.gemma.gemma_bridge.GemmaBridge#
Bases: megatron.bridge.models.conversion.model_bridge.MegatronModelBridge
Megatron Bridge for Gemma Causal LM.
This bridge handles the conversion between HuggingFace GemmaForCausalLM and Megatron-Core GPTModel formats, including weight mappings and configuration translation.
As a user, you would not use this bridge directly, but through AutoBridge.
Example:

from megatron.bridge import AutoBridge
bridge = AutoBridge.from_hf_pretrained("google/gemma-2b")
provider = bridge.to_megatron_provider()
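A slightly expanded sketch of the same flow, with the intermediate steps commented. The attribute read at the end (num_layers) is an assumption about the provider's configuration fields, shown only for illustration.

from megatron.bridge import AutoBridge

# AutoBridge inspects the checkpoint's architecture and dispatches to
# GemmaBridge for Gemma models, so this class is used without being
# imported directly.
bridge = AutoBridge.from_hf_pretrained("google/gemma-2b")

# The provider carries the Megatron-Core configuration translated from
# the HuggingFace config.
provider = bridge.to_megatron_provider()

# Hypothetical attribute access, assuming the provider exposes standard
# GPT configuration fields such as num_layers.
print(provider.num_layers)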
- provider_bridge(
hf_pretrained: megatron.bridge.models.hf_pretrained.causal_lm.PreTrainedCausalLM,
)#
Convert HuggingFace config to GemmaModelProvider.
- Parameters:
hf_pretrained – HuggingFace pretrained model wrapper
- Returns:
Configured provider for Megatron model
- Return type:
GemmaModelProvider
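A minimal sketch of calling provider_bridge directly. In normal use AutoBridge constructs the PreTrainedCausalLM wrapper for you; the from_pretrained classmethod used here is an assumption about that wrapper's interface.

from megatron.bridge.models.gemma.gemma_bridge import GemmaBridge
from megatron.bridge.models.hf_pretrained.causal_lm import PreTrainedCausalLM

# Assumed entry point for the HF wrapper; check the PreTrainedCausalLM
# documentation for the actual constructor.
hf_pretrained = PreTrainedCausalLM.from_pretrained("google/gemma-2b")

bridge = GemmaBridge()

# Translate the HuggingFace Gemma config into a configured Megatron provider.
provider = bridge.provider_bridge(hf_pretrained)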
- mapping_registry() -> megatron.bridge.models.conversion.mapping_registry.MegatronMappingRegistry#
Return MegatronMappingRegistry containing parameter mappings from HF to Megatron format.
- Returns:
Registry of parameter mappings
- Return type:
MegatronMappingRegistry
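A hedged sketch of inspecting the parameter mappings. The mappings attribute and the printed representation are assumptions about the registry's interface, used here only to illustrate that the registry describes HF-to-Megatron parameter-name correspondences.

from megatron.bridge.models.gemma.gemma_bridge import GemmaBridge

bridge = GemmaBridge()
registry = bridge.mapping_registry()

# Hypothetical traversal: assumes the registry exposes an iterable of
# mapping objects relating Megatron parameter names to HF parameter names.
for mapping in getattr(registry, "mappings", []):
    print(mapping)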