bridge.models.deepseek.deepseek_v3_bridge#
Module Contents#
Classes#
- DeepSeekV3Bridge: Megatron Bridge for DeepSeek-V3.
API#
- class bridge.models.deepseek.deepseek_v3_bridge.DeepSeekV3Bridge#
Bases: megatron.bridge.models.conversion.model_bridge.MegatronModelBridge

Megatron Bridge for DeepSeek-V3.
As a user you would not use this bridge directly, but through AutoBridge.

Example:

```python
from megatron.bridge import AutoBridge

bridge = AutoBridge.from_hf_pretrained("deepseek-ai/DeepSeek-V3-Base", trust_remote_code=True)
provider = bridge.to_megatron_provider()
```
- provider_bridge(hf_pretrained: megatron.bridge.models.hf_pretrained.causal_lm.PreTrainedCausalLM)#
- mapping_registry() → megatron.bridge.models.conversion.mapping_registry.MegatronMappingRegistry#
- maybe_modify_converted_hf_weight(task: megatron.bridge.models.conversion.model_bridge.WeightConversionTask, converted_weights_dict: Dict[str, torch.Tensor], hf_state_dict: Mapping[str, torch.Tensor])#
Add the rotary embedding inverse frequency parameter if it is needed but not present in the converted weights. This is required for moonshotai-related models (e.g., Moonlight-16B-A3B-Instruct).
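The inverse-frequency tensor referred to above is the standard rotary-embedding (RoPE) quantity. A minimal sketch of how such a parameter can be reconstructed when a checkpoint lacks it; the helper name, the state-dict key, and the `head_dim`/`rope_theta` values below are illustrative assumptions, not the bridge's actual implementation (in practice these come from the model's HF config):

```python
import torch


def make_rope_inv_freq(head_dim: int, rope_theta: float = 10000.0) -> torch.Tensor:
    """Standard RoPE inverse frequencies: rope_theta^(-2i/d) for i in [0, d/2).

    Note: this helper and its defaults are illustrative; the real values
    would be read from the HF model config.
    """
    exponents = torch.arange(0, head_dim, 2, dtype=torch.float32) / head_dim
    return 1.0 / (rope_theta ** exponents)


# Fill in the parameter only when the converted state dict lacks it.
state_dict = {}  # stand-in for an HF state dict missing rotary inv_freq
key = "model.layers.0.self_attn.rotary_emb.inv_freq"  # hypothetical key
if key not in state_dict:
    state_dict[key] = make_rope_inv_freq(head_dim=64)
```

The frequencies decay geometrically from 1.0, so low dimensions rotate quickly and high dimensions slowly, which is what lets RoPE encode both short- and long-range relative positions.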