bridge.utils.fusions#
Fusion capability checks for Megatron models.
This module provides functions to check if various fusion optimizations can be enabled based on the current environment and dependencies.
Module Contents#
Functions#
| Function | Description |
|---|---|
| `can_enable_apply_rope_fusion` | Check if RoPE (Rotary Position Embedding) fusion can be enabled. |
| `can_enable_gradient_accumulation_fusion` | Check if gradient accumulation fusion can be enabled. |
| `can_enable_bias_dropout_fusion` | Check if bias dropout fusion can be enabled. |
| `can_enable_masked_softmax_fusion` | Check if masked softmax fusion can be enabled. |
| `validate_rope_fusion_compatibility` | Validate if RoPE fusion is compatible with the current model configuration. |
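These capability checks can be combined to decide which fusions to request before building a model configuration. The sketch below is a minimal, illustrative example: it assumes `bridge.utils.fusions` is importable in the current environment, and the dictionary keys are readable labels rather than authoritative configuration field names.

```python
# Minimal sketch: query each fusion capability check and report the result.
# Assumes bridge.utils.fusions is importable; the dictionary keys are
# illustrative labels, not authoritative configuration field names.
from bridge.utils.fusions import (
    can_enable_apply_rope_fusion,
    can_enable_bias_dropout_fusion,
    can_enable_gradient_accumulation_fusion,
    can_enable_masked_softmax_fusion,
)

fusion_availability = {
    "apply_rope_fusion": can_enable_apply_rope_fusion(),
    "bias_dropout_fusion": can_enable_bias_dropout_fusion(),
    "gradient_accumulation_fusion": can_enable_gradient_accumulation_fusion(),
    "masked_softmax_fusion": can_enable_masked_softmax_fusion(),
}

for name, available in fusion_availability.items():
    print(f"{name}: {'available' if available else 'unavailable'}")
```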
Data#
API#
- bridge.utils.fusions.logger#
'getLogger(...)'
- bridge.utils.fusions.LOG_FUSION_DISABLE#
None
- bridge.utils.fusions.can_enable_apply_rope_fusion() → bool#
Check if RoPE (Rotary Position Embedding) fusion can be enabled.
- Returns:
True if RoPE fusion is available and compatible.
- Return type:
bool
- bridge.utils.fusions.can_enable_gradient_accumulation_fusion() → bool#
Check if gradient accumulation fusion can be enabled.
- Returns:
True if gradient accumulation fusion is available.
- Return type:
bool
- bridge.utils.fusions.can_enable_bias_dropout_fusion() → bool#
Check if bias dropout fusion can be enabled.
- Returns:
True if bias dropout fusion is available.
- Return type:
bool
- bridge.utils.fusions.can_enable_masked_softmax_fusion() → bool#
Check if masked softmax fusion can be enabled.
- Returns:
True if masked softmax fusion kernels are available.
- Return type:
bool
- bridge.utils.fusions.validate_rope_fusion_compatibility(config: megatron.core.transformer.transformer_config.TransformerConfig) → bool#
Validate if RoPE fusion is compatible with the current model configuration.
- Parameters:
config – The transformer configuration to validate.
- Returns:
True if RoPE fusion is compatible, False otherwise.
- Return type:
bool
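As a hypothetical usage sketch, a configuration that requests RoPE fusion can be validated before training and downgraded to the unfused path if the environment does not support it. The `TransformerConfig` arguments below are illustrative placeholders; a real configuration carries many more fields.

```python
# Hypothetical usage: validate RoPE fusion against a model configuration and
# fall back to the unfused implementation if it is not compatible.
# The TransformerConfig arguments here are illustrative placeholders.
from megatron.core.transformer.transformer_config import TransformerConfig

from bridge.utils.fusions import validate_rope_fusion_compatibility

config = TransformerConfig(
    num_layers=2,
    hidden_size=128,
    num_attention_heads=4,
    apply_rope_fusion=True,
)

if not validate_rope_fusion_compatibility(config):
    config.apply_rope_fusion = False  # use the unfused RoPE path instead
```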