bridge.diffusion.conversion.flux.flux_hf_pretrained#
Module Contents#
Classes#
| Class | Description |
|---|---|
| `FluxSafeTensorsStateSource` | FLUX-specific state source that writes exported HF shards under `transformer/`. |
| `PreTrainedFlux` | Lightweight pretrained wrapper for Diffusers FLUX models. |
API#
- class bridge.diffusion.conversion.flux.flux_hf_pretrained.FluxSafeTensorsStateSource#
Bases: `megatron.bridge.models.hf_pretrained.state.SafeTensorsStateSource`
FLUX-specific state source that writes exported HF shards under `transformer/`.
- save_generator(generator, output_path, strict: bool = True)#
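The real `save_generator` serializes safetensors shards, but its key behavior is that every exported shard lands under the `transformer/` subfolder of the output path. A minimal sketch, assuming a generator of `(shard_name, raw_bytes)` pairs (the function name and payload type are illustrative, not the actual API):

```python
import tempfile
from pathlib import Path


def save_shards_under_transformer(generator, output_path):
    """Write each (shard_name, payload) pair under '<output_path>/transformer/'.

    Illustrative stand-in for the documented behavior; the real method
    writes safetensors shards rather than raw bytes.
    """
    dest = Path(output_path) / "transformer"
    dest.mkdir(parents=True, exist_ok=True)  # ensure the subfolder exists
    written = []
    for shard_name, payload in generator:
        shard_file = dest / shard_name
        shard_file.write_bytes(payload)
        written.append(shard_file)
    return written


# Demo: two mock shards end up inside the transformer/ subdirectory.
with tempfile.TemporaryDirectory() as tmp:
    shards = [
        ("model-00001-of-00002.safetensors", b"\x00"),
        ("model-00002-of-00002.safetensors", b"\x01"),
    ]
    files = save_shards_under_transformer(iter(shards), tmp)
    print([f.parent.name for f in files])  # ['transformer', 'transformer']
```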
- class bridge.diffusion.conversion.flux.flux_hf_pretrained.PreTrainedFlux(model_name_or_path: Union[str, pathlib.Path], **kwargs)#
Bases: `megatron.bridge.models.hf_pretrained.base.PreTrainedBase`
Lightweight pretrained wrapper for Diffusers FLUX models.
Provides access to FLUX config and state through the common PreTrainedBase API so bridges can consume `.config` and `.state` uniformly.

NOTE: Because FLUX uses HF's Diffusers library, whose checkpoint directory structure differs from that of HF's Transformers library, a wrapper is needed to load the model weights and config from the correct subdirectory (e.g., `./transformer`). The Diffusers layout contains every component of the diffusion pipeline (VAE, text encoders, etc.); the actual transformer weights are stored in the `./transformer` directory. The wrapper therefore adjusts the input and output path directories accordingly, and overrides the `save_artifacts` method to save the correct config files to the corresponding directory.
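The path adjustment described above can be sketched with plain `pathlib`: given a Diffusers pipeline checkpoint, resolve the `transformer/` subfolder when it exists, otherwise assume the path already points at the transformer component. The helper name `resolve_transformer_dir` is illustrative, not part of the documented API:

```python
import json
import tempfile
from pathlib import Path


def resolve_transformer_dir(model_name_or_path) -> Path:
    """Return the directory holding the transformer weights/config.

    In a Diffusers pipeline checkpoint, components live in sibling
    subdirectories and the transformer is under 'transformer/'.
    """
    root = Path(model_name_or_path)
    sub = root / "transformer"
    return sub if sub.is_dir() else root


# Demo with a mock Diffusers-style checkpoint directory.
with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    for component in ("vae", "text_encoder", "transformer"):
        (root / component).mkdir()
    (root / "transformer" / "config.json").write_text(
        json.dumps({"_class_name": "FluxTransformer2DModel"})
    )
    resolved = resolve_transformer_dir(root)
    print(resolved.name)  # transformer
```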
Initialization
- property model_name_or_path: str#
- _load_model() → diffusers.FluxTransformer2DModel#
- _load_config() → transformers.AutoConfig#
- property state: megatron.bridge.models.hf_pretrained.state.StateDict#
FLUX-specific StateDict that reads safetensors from the fixed ‘transformer/’ subfolder.
- save_artifacts(save_directory: Union[str, pathlib.Path])#
Save FLUX artifacts (currently config) alongside exported weights. Writes transformer/config.json into the destination.
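A minimal sketch of the documented behavior, writing a config dict to `transformer/config.json` under the destination. The function name and config contents are placeholders; the real method pulls the config from the loaded model:

```python
import json
import tempfile
from pathlib import Path


def save_artifacts_sketch(config: dict, save_directory) -> Path:
    """Write the transformer config to '<save_directory>/transformer/config.json'."""
    target = Path(save_directory) / "transformer"
    target.mkdir(parents=True, exist_ok=True)  # create transformer/ if missing
    config_path = target / "config.json"
    config_path.write_text(json.dumps(config, indent=2))
    return config_path


# Demo: the config lands next to where the exported shards would go.
with tempfile.TemporaryDirectory() as tmp:
    path = save_artifacts_sketch({"_class_name": "FluxTransformer2DModel"}, tmp)
    saved = json.loads(path.read_text())
    print(path.parent.name, saved["_class_name"])
```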