nemo_automodel._transformers.registry#
Module Contents#
Classes#
- _LazyArchMapping: Lazy-loading mapping from architecture name to model class.
Functions#
Data#
API#
- nemo_automodel._transformers.registry.logger#
‘getLogger(…)’
- nemo_automodel._transformers.registry.MODEL_ARCH_MAPPING#
‘OrderedDict(…)’
- nemo_automodel._transformers.registry._CUSTOM_CONFIG_REGISTRATIONS: Dict[str, Tuple[str, str]]#
None
- nemo_automodel._transformers.registry._register_custom_configs() → None#
- class nemo_automodel._transformers.registry._LazyArchMapping(auto_map: Union[collections.OrderedDict, Dict[str, tuple], None] = None)#
Lazy-loading mapping from architecture name to model class.
Inspired by HuggingFace transformers' _LazyAutoMapping. Entries from the static auto_map are imported on first access and cached. Additional entries can be added at runtime via register.
Initialization
- _load(key: str) → Type[torch.nn.Module]#
- __contains__(key: str) → bool#
- __getitem__(key: str) → Type[torch.nn.Module]#
- __setitem__(key: str, value: Type[torch.nn.Module]) → None#
- register(key: str, value: Type[torch.nn.Module], exist_ok: bool = False)#
Register a model class under the given architecture name.
- has_tag(key: str, tag: str) → bool#
Return True if key was registered with tag.
- keys_with_tag(tag: str) → set#
Return all architecture names that have tag.
- keys()#
- __len__() → int#
- __repr__() → str#
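The lazy-load-and-cache behavior, together with register and the tag helpers, can be sketched with a self-contained reimplementation. This is illustrative only: it assumes an auto_map of architecture name → (module path, class name) pairs in the style of the HF _LazyAutoMapping the docstring cites, and the tags argument to register is likewise an assumption.

```python
import importlib
from collections import OrderedDict


class LazyArchMapping:
    """Illustrative sketch: import model classes on first access, then cache."""

    def __init__(self, auto_map=None):
        self._auto_map = OrderedDict(auto_map or {})  # name -> (module path, class name)
        self._cache = {}                              # name -> imported/registered class
        self._tags = {}                               # name -> set of tags

    def _load(self, key):
        # Import the class lazily and memoize it for later lookups.
        module_path, cls_name = self._auto_map[key]
        cls = getattr(importlib.import_module(module_path), cls_name)
        self._cache[key] = cls
        return cls

    def __contains__(self, key):
        return key in self._cache or key in self._auto_map

    def __getitem__(self, key):
        if key in self._cache:
            return self._cache[key]
        if key in self._auto_map:
            return self._load(key)
        raise KeyError(key)

    def register(self, key, value, exist_ok=False, tags=()):
        # Runtime registration; refuses to overwrite unless exist_ok=True.
        if key in self and not exist_ok:
            raise ValueError(f"{key!r} is already registered")
        self._cache[key] = value
        self._tags[key] = set(tags)

    def has_tag(self, key, tag):
        return tag in self._tags.get(key, set())

    def keys_with_tag(self, tag):
        return {k for k, tags in self._tags.items() if tag in tags}


# Example: map an architecture name to a stdlib class, imported lazily.
mapping = LazyArchMapping({"OrderedDict": ("collections", "OrderedDict")})
assert "OrderedDict" in mapping
assert mapping["OrderedDict"] is OrderedDict  # imported on first access, then cached
```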
- class nemo_automodel._transformers.registry._ModelRegistry#
- model_arch_name_to_cls: nemo_automodel._transformers.registry._LazyArchMapping#
‘field(…)’
- _retrieval_archs: set#
‘field(…)’
- __post_init__()#
- property supported_models#
- get_model_cls_from_model_arch(model_arch: str)#
- has_custom_model(arch_name: str) → bool#
Return True if arch_name has a custom (non-HF) implementation.
- has_retrieval_model(arch_name: str) → bool#
Return True if arch_name is a registered retrieval/encoder architecture.
- register_retrieval(arch_name: str) → None#
Mark arch_name as a retrieval/encoder architecture.
- resolve_custom_model_cls(architecture: str, config)#
Return the custom model class if it exists and supports config, else None.
Custom model classes may define a supports_config(config) classmethod to opt out for specific HF configs (e.g. a Mistral3 VLM with a dense Ministral3 text backbone instead of the expected Mistral4 MoE+MLA).
- register(arch_name: str, model_cls: Type[torch.nn.Module], exist_ok: bool = False)#
Register a custom model class for a given architecture name.
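The supports_config opt-out used by resolve_custom_model_cls can be illustrated with a minimal resolver. The class and architecture names (CustomMoEModel, MyMoEForCausalLM) and the num_local_experts attribute are hypothetical; only the resolution rule itself (no registered class → None, class opts out for this config → None) mirrors the documented behavior.

```python
from types import SimpleNamespace


class CustomMoEModel:
    """Hypothetical custom implementation (names are illustrative only)."""

    @classmethod
    def supports_config(cls, config):
        # Opt out when the checkpoint's config is not the variant this
        # implementation targets (here: it requires an MoE backbone).
        return getattr(config, "num_local_experts", 0) > 0


def resolve_custom_model_cls(architecture, config, custom_models):
    """Sketch of the resolution rule described above."""
    cls = custom_models.get(architecture)
    if cls is None:
        return None  # no custom implementation registered for this architecture
    supports = getattr(cls, "supports_config", None)
    if supports is not None and not supports(config):
        return None  # class opted out for this config; caller falls back to HF
    return cls


custom_models = {"MyMoEForCausalLM": CustomMoEModel}
moe_cfg = SimpleNamespace(num_local_experts=8)
dense_cfg = SimpleNamespace()  # no experts: the custom class opts out

assert resolve_custom_model_cls("MyMoEForCausalLM", moe_cfg, custom_models) is CustomMoEModel
assert resolve_custom_model_cls("MyMoEForCausalLM", dense_cfg, custom_models) is None
```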
- nemo_automodel._transformers.registry.get_registry()#
- nemo_automodel._transformers.registry.ModelRegistry#
‘get_registry(…)’
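The module-level ModelRegistry bound to get_registry(…) suggests a process-wide singleton. A minimal sketch of that pattern follows; the _ModelRegistry body here is a stand-in, not the real class.

```python
class _ModelRegistry:
    """Stand-in for the real registry, which wraps a _LazyArchMapping."""

    def __init__(self):
        self.model_arch_name_to_cls = {}


_REGISTRY = None


def get_registry():
    """Return the process-wide registry, creating it on first call."""
    global _REGISTRY
    if _REGISTRY is None:
        _REGISTRY = _ModelRegistry()
    return _REGISTRY


# Module-level alias, mirroring the `ModelRegistry` data entry above.
ModelRegistry = get_registry()

assert get_registry() is ModelRegistry  # every call yields the same instance
```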