nat.llm.huggingface_inference_llm#
Classes#

- HuggingFaceInferenceLLMConfig — HuggingFace Inference API LLM provider for remote model inference.

Functions#

- huggingface_inference_provider — Register HuggingFace Inference API as an LLM provider.
Module Contents#
- class HuggingFaceInferenceLLMConfig#
Bases: nat.data_models.llm.LLMBaseConfig, nat.data_models.retry_mixin.RetryMixin, nat.data_models.optimizable.OptimizableMixin, nat.data_models.thinking_mixin.ThinkingMixin

HuggingFace Inference API LLM provider for remote model inference.

Supports:

- Serverless Inference API (default)
- Dedicated Inference Endpoints (via endpoint_url)
- Self-hosted TGI servers (via endpoint_url)
- model_config#
- api_key: nat.data_models.common.OptionalSecretStr = None#
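
For illustration, a workflow configuration entry using this provider might look like the sketch below. Only api_key and endpoint_url are documented on this page; the _type value, the llms section name, and the model_name field are assumptions about the surrounding toolkit's configuration schema, not confirmed here:

```yaml
llms:
  hf_llm:
    # Hypothetical provider key -- the registered _type string is an
    # assumption; only api_key and endpoint_url appear on this page.
    _type: huggingface_inference
    model_name: my-org/my-model        # assumed field name, illustrative value
    api_key: ${HF_TOKEN}               # optional; defaults to None
    # endpoint_url: https://my-tgi-host:8080   # for dedicated endpoints or self-hosted TGI
```

Omitting endpoint_url would select the serverless Inference API path described above; setting it points the provider at a dedicated endpoint or self-hosted TGI server.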
- async huggingface_inference_provider(
  - config: HuggingFaceInferenceLLMConfig,
  - _builder: nat.builder.builder.Builder,
  )#

  Register HuggingFace Inference API as an LLM provider.