aiq.llm.nim_llm

Classes

NIMModelConfig

An NVIDIA Inference Microservice (NIM) LLM provider to be used with an LLM client.

Functions

nim_model(llm_config, builder)

Module Contents

class NIMModelConfig(/, **data: Any)

Bases: aiq.data_models.llm.LLMBaseConfig

An NVIDIA Inference Microservice (NIM) LLM provider to be used with an LLM client.

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

model_config

Configuration for the model; should be a dictionary conforming to pydantic.config.ConfigDict.

api_key: str | None = None
base_url: str | None = None
model_name: str = None
temperature: float = None
top_p: float = None
max_tokens: pydantic.PositiveInt = None
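A minimal sketch of constructing the config directly from keyword arguments, as the pydantic-style constructor above allows. The model name, endpoint URL, and field values below are placeholder assumptions for illustration, not defaults documented on this page.

```python
import pydantic

from aiq.llm.nim_llm import NIMModelConfig

# Assumed example values; substitute the NIM model and endpoint you actually use.
llm_config = NIMModelConfig(
    model_name="meta/llama-3.1-8b-instruct",         # assumed example model id
    base_url="https://integrate.api.nvidia.com/v1",  # assumed NIM endpoint URL
    api_key=None,                                     # or supply a key from your secret store
    temperature=0.0,
    top_p=1.0,
    max_tokens=1024,
)

# max_tokens is annotated as pydantic.PositiveInt, so a non-positive value is
# rejected with the ValidationError mentioned in the class docstring above.
try:
    NIMModelConfig(model_name="meta/llama-3.1-8b-instruct", max_tokens=0)
except pydantic.ValidationError as exc:
    print(exc)
```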
async nim_model(llm_config: NIMModelConfig, builder: aiq.builder.builder.Builder)
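This page documents only the call shape of nim_model, not its return value or how the toolkit invokes it, so the sketch below merely confirms the documented (llm_config, builder) signature rather than calling the provider; treat anything beyond that as unspecified here.

```python
import inspect

from aiq.llm.nim_llm import nim_model

# Prints the documented call shape, e.g.
# (llm_config: NIMModelConfig, builder: aiq.builder.builder.Builder)
print(inspect.signature(nim_model))
```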