aiq.llm.nim_llm#
Classes#
| NIMModelConfig | An NVIDIA Inference Microservice (NIM) LLM provider to be used with an LLM client. |
Functions#
| nim_model | |
Module Contents#
- class NIMModelConfig(/, **data: Any)#
Bases: aiq.data_models.llm.LLMBaseConfig
An NVIDIA Inference Microservice (NIM) LLM provider to be used with an LLM client.
Create a new model by parsing and validating input data from keyword arguments.
Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.
`self` is explicitly positional-only to allow `self` as a field name.
- model_config#
Configuration for the model; it should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- max_tokens: pydantic.PositiveInt = None#
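The class behaves like a standard pydantic model: keyword arguments are parsed and validated, and invalid input raises a `ValidationError`. Below is a minimal sketch, assuming `max_tokens` is the only field that needs to be set; any other required fields of `NIMModelConfig` are not shown on this page and would also have to be supplied.

```python
import pydantic

from aiq.llm.nim_llm import NIMModelConfig

# Keyword arguments are parsed and validated by pydantic. max_tokens is a
# PositiveInt, so it must be an integer greater than zero. Any required
# fields not documented on this page would also need to be passed here.
config = NIMModelConfig(max_tokens=512)

try:
    # A non-positive value fails validation of the PositiveInt field.
    NIMModelConfig(max_tokens=-1)
except pydantic.ValidationError as exc:
    print(exc)
```

A successfully validated config object is what the provider function below expects as its llm_config argument.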
- async nim_model(llm_config: NIMModelConfig, builder: aiq.builder.builder.Builder)#
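Only the signature of nim_model is documented on this page, not its body. The following is a purely hypothetical skeleton illustrating the shape of an async provider with that signature; the real decorators, return type, and registration mechanism are defined elsewhere in the toolkit and are not reproduced here.

```python
from aiq.builder.builder import Builder
from aiq.llm.nim_llm import NIMModelConfig


# Hypothetical skeleton only: the actual implementation of nim_model is part
# of the aiq toolkit and is not shown on this page.
async def nim_model_sketch(llm_config: NIMModelConfig, builder: Builder) -> None:
    # llm_config is the validated NIM configuration documented above;
    # builder is the toolkit Builder instance passed alongside it.
    ...
```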