aiq.llm.nim_llm#
Classes#

| NIMModelConfig | An NVIDIA Inference Microservice (NIM) LLM provider to be used with an LLM client. |
Functions#

| nim_model | |
Module Contents#
- class NIMModelConfig(/, **data: Any)#
Bases: aiq.data_models.llm.LLMBaseConfig

An NVIDIA Inference Microservice (NIM) LLM provider to be used with an LLM client.
Create a new model by parsing and validating input data from keyword arguments.
Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

- model_config#

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- max_tokens: pydantic.PositiveInt = None#
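
As a brief, hedged illustration, the config behaves like any other pydantic model: fields are passed as keyword arguments and validated on construction. Only max_tokens is documented above; the class may also define required fields (such as a model name) that are not shown here, so the construction below is wrapped to surface any ValidationError.

```python
import pydantic

from aiq.llm.nim_llm import NIMModelConfig

# Hypothetical construction: only max_tokens is documented above, and the
# class may require additional fields (e.g. a model name) not shown here.
try:
    config = NIMModelConfig(max_tokens=512)
    print(config.max_tokens)
except pydantic.ValidationError as err:
    # Raised when the keyword arguments cannot be validated into a model,
    # for example a missing required field or a non-positive max_tokens.
    print(err)
```
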
- async nim_model(
- llm_config: NIMModelConfig,
- builder: aiq.builder.builder.Builder,