# LLMs

## Supported LLM Providers

NeMo Agent toolkit supports the following LLM providers:

| Provider | Type | Description |
|----------|------|-------------|
| NVIDIA NIM | `nim` | NVIDIA Inference Microservice (NIM) |
| OpenAI | `openai` | OpenAI API |
| AWS Bedrock | `aws_bedrock` | AWS Bedrock API |

## LLM Configuration

The LLM configuration is defined in the `llms` section of the workflow configuration file. The `_type` value selects the LLM provider, and the `model_name` value specifies the name of the model to use.

```yaml
llms:
  nim_llm:
    _type: nim
    model_name: meta/llama-3.1-70b-instruct
  openai_llm:
    _type: openai
    model_name: gpt-4o-mini
  aws_bedrock_llm:
    _type: aws_bedrock
    model_name: meta/llama-3.1-70b-instruct
    region_name: us-east-1
```

## NVIDIA NIM

The NIM LLM provider is defined by the `NIMModelConfig` class.

- `model_name` - The name of the model to use
- `temperature` - The sampling temperature to use for the model
- `top_p` - The top-p (nucleus sampling) value to use for the model
- `max_tokens` - The maximum number of tokens to generate
- `api_key` - The API key used to authenticate with the service
- `base_url` - The base URL of the model endpoint
- `max_retries` - The maximum number of retries for a failed request
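As a sketch, a NIM entry with these optional fields filled in might look like the following; the temperature, top-p, token limit, retry count, and endpoint URL are illustrative placeholders, not required values:

```yaml
llms:
  nim_llm:
    _type: nim
    model_name: meta/llama-3.1-70b-instruct
    temperature: 0.7        # illustrative sampling temperature
    top_p: 0.9              # illustrative nucleus-sampling value
    max_tokens: 1024        # illustrative generation limit
    max_retries: 3
    # Point base_url at your own NIM endpoint and supply your own key
    base_url: http://localhost:8000/v1
    api_key: your-nim-api-key   # placeholder
```

Only `_type` and `model_name` appear in the base example above; the remaining fields override provider defaults.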

## OpenAI

The OpenAI LLM provider is defined by the `OpenAIModelConfig` class.

- `model_name` - The name of the model to use
- `temperature` - The sampling temperature to use for the model
- `top_p` - The top-p (nucleus sampling) value to use for the model
- `max_tokens` - The maximum number of tokens to generate
- `seed` - The random seed to use for more reproducible outputs
- `api_key` - The API key used to authenticate with the service
- `base_url` - The base URL of the model endpoint
- `max_retries` - The maximum number of retries for a failed request
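A minimal sketch of an OpenAI entry using these fields; the sampling values, seed, and API key are illustrative placeholders:

```yaml
llms:
  openai_llm:
    _type: openai
    model_name: gpt-4o-mini
    temperature: 0.7          # illustrative sampling temperature
    top_p: 1.0                # illustrative nucleus-sampling value
    max_tokens: 512           # illustrative generation limit
    seed: 42                  # fixed seed for more reproducible outputs
    max_retries: 3
    api_key: your-openai-api-key  # placeholder; use your own key
```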

## AWS Bedrock

The AWS Bedrock LLM provider is defined by the `AWSBedrockModelConfig` class.

- `model_name` - The name of the model to use
- `temperature` - The sampling temperature to use for the model
- `max_tokens` - The maximum number of tokens to generate
- `context_size` - The context window size of the model
- `region_name` - The AWS region to use for the model
- `base_url` - The base URL of the model endpoint
- `credentials_profile_name` - The name of the AWS credentials profile to use
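A sketch of a Bedrock entry with these fields filled in; the sampling values, context size, and profile name are illustrative placeholders for your own AWS setup:

```yaml
llms:
  aws_bedrock_llm:
    _type: aws_bedrock
    model_name: meta/llama-3.1-70b-instruct
    temperature: 0.7          # illustrative sampling temperature
    max_tokens: 1024          # illustrative generation limit
    context_size: 8192        # illustrative context window size
    region_name: us-east-1
    credentials_profile_name: default  # profile from your AWS credentials file
```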