Get Started with NVIDIA AgentIQ#
This guide will help you set up your development environment, run existing workflows, and create your own custom workflows using the `aiq` command-line interface.
Supported LLM APIs:#
- NIM (such as Llama-3.1-70b-instruct and Llama-3.3-70b-instruct)
- OpenAI
Supported LLM Frameworks:#
- LangChain
- LlamaIndex
- CrewAI
- Semantic Kernel
Installing AgentIQ#
To run the examples, you need to install AgentIQ from source. For instructions, refer to Install From Source.
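As a rough sketch, a from-source install typically looks like the following. The repository URL and the use of `uv` here are assumptions for illustration; follow the Install From Source guide for the authoritative steps and supported versions.

```bash
# Assumed steps for a from-source install; verify the repository URL and
# tooling against the Install From Source guide.
git clone https://github.com/NVIDIA/AgentIQ.git agentiq
cd agentiq

# Create and activate a virtual environment, then install the package and its dependencies
uv venv --seed .venv
source .venv/bin/activate
uv sync
```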
Obtaining API Keys#
Depending on which workflows you are running, you may need to obtain API keys from the respective services. Most AgentIQ workflows require an NVIDIA API key defined with the `NVIDIA_API_KEY` environment variable. An API key can be obtained by visiting build.nvidia.com and creating an account.
Running Example Workflows#
Before running any of the AgentIQ examples, set your NVIDIA API key as an environment variable to access NVIDIA AI services.
```bash
export NVIDIA_API_KEY=<YOUR_API_KEY>
```
Note: Replace `<YOUR_API_KEY>` with your actual NVIDIA API key.
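Optionally, you can confirm the variable is visible to your shell and persist it for future sessions. This is generic shell usage rather than an AgentIQ requirement, so adjust it for your shell of choice.

```bash
# Fail with a message if the key is not set in the current shell
echo "${NVIDIA_API_KEY:?NVIDIA_API_KEY is not set}"

# Optionally persist the key for future sessions (bash shown; adjust for your shell)
echo 'export NVIDIA_API_KEY=<YOUR_API_KEY>' >> ~/.bashrc
```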
Running the Simple Workflow#
Install the `aiq_simple` Workflow:

```bash
uv pip install -e examples/simple
```
Run the `aiq_simple` Workflow:

```bash
aiq run --config_file=examples/simple/configs/config.yml --input "What is LangSmith"
```
Run and evaluate the `aiq_simple` Workflow:

The `eval_config.yml` YAML is a superset of the `config.yml`, containing additional fields for evaluation. To evaluate the `aiq_simple` workflow, run the following command:

```bash
aiq eval --config_file=examples/simple/configs/eval_config.yml
```
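Once the workflow is installed, you can re-run it with any question by changing the `--input` argument. The question below is only an illustrative placeholder.

```bash
# Re-run the installed workflow with a different input question
aiq run --config_file=examples/simple/configs/config.yml --input "What is LangChain?"
```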
Next Steps#
AgentIQ contains several examples that demonstrate how it can be used to build custom workflows and tools. These examples are located in the `examples` directory of the AgentIQ repository. Refer to the AgentIQ Guides for more detailed information on how to use AgentIQ.