Customize a Workflow#
Prerequisites#
Set up your environment by following the instructions in the Install From Source section of the install guide.
Install NVIDIA NeMo Agent toolkit and the Simple example workflow.
```bash
uv pip install -e .
uv pip install -e examples/getting_started/simple_web_query
```
This tutorial assumes familiarity with workflows and the command line interface.
Customizing the examples/getting_started/simple_web_query Workflow#
The examples/getting_started/simple_web_query workflow is defined by the examples/getting_started/simple_web_query/configs/config.yml configuration file, whose contents are shown below.

examples/getting_started/simple_web_query/configs/config.yml:
```yaml
functions:
  webpage_query:
    _type: webpage_query
    webpage_url: https://docs.smith.langchain.com
    description: "Search for information about LangSmith. For any questions about LangSmith, you must use this tool!"
    embedder_name: nv-embedqa-e5-v5
    chunk_size: 512
  current_datetime:
    _type: current_datetime

llms:
  nim_llm:
    _type: nim
    model_name: meta/llama-3.1-70b-instruct
    temperature: 0.0

embedders:
  nv-embedqa-e5-v5:
    _type: nim
    model_name: nvidia/nv-embedqa-e5-v5

workflow:
  _type: react_agent
  tool_names: [webpage_query, current_datetime]
  llm_name: nim_llm
  verbose: true
  parse_agent_response_max_retries: 3
```
The workflow contains two tools: one that queries the LangSmith User Guide, and another that returns the current date and time. It also contains two models: an embedding model and an LLM. After running the workflow, you can query it for information about LangSmith. This tutorial demonstrates how to customize this workflow.
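For example, you can query the workflow from the command line:

```bash
nat run --config_file examples/getting_started/simple_web_query/configs/config.yml \
  --input "What is LangSmith?"
```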
Each workflow contains several configuration parameters that can be modified to customize the workflow. While copying and modifying the file is possible, it is not always necessary, as some parameters can be overridden using the --override flag.
Examining the examples/getting_started/simple_web_query/configs/config.yml file, the llms section is as follows:
```yaml
llms:
  nim_llm:
    _type: nim
    model_name: meta/llama-3.1-70b-instruct
    temperature: 0.0
```
To override the temperature parameter for the nim_llm, the following command can be used:
```bash
nat run --config_file examples/getting_started/simple_web_query/configs/config.yml --input "What is LangSmith?" \
  --override llms.nim_llm.temperature 0.7
```
When successful, the output contains the following line:
```
nat.cli.cli_utils.config_override - INFO - Successfully set override for llms.nim_llm.temperature with value: 0.7
```
The --override flag can be specified multiple times, allowing multiple parameters to be overridden. For example, the llama-3.1-70b-instruct model can be replaced with the llama-3.3-70b-instruct model using:
```bash
nat run --config_file examples/getting_started/simple_web_query/configs/config.yml --input "What is LangSmith?" \
  --override llms.nim_llm.temperature 0.7 \
  --override llms.nim_llm.model_name meta/llama-3.3-70b-instruct
```
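Assuming the same dotted-path syntax applies to entries in the functions section (the examples above only override the llms section), a parameter such as the chunk_size of the webpage_query tool could be overridden in the same way:

```bash
# Sketch: override a tool parameter from the functions section of the config
nat run --config_file examples/getting_started/simple_web_query/configs/config.yml --input "What is LangSmith?" \
  --override functions.webpage_query.chunk_size 256
```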
Note
Not all parameters are specified in the workflow YAML. For each tool, there are potentially multiple optional parameters with default values that can be overridden. The nat info components command can be used to list all available parameters. In this case, to list all available parameters for the LLM nim type, run:
```bash
nat info components -t llm_provider -q nim
```
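For changes that go beyond individual parameter overrides, you can copy the configuration file and edit it directly, as noted above. As a minimal sketch, the copied file's webpage_query entry could be pointed at a different page; the URL and description below are placeholders, not part of the shipped example:

```yaml
functions:
  webpage_query:
    _type: webpage_query
    # Placeholder URL: substitute the page you want the tool to search.
    webpage_url: https://example.com/docs
    # Placeholder description: tell the agent when to use this tool.
    description: "Search for information about the Example project."
    embedder_name: nv-embedqa-e5-v5
    chunk_size: 512
  current_datetime:
    _type: current_datetime
```

Pass the modified file to nat run with the --config_file flag, as in the commands above.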