Custom Parsers and Chat Templates#
NIM LLM supports vLLM CLI arguments that take file paths — --reasoning-parser-plugin, --tool-parser-plugin, and --chat-template. NIM automatically resolves these paths so you can reference plugin files by bare filename without knowing the container’s internal directory layout.
Built-in Parsers (No Plugin File Required)#
Most users do not need a plugin file. vLLM ships with a large set of built-in parsers that you select by name. Use the plugin arguments only when you have a custom parser that is not in the built-in list.
# Built-in parser - no plugin file needed
nim-serve --reasoning-parser nemotron_v3
# Custom parser - plugin file required
nim-serve --reasoning-parser-plugin my_custom_parser.py --reasoning-parser my_custom
Built-in Reasoning Parsers#
Pass any of these names to --reasoning-parser (no --reasoning-parser-plugin needed):
deepseek_r1, deepseek_v3, ernie45, glm45, granite, holo2, hunyuan_a13b, kimi_k2, minimax_m2, minimax_m2_append_think, mistral, nemotron_v3, olmo3, openai_gptoss, qwen3, seed_oss, step3, step3p5.
For the authoritative list, refer to the vLLM reasoning parser registry.
Built-in Tool-Call Parsers#
Pass any of these names to --tool-call-parser (no --tool-parser-plugin needed):
deepseek_v3, deepseek_v31, ernie45, glm45, glm47, granite, granite-20b-fc, granite4, hermes, hunyuan_a13b, internlm, jamba, kimi_k2, llama3_json, llama4_json, llama4_pythonic, longcat, minimax, minimax_m2, mistral, olmo3, openai, phi4_mini_json, pythonic, qwen3_coder, qwen3_xml, seed_oss, step3, step3p5, xlam.
For the authoritative list, refer to the vLLM tool-call parser registry.
Note
Built-in parser names are vLLM internal identifiers — they may differ from the names shown on a model card. For example, the model card for Nemotron-3 Nano references a nano_v3 parser shipped as an external .py file, but NIM LLM ships a built-in nemotron_v3 parser that works without any plugin file. When in doubt, try the built-in name first.
Providing a Custom Plugin File#
When you do need a custom parser or chat template, NIM LLM accepts the file in three ways. Pick whichever matches your deployment:
Option 1: Place the File in the Model Checkpoint Directory (Auto-Discovered)#
If the model checkpoint includes the plugin file (typical when the model is published with a custom parser), pass the bare filename. NIM LLM finds it automatically.
nim-serve \
--reasoning-parser-plugin nano_v3_reasoning_parser.py \
--reasoning-parser nano_v3
NIM LLM resolves the value to the absolute path inside the checkpoint directory and logs the resolution at INFO level:
INFO: Plugin --reasoning-parser-plugin: resolved 'nano_v3_reasoning_parser.py' to /tmp/nim_abc123/nano_v3_reasoning_parser.py
Option 2: Volume-Mount at an Absolute Path#
Mount your plugin file (or directory) into the container and reference it by absolute path. NIM LLM passes absolute paths through unchanged.
docker run --gpus all \
-v /home/user/plugins:/mnt/plugins:ro \
-p 8000:8000 \
${NIM_LLM_MODEL_FREE_IMAGE}:2.0.4 \
nim-serve \
--reasoning-parser-plugin /mnt/plugins/my_custom_parser.py \
--reasoning-parser my_custom
Option 3: Use NIM_PASSTHROUGH_ARGS (Kubernetes / Helm)#
In environments where CLI arguments are not available, use the NIM_PASSTHROUGH_ARGS environment variable. The same resolution rules apply.
export NIM_PASSTHROUGH_ARGS="--reasoning-parser-plugin nano_v3_reasoning_parser.py --reasoning-parser nano_v3"
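In a Kubernetes pod spec or Helm values file, this typically becomes an env entry. The fragment below is a sketch using standard Kubernetes conventions; the exact location of env settings in a specific NIM Helm chart's values may differ.

```yaml
# Sketch of a container env entry (standard Kubernetes spec conventions,
# not a specific NIM Helm chart schema).
env:
  - name: NIM_PASSTHROUGH_ARGS
    value: "--reasoning-parser-plugin nano_v3_reasoning_parser.py --reasoning-parser nano_v3"
```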
Resolution Order and Precedence#
For each file-path argument, NIM LLM applies the following rules in order:
1. URL-shaped values (containing ://, e.g., https://..., s3://...) pass through to vLLM unchanged. NIM LLM does not attempt remote resolution.
2. Built-in names (no file extension and no path separator, e.g., tool_chat_template_mistral) pass through to vLLM unchanged.
3. Literal path exists (absolute or relative to the working directory): used as-is, converted to an absolute path.
4. Bare filename in the checkpoint directory: if <checkpoint_dir>/<filename> exists, it is used and the resolution is logged.
5. Not found anywhere: NIM LLM exits with a non-zero code and a clear error listing all searched locations.
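The resolution order can be sketched as a small shell function. This is an illustrative approximation, not the actual NIM LLM implementation; resolve_plugin and its arguments are hypothetical.

```shell
# Hypothetical sketch of the resolution order; not the real NIM LLM code.
# Usage: resolve_plugin <value> <checkpoint_dir>
resolve_plugin() {
  value="$1"
  checkpoint_dir="$2"
  case "$value" in
    *://*)                        # 1. URL-shaped: pass through unchanged
      printf '%s\n' "$value"; return 0 ;;
  esac
  case "$value" in
    *.*|*/*) ;;                   # has an extension or path separator: keep going
    *)                            # 2. built-in name: pass through unchanged
      printf '%s\n' "$value"; return 0 ;;
  esac
  if [ -e "$value" ]; then        # 3. literal path: use as-is, absolutized
    realpath -- "$value"; return 0
  fi
  if [ -e "$checkpoint_dir/$value" ]; then
    # 4. bare filename found in the checkpoint directory
    printf '%s\n' "$checkpoint_dir/$value"; return 0
  fi
  # 5. not found anywhere: fail, listing the searched locations
  echo "not found: $value (searched: $PWD, $checkpoint_dir)" >&2
  return 1
}
```

Note that absolute paths fall under rule 3, which is why a literal path always wins over a checkpoint copy of the same filename.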
Conflict: Same Filename in Both the Working Directory and Checkpoint#
If the same filename exists at both the literal path (typically /opt/nim/) and the checkpoint directory, the literal path wins and a warning is logged. This can happen when a derived Dockerfile copies a file into /opt/nim/, or when a Kubernetes ConfigMap is mounted at the working directory.
WARNING: Plugin --reasoning-parser-plugin: resolved 'my_parser.py' to /opt/nim/my_parser.py
(a copy also exists at /tmp/nim_abc123/my_parser.py — use an absolute path to select a specific one)
To force a specific file, pass an absolute path:
nim-serve \
--reasoning-parser-plugin /tmp/nim_abc123/my_parser.py \
--reasoning-parser my_custom
Troubleshooting#
Error: “Plugin file for … not found”#
If you see an error like the following, the plugin file does not exist at any searched location:
ERROR: Plugin file for --reasoning-parser-plugin not found: 'my_parser.py'.
Searched locations:
- /opt/nim/my_parser.py
- /tmp/nim_abc123/my_parser.py
To provide a custom plugin, either:
1. Place the file in the model checkpoint directory
2. Volume-mount it and use an absolute path
3. Use a built-in parser name (no file extension needed)
Verify one of the following:
The file is included in the model checkpoint at the expected path.
The volume mount is correctly configured and uses an absolute path.
A built-in parser name (without the .py extension) covers your model; see the lists above.
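To narrow down which searched location is the problem, a small helper like the one below can report which directories actually contain the file. This is a hypothetical diagnostic, not part of NIM LLM; the directory arguments should be the paths listed in your error message.

```shell
# Hypothetical helper: report which of the searched locations contain the
# plugin file. Pass the directories from your error message as arguments;
# the real checkpoint directory varies per deployment.
check_plugin() {
  name="$1"; shift
  status=1
  for dir in "$@"; do
    if [ -e "$dir/$name" ]; then
      echo "found:   $dir/$name"
      status=0
    else
      echo "missing: $dir/$name"
    fi
  done
  return $status
}
```

For example, run it inside the container with the two paths from the error output: check_plugin my_parser.py /opt/nim <checkpoint_dir>.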
Wrong Plugin File Loaded#
If the WARNING about precedence appears in the logs and you intended to use the checkpoint version, pass an absolute path to the checkpoint file as shown in the conflict section above.