Translate Text Programmatically with Python#
Call the NMT NIM gRPC API directly from Python using the nvidia-riva-client library. This enables you to integrate translation into your applications without relying on the sample CLI scripts.
Prerequisites#
- An NMT NIM container deployed and ready. Refer to Deploy and Run NMT NIM.
- The NVIDIA Riva Python client installed.
Basic Translation#
Connect to the NMT NIM and translate a single text.
```python
import riva.client

auth = riva.client.Auth(uri="localhost:50051")
nmt_client = riva.client.NeuralMachineTranslationClient(auth)

response = nmt_client.translate(
    texts=["Machine learning models require GPU acceleration."],
    model="",
    source_language="en-US",
    target_language="de-DE",
)

for translation in response.translations:
    print(translation.text)
```
The model parameter specifies which model to use. Pass an empty string to use the default deployed model. The texts parameter accepts a list, so you can translate multiple strings in one call.
Batch Translation#
Pass multiple texts in a single request for higher throughput.
```python
import riva.client

auth = riva.client.Auth(uri="localhost:50051")
nmt_client = riva.client.NeuralMachineTranslationClient(auth)

texts = [
    "Deploy containers on Kubernetes.",
    "Monitor GPU utilization with nvidia-smi.",
    "Scale inference with Triton Inference Server.",
    "Optimize models with TensorRT.",
]

response = nmt_client.translate(
    texts=texts,
    model="",
    source_language="en-US",
    target_language="fr",
)

for i, translation in enumerate(response.translations):
    print(f"[{i}] {translation.text}")
```
Translation with Custom Dictionary Entries#
Pass dnt_phrases_dict to force specific translations or protect terms. The parameter accepts a Python dictionary where each key is the source term and the value is the target translation. Use an empty string as the value to leave the term untranslated.
```python
import riva.client

auth = riva.client.Auth(uri="localhost:50051")
nmt_client = riva.client.NeuralMachineTranslationClient(auth)

response = nmt_client.translate(
    texts=["The TensorRT engine optimizes inference on NVIDIA GPUs."],
    model="",
    source_language="en-US",
    target_language="ja",
    dnt_phrases_dict={
        "TensorRT": "",
        "NVIDIA": "",
        "inference": "推論",
    },
)

print(response.translations[0].text)
```
“TensorRT” and “NVIDIA” pass through unchanged (empty string value means do not translate). “inference” is forced to “推論” instead of using the model’s default Japanese translation.
This is the same behavior as custom dictionary files, but expressed as a Python dict instead of a file. In dictionary files, TensorRT (no ##) blocks translation, and inference##推論 forces a specific translation. Programmatically, these map to {"TensorRT": "", "inference": "推論"}.
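If you already maintain a dictionary file, its entries can be converted to the dict form with a small helper. The sketch below assumes the file format described above (a bare term blocks translation, `term##translation` forces one); the `load_dnt_dict` helper is an illustration, not part of the Riva client:

```python
def load_dnt_dict(lines):
    """Convert custom-dictionary file lines into a dnt_phrases_dict.

    A bare term (no "##") maps to "" (do not translate);
    "term##translation" forces that translation.
    """
    dnt = {}
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        if "##" in line:
            source, target = line.split("##", 1)
            dnt[source] = target
        else:
            dnt[line] = ""
    return dnt

entries = load_dnt_dict(["TensorRT", "inference##推論"])
print(entries)  # {'TensorRT': '', 'inference': '推論'}
```

The resulting dict can be passed directly as `dnt_phrases_dict` in the `translate()` call.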
Adjust Output Length for Complex Languages#
For morphologically complex target languages (Arabic, Turkish, Finnish) that produce longer output, pass max_len_variation to prevent truncation.
```python
import riva.client

auth = riva.client.Auth(uri="localhost:50051")
nmt_client = riva.client.NeuralMachineTranslationClient(auth)

response = nmt_client.translate(
    texts=["Despite numerous challenges, several countries committed to net-zero by 2050."],
    model="",
    source_language="en-US",
    target_language="ar-AR",
    max_len_variation="150",
)

print(response.translations[0].text)
```
The default is "20" (range: 0-256). This parameter is a string, not an integer.
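Because the value is passed as a string, out-of-range integers are easy to miss until the server rejects them. A small guard can validate before sending; `format_max_len_variation` is a hypothetical convenience helper, not part of the client library:

```python
def format_max_len_variation(value: int) -> str:
    """Validate an integer against the documented 0-256 range and
    return it as the string that translate() expects."""
    if not 0 <= value <= 256:
        raise ValueError(f"max_len_variation must be in [0, 256], got {value}")
    return str(value)

print(format_max_len_variation(150))  # 150
```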
Query Available Language Pairs#
Use get_config() to discover which models and language pairs are available on the running NMT NIM.
```python
import riva.client

auth = riva.client.Auth(uri="localhost:50051")
nmt_client = riva.client.NeuralMachineTranslationClient(auth)

response = nmt_client.get_config(model="")
print(response)
```
Pass an empty string to model to list all available models and their language pairs. Pass a specific model name to retrieve only that model’s languages.
This is the same operation as the --list-models flag in the CLI script:
```shell
python3 python-clients/scripts/nmt/nmt.py --server 0.0.0.0:50051 --list-models
```
Translate from a File#
Read inputs from a file and translate them in batches.
```python
import riva.client

auth = riva.client.Auth(uri="localhost:50051")
nmt_client = riva.client.NeuralMachineTranslationClient(auth)

with open("input_text.txt", "r") as f:
    lines = [line.strip() for line in f if line.strip()]

batch_size = 8
for i in range(0, len(lines), batch_size):
    batch = lines[i : i + batch_size]
    response = nmt_client.translate(
        texts=batch,
        model="",
        source_language="en-US",
        target_language="es-ES",
    )
    for translation in response.translations:
        print(translation.text)
```
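The slicing loop above can be factored into a small generator so the same batching logic works for any list of inputs. This is a generic helper sketch, not a Riva API:

```python
def batched(items, batch_size):
    """Yield consecutive slices of at most batch_size items."""
    for start in range(0, len(items), batch_size):
        yield items[start : start + batch_size]

lines = ["one", "two", "three", "four", "five"]
print([len(b) for b in batched(lines, 2)])  # [2, 2, 1]
```

Each yielded batch can then be passed as the `texts` argument of a `translate()` call.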
Error Handling#
Wrap translation calls with gRPC error handling to catch common issues such as unsupported language codes or server unavailability.
```python
import grpc
import riva.client

auth = riva.client.Auth(uri="localhost:50051")
nmt_client = riva.client.NeuralMachineTranslationClient(auth)

try:
    response = nmt_client.translate(
        texts=["Hello, world!"],
        model="",
        source_language="en-US",
        target_language="de-DE",
    )
    print(response.translations[0].text)
except grpc.RpcError as e:
    print(f"gRPC error: {e.code()} - {e.details()}")
```
Common gRPC status codes:
| Code | Meaning |
|---|---|
| `UNAVAILABLE` | The NMT NIM is not running or not yet ready. Check the health endpoint. |
| `INVALID_ARGUMENT` | Invalid language code or empty text. Refer to supported languages. |
| `ALREADY_EXISTS` | Duplicate request. |
| `RESOURCE_EXHAUSTED` | Batch too large or GPU memory exhausted. Reduce batch size. |
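Transient failures such as a server that is still starting up are often resolved by simply retrying with exponential backoff. The wrapper below is a generic illustration, not something the Riva client provides; in practice `call` would wrap `nmt_client.translate(...)` and `retryable` would be `(grpc.RpcError,)`, ideally filtered to retry-safe codes such as `UNAVAILABLE`:

```python
import time

def retry_with_backoff(call, retries=4, base_delay=0.5, retryable=(Exception,)):
    """Invoke call(); on a retryable exception, wait base_delay * 2**attempt
    seconds and try again, re-raising after the final attempt."""
    for attempt in range(retries):
        try:
            return call()
        except retryable:
            if attempt == retries - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)
```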
Connecting with TLS#
If the NMT NIM is deployed with TLS enabled, pass certificate file paths to riva.client.Auth.
```python
import riva.client

auth = riva.client.Auth(
    uri="localhost:50051",
    use_ssl=True,
    ssl_root_cert="ssl_ca_cert.pem",
    ssl_client_cert="ssl_cert_client.pem",
    ssl_client_key="ssl_key_client.pem",
)
nmt_client = riva.client.NeuralMachineTranslationClient(auth)

response = nmt_client.translate(
    texts=["Secure translation request."],
    model="",
    source_language="en-US",
    target_language="fr",
)

print(response.translations[0].text)
```
The ssl_root_cert, ssl_client_cert, and ssl_client_key parameters accept file paths to PEM-encoded certificate files. For server-side TLS only, you can omit ssl_client_cert and ssl_client_key.
translate() Method Reference#
| Parameter | Type | Required | Description |
|---|---|---|---|
| `texts` | `list[str]` | Yes | Input texts to translate. |
| `model` | `str` | Yes | Model name. Use `""` for the default deployed model. |
| `source_language` | `str` | Yes | Source language code (for example, `en-US`). |
| `target_language` | `str` | Yes | Target language code (for example, `de-DE`). |
| `future` | `bool` | No | If `True`, return a gRPC future instead of blocking on the response. |
| `dnt_phrases_dict` | `dict` | No | Dictionary of source term to target translation entries; an empty string value leaves the term untranslated. |
| `max_len_variation` | `str` | No | Maximum token count difference between source and output (`"0"`–`"256"`, default `"20"`). |
Next Steps#
- Custom Dictionaries: Build domain-specific glossaries for controlled translations.
- NMT API Reference: Full gRPC protobuf specification for `TranslateText` and `ListSupportedLanguagePairs`.
- Deploy and Run NMT NIM: Container deployment and CLI client reference.