Deployment Guide#
This section describes the ways you can deploy NIM:
Using Docker, as described in Getting Started (see the sketch after this list)
Using a Helm chart, as described in Deploying with Helm
Using Kubernetes, as described in Kubernetes Installation
Using multiple nodes, as described in Multi-node Models
In an air-gapped (offline) environment, as described in Air Gap Deployment (offline cache route) and Air Gap Deployment (local model directory route)
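As a quick orientation for the Docker route, the sketch below shows the general shape of the steps covered in Getting Started: log in to nvcr.io with an NGC API key, then run a NIM container with GPU access, a local model cache, and the service port exposed. The image tag, cache path, and port here are typical examples and assumptions, not values taken from this page; see Getting Started for the exact commands for your model.

```bash
# Minimal sketch of the Docker route described in Getting Started.
# The image tag, cache path, and port are examples, not values from this page.
export NGC_API_KEY="<your NGC API key>"

# Log in to the NVIDIA container registry using the NGC API key.
echo "$NGC_API_KEY" | docker login nvcr.io --username '$oauthtoken' --password-stdin

# Run an example NIM container with GPU access, a persistent model cache, and port 8000 published.
docker run -it --rm --gpus all \
  -e NGC_API_KEY \
  -v "$HOME/.cache/nim:/opt/nim/.cache" \
  -p 8000:8000 \
  nvcr.io/nim/meta/llama3-8b-instruct:latest
```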
Deploying on other platforms#
In addition to the NVIDIA deployment options above, you can deploy NIM on other platforms:
The NIM on Azure Kubernetes Service (AKS) deployment guide provides step-by-step instructions for deploying NIM on AKS.
The NIM on Azure Machine Learning (AzureML) deployment guide provides step-by-step instructions for deploying NIM on AzureML using the Azure CLI and a Jupyter notebook.
The End to End LLM App development with Azure AI Studio, Prompt Flow and NIMs deployment guide walks through building an end-to-end LLM application with Azure AI Studio, Prompt Flow, and NIMs.
The NIM on AWS Elastic Kubernetes Service (EKS) deployment guide provides step-by-step instructions for deploying on AWS EKS.
The NIM on AWS SageMaker deployment guide provides step-by-step instructions for deploying on AWS SageMaker using Jupyter Notebooks, Python CLI, and the shell.
The NIM on KServe deployment guide provides step-by-step instructions for deploying NIM on KServe (a rough sketch follows below).
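The KServe route ultimately comes down to applying an InferenceService that runs the NIM container. The sketch below uses KServe's generic custom-container form; the service name, image, secret name, and GPU count are placeholders and assumptions rather than values from the linked guide, which should be followed for the exact manifest.

```bash
# Rough sketch of a KServe custom-container InferenceService running a NIM image.
# The name, image, secret, and GPU count are placeholders, not values from the linked guide.
kubectl apply -f - <<'EOF'
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: llama3-8b-nim                 # placeholder service name
spec:
  predictor:
    containers:
      - name: kserve-container
        image: nvcr.io/nim/meta/llama3-8b-instruct:latest   # example NIM image
        env:
          - name: NGC_API_KEY
            valueFrom:
              secretKeyRef:
                name: ngc-api-key     # assumes a pre-created secret holding the NGC API key
                key: NGC_API_KEY
        resources:
          limits:
            nvidia.com/gpu: "1"       # assumed single-GPU model
EOF
```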