NVIDIA Deep Learning Triton Inference Server Documentation - Last updated August 3, 2020

NVIDIA Triton Inference Server


Release Notes
The inference server itself is packaged within the Triton Inference Server container. This document walks you through getting up and running with the Triton Inference Server container, from the prerequisites to running the container. The release notes also provide a list of key features, the software packaged in the container, software enhancements and improvements, any known issues, and how to run Triton Inference Server 1.15.0 for the 20.07 and earlier releases. The Triton Inference Server container is released monthly to provide you with the latest NVIDIA deep learning software libraries and GitHub code contributions that have been sent upstream, all of which are tested, tuned, and optimized.

Inference Server


User Guide
This Triton Inference Server User Guide documents the Triton inference server and its benefits. The inference server is included within the inference server container. The guide provides step-by-step instructions for pulling and running the Triton Inference Server container, along with details of the model store and the inference API.
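As a rough sketch of what pulling and running the container involves: the image lives in the NGC registry at nvcr.io, and the server is started with a model store mounted into the container. The image tag, the local model store path, and the port mappings below are illustrative assumptions; the binary name also differs between releases (trtserver for the 1.x server, tritonserver for 2.x), so follow the User Guide for the exact commands matching your release.

```shell
# Pull an example Triton Inference Server container from NGC
# (the 20.07-py3 tag is an assumption; use the tag for your release).
docker pull nvcr.io/nvidia/tritonserver:20.07-py3

# Run the server with a local model store mounted at /models.
# /path/to/model_store is a placeholder; 8000/8001/8002 are the
# conventional HTTP, gRPC, and metrics ports.
docker run --gpus=1 --rm \
  -p 8000:8000 -p 8001:8001 -p 8002:8002 \
  -v /path/to/model_store:/models \
  nvcr.io/nvidia/tritonserver:20.07-py3 \
  tritonserver --model-repository=/models
```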

Licenses


SLA
This document is the Software License Agreement (SLA) for NVIDIA Triton Inference Server. The following contains specific license terms and conditions for NVIDIA Triton Inference Server. By accepting this agreement, you agree to comply with all the terms and conditions applicable to the specific product(s) included herein.
BSD License
This document is the Berkeley Software Distribution (BSD) license for the open source NVIDIA Triton Inference Server. The following contains the specific license terms and conditions for NVIDIA Triton Inference Server. By accepting this agreement, you agree to comply with all the terms and conditions applicable to the specific product(s) included herein.

Archives


Documentation Archives
This Archives document provides access to previously released Triton inference server documentation versions.