TensorRT Inference Server Release 18.07 Beta

The NVIDIA container image of the TensorRT Inference Server, release 18.07, is available as a beta release.

Contents of the TensorRT Inference Server

This container image contains the TensorRT Inference Server executable in /opt/inference_server.

The container also includes the following:

Driver Requirements

Release 18.07 is based on CUDA 9, which requires NVIDIA Driver release 384.xx.
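As a quick sanity check before pulling the container, you can confirm that the installed driver meets the 384.xx requirement. The sketch below is an illustration, not part of the official release notes: the `nvidia-smi` query shown in the comment is the usual way to read the driver version on a GPU host, and the sample value assigned to `driver_version` is a placeholder you would replace with that query's output.

```shell
# On a GPU host, query the installed driver version with nvidia-smi:
#   driver_version=$(nvidia-smi --query-gpu=driver_version --format=csv,noheader | head -n1)
# Sample value for illustration; replace with the query above on a real system.
driver_version="384.145"

# Release 18.07 is based on CUDA 9, which requires driver release 384.xx.
major=${driver_version%%.*}
if [ "$major" -ge 384 ]; then
  echo "driver $driver_version satisfies the 384.xx requirement"
else
  echo "driver $driver_version is too old for CUDA 9 (requires 384.xx)"
fi
```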

Key Features and Enhancements

This TensorRT Inference Server release includes the following key features and enhancements.

Known Issues

There are no known issues in this release.