TensorRT Release 20.08

The NVIDIA container image for TensorRT, release 20.08, is available on NGC.

Contents of the TensorRT container

This container includes the following:
  • The TensorRT C++ samples and C++ API documentation. The samples can be built by running make in the /workspace/tensorrt/samples directory. The resulting executables are in the /workspace/tensorrt/bin directory. The C++ API documentation can be found in the /workspace/tensorrt/doc/cpp directory.
  • The TensorRT Python samples and Python API documentation. The Python samples can be found in the /workspace/tensorrt/samples/python directory. Many Python samples can be run using python <script.py> -d /workspace/tensorrt/data. For example:
    python caffe_resnet50.py -d /workspace/tensorrt/data
    The Python API documentation can be found in the /workspace/tensorrt/doc/python directory.
  • TensorRT 7.1.3. Note that the ONNX parser and plugin libraries bundled with this container are built from TensorRT Open Source Software: https://github.com/NVIDIA/TensorRT/releases/tag/20.08

Driver Requirements

Release 20.08 is based on NVIDIA CUDA 11.0.3, which requires NVIDIA Driver release 450 or later. However, if you are running on Tesla (for example, T4 or any other Tesla board), you may use NVIDIA driver release 418.xx or 440.30. The CUDA driver's compatibility package only supports particular drivers. For a complete list of supported drivers, see the CUDA Application Compatibility topic. For more information, see CUDA Compatibility and Upgrades.
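As a minimal sketch of the rule above, the check can be expressed in a few lines of Python. Note that driver_supported is a hypothetical helper written for illustration, not part of TensorRT or the CUDA toolkit, and treating 440.30 as a minimum minor version is an assumption:

```python
# Illustrative sketch of the release 20.08 driver rule described above.
# NOTE: driver_supported() is a hypothetical helper, not an NVIDIA API;
# interpreting "440.30" as a minimum minor version is an assumption.

def driver_supported(version: str, is_tesla: bool = False) -> bool:
    """Check a driver version string (e.g. "450.51") against release 20.08."""
    parts = version.split(".")
    major = int(parts[0])
    minor = int(parts[1]) if len(parts) > 1 else 0
    if major >= 450:               # driver release 450 or later works everywhere
        return True
    if is_tesla:                   # Tesla boards (e.g. T4) may use older drivers
        return major == 418 or (major == 440 and minor >= 30)
    return False
```

For example, driver_supported("418.87", is_tesla=True) passes, while the same driver on a non-Tesla board does not.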

GPU Requirements

Release 20.08 supports CUDA compute capability 3.5 and higher. This corresponds to GPUs in the Kepler, Maxwell, Pascal, Volta, Turing, and Ampere architecture families. For the list of GPUs to which this compute capability corresponds, see CUDA GPUs. For additional support details, see the Deep Learning Frameworks Support Matrix.
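The compute-capability floor can be sketched as a simple tuple comparison. cc_supported is a hypothetical helper for illustration; the per-family example capabilities are representative values (e.g. 7.5 for Turing), not an official mapping:

```python
# Sketch of the compute-capability rule: release 20.08 requires CC >= 3.5.
# cc_supported() is a hypothetical helper; EXAMPLE_CC holds representative
# capabilities for each architecture family named above.

MIN_CC = (3, 5)

EXAMPLE_CC = {
    "Kepler": (3, 5),
    "Maxwell": (5, 0),
    "Pascal": (6, 1),
    "Volta": (7, 0),
    "Turing": (7, 5),
    "Ampere": (8, 0),
}

def cc_supported(major: int, minor: int) -> bool:
    """True if a GPU with this compute capability is supported by 20.08."""
    return (major, minor) >= MIN_CC
```

Every family in EXAMPLE_CC passes the check, while older Kepler parts at CC 3.0 do not.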

Key Features and Enhancements

This TensorRT container release includes the following key features and enhancements.

Announcements

  • Python 2.7 is no longer supported in this TensorRT container release.

Obtaining Missing Data Files

Some samples require data files that are not included in the TensorRT container, either due to licensing restrictions or because the files are too large. Each such sample includes a README.md file in its source directory that explains how to obtain the necessary data files.

Installing Required Python Modules

To run some of the samples, you may first need to install additional Python modules. The following script has been added to the container to install the missing Python modules and their dependencies, if desired: /opt/tensorrt/python/python_setup.sh

Installing Open Source Components

A script has been added to clone, build, and replace the provided plugin, Caffe parser, and ONNX parser libraries with the open source versions based on the 20.08 tag of the official TensorRT open source repository.

To install the open source components inside the container, run the following script:

/opt/tensorrt/install_opensource.sh

For more information, see GitHub: TensorRT 20.08.

Limitations

NVIDIA TensorRT Container Versions

The following table shows what versions of Ubuntu, CUDA, and TensorRT are supported in each of the NVIDIA containers for TensorRT. For older container versions, refer to the Frameworks Support Matrix.

Known Issues

There are no known issues in this release.