TensorRT 3.0 Python API Documentation

TensorRT 3.0.0 includes support for Python, allowing developers to integrate a TensorRT inference engine into a Python development environment or to experiment with TensorRT in an accessible fashion.

TensorRT 3.0.0 includes the Universal Framework Format (UFF), a way to convert models trained in TensorFlow (and soon other frameworks such as Caffe2) into a single model format. TensorRT is now also able to generate engines from UFF models.
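As a rough sketch of this workflow, the snippet below converts a frozen TensorFlow graph to UFF and builds an engine from it. The call names follow the samples shipped with the TensorRT 3.0 Python package, but the file name, input shape, and node names used here are placeholders; consult the bundled samples for the exact API.

    import tensorrt as trt
    import uff
    from tensorrt.parsers import uffparser

    # Convert a frozen TensorFlow graph to UFF
    # ("model.pb" and the output node name "out" are placeholders).
    uff_model = uff.from_tensorflow_frozen_model("model.pb", ["out"])

    # Describe the network inputs and outputs to the UFF parser.
    parser = uffparser.create_uff_parser()
    parser.register_input("input", (1, 28, 28), 0)   # name, CHW shape, input order
    parser.register_output("out")

    # Build a TensorRT engine from the UFF model.
    G_LOGGER = trt.infer.ConsoleLogger(trt.infer.LogSeverity.ERROR)
    engine = trt.utils.uff_to_trt_engine(G_LOGGER, uff_model, parser,
                                         1,        # max batch size
                                         1 << 20)  # max workspace size (bytes)
    parser.destroy()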

Installing TensorRT and the UFF Toolkit

Workflows and Use Cases

There are a number of use cases at which the Python API for TensorRT excels.

We have included several sample applications with the TensorRT Python package that cover many of these use cases.

  1. There is an existing TensorFlow (or other UFF-compatible framework) model that a developer wants to try out with TensorRT.
  2. There is a Caffe1 model that a developer wants to try out with TensorRT (see the sketch after this list).
  3. A developer wants to deploy a TensorRT engine as part of a larger application, such as a web backend.
  4. A developer wants to try out TensorRT with a model trained with a framework that is not currently supported by UFF and was not trained with Caffe1.
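For the Caffe1 case, a minimal sketch along the lines of the bundled samples is shown below. The prototxt/caffemodel paths, output blob name, and build parameters are placeholders, and the exact utility names may differ slightly between releases.

    import tensorrt as trt

    # Logger used during engine construction.
    G_LOGGER = trt.infer.ConsoleLogger(trt.infer.LogSeverity.ERROR)

    # Parse a Caffe deploy file plus trained weights and build an engine in one call.
    # File names, output blob name, and sizes below are placeholders.
    engine = trt.utils.caffe_to_trt_engine(G_LOGGER,
                                           "deploy.prototxt",      # network definition
                                           "model.caffemodel",     # trained weights
                                           1,                      # max batch size
                                           1 << 20,                # max workspace size (bytes)
                                           ["prob"],               # output layer name(s)
                                           trt.infer.DataType.FLOAT)

    # Serialize the engine so it can be deployed later without rebuilding.
    trt.utils.write_engine_to_file("model.engine", engine.serialize())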

Indices and tables