NVIDIA DeepStream SDK API Reference

4.0.2 Release

nvdsinfer_tlt.h File Reference

Detailed Description

NVIDIA DeepStream API for importing Transfer Learning Toolkit encoded models.

Description: This file specifies the API to decode a Transfer Learning Toolkit (TLT) encoded model and create a CUDA engine file from it.

Definition in file nvdsinfer_tlt.h.


Functions

bool NvDsInferCudaEngineGetFromTltModel (nvinfer1::IBuilder *builder, NvDsInferContextInitParams *initParams, nvinfer1::DataType dataType, nvinfer1::ICudaEngine *&cudaEngine)
 API to decode a TLT encoded model and create a CUDA engine file from it.
 

Function Documentation

bool NvDsInferCudaEngineGetFromTltModel ( nvinfer1::IBuilder *builder,
                                          NvDsInferContextInitParams *initParams,
                                          nvinfer1::DataType dataType,
                                          nvinfer1::ICudaEngine *&cudaEngine )

API to decode a TLT encoded model and create a CUDA engine file from it.

This API is an implementation of the NvDsInferCudaEngineGet interface. The correct key and model path must be provided in the tltModelKey and tltEncodedModelFilePath members of initParams. Other parameters applicable to UFF models also apply to TLT encoded models.
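
The following is a minimal usage sketch, not an official DeepStream sample. It assumes the TensorRT builder/logger setup shown here and that tltModelKey and tltEncodedModelFilePath are fixed-size character array members of NvDsInferContextInitParams; the key and file path values are placeholders for illustration.

    #include <cstring>
    #include <iostream>

    #include "NvInfer.h"
    #include "nvdsinfer_context.h"
    #include "nvdsinfer_tlt.h"

    // Simple TensorRT logger required to create an IBuilder instance.
    class Logger : public nvinfer1::ILogger {
        void log(Severity severity, const char *msg) override {
            if (severity <= Severity::kWARNING)
                std::cerr << msg << std::endl;
        }
    };

    int main() {
        Logger logger;
        nvinfer1::IBuilder *builder = nvinfer1::createInferBuilder(logger);

        NvDsInferContextInitParams initParams;
        std::memset(&initParams, 0, sizeof(initParams));

        // Key used to decode the model and path to the TLT encoded model file
        // (placeholder values; assumed to be char-array members).
        std::strncpy(initParams.tltModelKey, "nvidia_tlt",
                     sizeof(initParams.tltModelKey) - 1);
        std::strncpy(initParams.tltEncodedModelFilePath, "model.etlt",
                     sizeof(initParams.tltEncodedModelFilePath) - 1);

        // UFF-style parameters (input dimensions, output layer names,
        // batch size, etc.) also apply to TLT models and would normally
        // be filled in here.

        nvinfer1::ICudaEngine *cudaEngine = nullptr;
        bool ok = NvDsInferCudaEngineGetFromTltModel(
            builder, &initParams, nvinfer1::DataType::kFLOAT, cudaEngine);

        if (!ok || !cudaEngine) {
            std::cerr << "Failed to build engine from TLT model" << std::endl;
            builder->destroy();
            return 1;
        }

        // ... use or serialize cudaEngine ...

        cudaEngine->destroy();
        builder->destroy();
        return 0;
    }

In a typical pipeline, DeepStream invokes this function through the NvDsInferCudaEngineGet interface of a custom model-parsing library rather than by a direct call; the direct call above only illustrates the parameters.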