NVIDIA DeepStream SDK API Reference 6.0.1 Release
Defines an API for importing Transfer Learning Toolkit encoded models.
Functions

bool NvDsInferCudaEngineGetFromTltModel (nvinfer1::IBuilder *const builder, nvinfer1::IBuilderConfig *const builderConfig, const NvDsInferContextInitParams *const initParams, nvinfer1::DataType dataType, nvinfer1::ICudaEngine *&cudaEngine)
    Decodes and creates a CUDA engine file from a TLT encoded model.
bool NvDsInferCudaEngineGetFromTltModel ( nvinfer1::IBuilder *const builder,
                                          nvinfer1::IBuilderConfig *const builderConfig,
                                          const NvDsInferContextInitParams *const initParams,
                                          nvinfer1::DataType dataType,
                                          nvinfer1::ICudaEngine *& cudaEngine )
Decodes and creates a CUDA engine file from a TLT encoded model.
This function implements the NvDsInferCudaEngineGet interface. The correct decode key and encoded model path must be provided in the tltModelKey and tltEncodedModelFilePath members of initParams. Other initialization parameters applicable to UFF models also apply to TLT encoded models.
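A minimal sketch of how this function might be wrapped in a custom engine-create callback that matches the NvDsInferCudaEngineGet interface signature. The header name nvdsinfer_custom_impl.h, the wrapper name BuildEngineFromTltModel, and the assumption that tltModelKey and tltEncodedModelFilePath are fixed-size character arrays are illustrative assumptions, not taken from this page.

// Sketch only: wraps the SDK-provided TLT decode/build helper in a custom
// engine-create callback with the same signature.
// Assumption: NvDsInferCudaEngineGetFromTltModel and NvDsInferContextInitParams
// are declared via nvdsinfer_custom_impl.h in this DeepStream release.
#include "nvdsinfer_custom_impl.h"

extern "C" bool BuildEngineFromTltModel(   // hypothetical wrapper name
    nvinfer1::IBuilder *const builder,
    nvinfer1::IBuilderConfig *const builderConfig,
    const NvDsInferContextInitParams *const initParams,
    nvinfer1::DataType dataType,
    nvinfer1::ICudaEngine *&cudaEngine)
{
    // Both the decode key and the encoded model path must be set;
    // without them the TLT model cannot be decoded.
    // Assumption: these members are NUL-terminated char arrays.
    if (initParams->tltModelKey[0] == '\0' ||
        initParams->tltEncodedModelFilePath[0] == '\0') {
        return false;
    }

    // Delegate decoding and engine building to the SDK helper documented above.
    // On success, cudaEngine points to the newly built engine.
    return NvDsInferCudaEngineGetFromTltModel(
        builder, builderConfig, initParams, dataType, cudaEngine);
}

Such a wrapper would typically be compiled into a custom library and referenced from the nvinfer configuration, so that engine creation for the TLT encoded model is delegated to this callback; the exact configuration keys depend on the DeepStream release and are not covered on this page.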