Class ManagerInfer
- Defined in File infer_manager.hpp

class ManagerInfer
- Manager class for inference.

Public Functions

ManagerInfer()
- Default Constructor.

~ManagerInfer()
- Destructor.

InferStatus set_inference_params(std::shared_ptr<InferenceSpecs> &inference_specs)
- Create inference settings and memory.
- Parameters
  - inference_specs – specifications for inference
- Returns
  - InferStatus with appropriate code and message
 
InferStatus execute_inference(std::shared_ptr<InferenceSpecs> &inference_specs, cudaStream_t cuda_stream)
- Prepares and launches single/multiple inference.
- The provided CUDA stream is used to prepare the input data and will be used to operate on the output data; any execution of CUDA work should be in sync with this stream.
- Parameters
  - inference_specs – specifications for inference
  - cuda_stream – CUDA stream
- Returns
  - InferStatus with appropriate code and message
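
The usual flow is to create the settings once with set_inference_params and then call execute_inference per frame or batch on a dedicated CUDA stream. The sketch below illustrates that flow under a few assumptions: the holoinfer header is reachable as shown, namespace qualifiers are omitted as they are on this page, and the hypothetical build_specs() helper stands in for whatever application code populates InferenceSpecs (model paths, backend choice, pre-allocated input/output DataBuffers).

```cpp
#include <memory>

#include <cuda_runtime.h>

#include "infer_manager.hpp"

// Hypothetical helper (not part of ManagerInfer): populates an InferenceSpecs
// with model paths, backend selection and pre-allocated input/output buffers.
std::shared_ptr<InferenceSpecs> build_specs();

int main() {
  auto specs = build_specs();

  ManagerInfer manager;

  // Create inference settings and memory from the specifications.
  InferStatus status = manager.set_inference_params(specs);
  // InferStatus carries a code and a message; error handling is omitted here.

  // Any CUDA work that produces the input data or consumes the output data
  // must be synchronized with this stream.
  cudaStream_t stream;
  cudaStreamCreate(&stream);

  // Prepare and launch inference for all configured models.
  status = manager.execute_inference(specs, stream);

  // Wait for inference (and any CUDA work queued on the stream) to finish
  // before reading the output buffers.
  cudaStreamSynchronize(stream);
  cudaStreamDestroy(stream);

  // Release the per-model inference contexts.
  manager.cleanup();
  return 0;
}
```
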
InferStatus run_core_inference(const std::string &model_name, const DataMap &permodel_preprocess_data, const DataMap &permodel_output_data, cudaStream_t cuda_stream)
- Executes core inference for a particular model and generates inferred data.
- The provided CUDA stream is used to prepare the input data and will be used to operate on the output data; any execution of CUDA work should be in sync with this stream.
- Parameters
  - model_name – Input model to do the inference on
  - permodel_preprocess_data – Input DataMap with model name as key and DataBuffer as value
  - permodel_output_data – Output DataMap with tensor name as key and DataBuffer as value
  - cuda_stream – CUDA stream
- Returns
  - InferStatus with appropriate code and message
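
When only one of the configured models needs to run, run_core_inference targets it directly by name. The fragment below continues the sketch above and is illustrative only: prepare_input_map() and prepare_output_map() are hypothetical placeholders for the application's own pre- and post-processing stages, and "model_1" is a placeholder model name.

```cpp
// Hypothetical placeholders for the application's own data preparation.
DataMap prepare_input_map();   // model name  -> input DataBuffer
DataMap prepare_output_map();  // tensor name -> output DataBuffer

void infer_single_model(ManagerInfer &manager, cudaStream_t stream) {
  DataMap inputs = prepare_input_map();
  DataMap outputs = prepare_output_map();

  // Run inference for one model only; all CUDA work touching these buffers
  // must stay in sync with `stream`.
  InferStatus status =
      manager.run_core_inference("model_1", inputs, outputs, stream);

  // Synchronize before reading the output buffers on the host.
  cudaStreamSynchronize(stream);
}
```
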
void cleanup()
- Cleans up internal context per model.

DimType get_input_dimensions() const
- Get input dimension per model.
- Returns
  - Map with model name as key and dimension as value
 
DimType get_output_dimensions() const
- Get output dimension per tensor.
- Returns
  - Map with tensor name as key and dimension as value
 
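Once set_inference_params has created the per-model contexts, the two getters above expose the input and output dimensions. A short sketch of inspecting them follows; it assumes only what the descriptions state, namely that DimType is an associative map keyed by model or tensor name. The exact mapped dimension type should be checked in infer_manager.hpp.

```cpp
#include <iostream>

// Assumes `manager` was already configured via set_inference_params() and that
// DimType is an associative container keyed by model/tensor name, as the
// descriptions above state; the mapped dimension type is not spelled out here.
void report_dimensions(const ManagerInfer &manager) {
  const DimType input_dims = manager.get_input_dimensions();
  const DimType output_dims = manager.get_output_dimensions();

  std::cout << "Models with input dimensions: " << input_dims.size() << "\n";
  for (const auto &entry : input_dims) {
    std::cout << "  model: " << entry.first << "\n";  // dimensions in entry.second
  }

  std::cout << "Tensors with output dimensions: " << output_dims.size() << "\n";
  for (const auto &entry : output_dims) {
    std::cout << "  tensor: " << entry.first << "\n";  // dimensions in entry.second
  }
}
```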