NVIDIA Holoscan SDK v2.5.0

Class ManagerInfer

class ManagerInfer

Manager class for inference.

Public Functions

ManagerInfer()

Default Constructor.

~ManagerInfer()

Destructor.

InferStatus set_inference_params(std::shared_ptr<InferenceSpecs> &inference_specs)

Creates inference settings and allocates memory.

Parameters

inference_specs – specifications for inference

Returns

InferStatus with appropriate code and message
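A minimal setup sketch (not part of the reference): the header path and the holoscan::inference namespace alias are assumptions, and the InferenceSpecs object is assumed to have been populated elsewhere with model paths and backend settings.

#include <memory>
#include <holoinfer.hpp>  // header name is an assumption; use the HoloInfer header from your build

namespace hi = holoscan::inference;  // assumed namespace of ManagerInfer

void setup(std::shared_ptr<hi::InferenceSpecs>& inference_specs) {
  hi::ManagerInfer manager;
  // Allocates per-model inference contexts and memory from the specifications.
  hi::InferStatus status = manager.set_inference_params(inference_specs);
  // Inspect the returned status code and message before proceeding to execution.
}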

InferStatus execute_inference(std::shared_ptr<InferenceSpecs> &inference_specs)

Prepares and launches single or multiple inferences.

Parameters

inference_specs – specifications for inference

Returns

InferStatus with appropriate code and message
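A minimal execution sketch, continuing the setup above: manager and inference_specs are the same assumed objects, and the specs are assumed to already hold pre-processed input data.

// Runs inference for all configured models (single or multiple) in one call.
hi::InferStatus status = manager.execute_inference(inference_specs);
// On success, the inferred results are available through the output data held by the specs.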

InferStatus run_core_inference(const std::string &model_name, DataMap &permodel_preprocess_data, DataMap &permodel_output_data)

Executes core inference for a particular model and generates the inferred data.

Parameters
  • model_name – Name of the model to run inference on

  • permodel_preprocess_data – Input DataMap with model name as key and DataBuffer as value

  • permodel_output_data – Output DataMap with tensor name as key and DataBuffer as value

Returns

InferStatus with appropriate code and message
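A minimal per-model sketch: the model name "model_1" and the two DataMap variables are placeholders; DataMap maps names to DataBuffer objects as described above.

hi::DataMap preprocessed_data;  // model name -> input DataBuffer, filled by a pre-processing step
hi::DataMap output_data;        // tensor name -> output DataBuffer, populated by the call
hi::InferStatus status =
    manager.run_core_inference("model_1", preprocessed_data, output_data);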

void cleanup()

Cleans up the internal inference context for each model.

DimType get_input_dimensions() const

Gets input dimensions per model.

Returns

Map with model name as key and dimension as value

DimType get_output_dimensions() const

Gets output dimensions per tensor.

Returns

Map with tensor name as key and dimension as value
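A minimal query sketch: DimType is treated here only as an iterable map keyed by model name (inputs) or tensor name (outputs), matching the descriptions above; the exact element type follows the SDK definition.

hi::DimType input_dims  = manager.get_input_dimensions();   // model name -> input dimensions
hi::DimType output_dims = manager.get_output_dimensions();  // tensor name -> output dimensions
for (const auto& [model, dims] : input_dims) {
  // e.g. validate that incoming tensors match the expected input shape for this model
}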
