Class InferBase
Defined in File infer.hpp
Derived Types
- public holoscan::inference::OnnxInfer (Class OnnxInfer)
- public holoscan::inference::TorchInfer (Class TorchInfer)
- public holoscan::inference::TrtInfer (Class TrtInfer)
- 
class InferBase
 Base Inference Class.
Subclassed by holoscan::inference::OnnxInfer, holoscan::inference::TorchInfer, holoscan::inference::TrtInfer
Public Functions
- 
virtual ~InferBase() = default
 Default destructor.
 Performs the core inference. The provided CUDA data event (cuda_event_data) is used to prepare the input data; any execution of CUDA work should be synchronized with this event. If the inference uses CUDA, it should record a CUDA event and pass it back via cuda_event_inference.
- Parameters
 input_data – Input DataBuffer
output_buffer – Output DataBuffer, is populated with inferred results
cuda_event_data – CUDA event recorded after data transfer
cuda_event_inference – CUDA event recorded after inference
- Returns
 InferStatus
- 
inline virtual std::vector<std::vector<int64_t>> get_input_dims() const
 Get input data dimensions to the model.
- Returns
 Vector of input dimensions. Each entry is a vector of int64_t giving the shape of one input tensor.
- 
inline virtual std::vector<std::vector<int64_t>> get_output_dims() const
 Get output data dimensions from the model.
- Returns
 Vector of output dimensions. Each entry is a vector of int64_t giving the shape of one output tensor.
- 
inline virtual std::vector<holoinfer_datatype> get_input_datatype() const
 Get input data types from the model.
- Returns
 Vector of input data types, one per input tensor.
- 
inline virtual std::vector<holoinfer_datatype> get_output_datatype() const
 Get output data types from the model.
- Returns
 Vector of output data types, one per output tensor.
- 
inline virtual void cleanup()
 Cleans up resources held by the inference backend.