Class InferRequestedOutput¶
Defined in File common.h
Class Documentation¶
- class InferRequestedOutput¶
  An InferRequestedOutput object is used to describe the requested model output for inference.
Public Functions
- const std::string &Name() const¶
  Gets the name of the associated output tensor.
  - Return
    The name of the tensor.
- size_t ClassificationCount() const¶
  Get the number of classifications requested for this output, or 0 if the output is not being returned as classifications.
- Error SetSharedMemory(const std::string &region_name, const size_t byte_size, const size_t offset = 0)¶
  Set the output tensor data to be written to the specified shared memory region.
  - Return
    Error object indicating success or failure of the request.
  - Parameters
    region_name: The name of the shared memory region.
    byte_size: The size of the data in bytes.
    offset: The offset in the shared memory region. Default value is 0.
- Error UnsetSharedMemory()¶
  Clears the shared memory option set by the last call to InferRequestedOutput::SetSharedMemory().
  After a call to this function the requested output will no longer be returned in a shared memory region.
  - Return
    Error object indicating success or failure of the request.
- bool IsSharedMemory() const¶
  - Return
    true if this output is being returned in shared memory.
- Error SharedMemoryInfo(std::string *name, size_t *byte_size, size_t *offset) const¶
  Get information about the shared memory being used for this output.
  - Return
    Error object indicating success or failure.
  - Parameters
    name: Returns the name of the shared memory region.
    byte_size: Returns the size, in bytes, of the shared memory region.
    offset: Returns the offset within the shared memory region.
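A minimal sketch of the shared-memory workflow, assuming the Triton C++ client library headers (namespace triton::client); the region name "output_data" and the byte size are illustrative placeholders, and the region is assumed to have been registered with the server beforehand (e.g. via the client's shared-memory registration API):

```cpp
#include <iostream>
#include <string>

#include "grpc_client.h"  // Triton C++ client library (assumption)

namespace tc = triton::client;

int main()
{
  tc::InferRequestedOutput* output = nullptr;
  tc::Error err = tc::InferRequestedOutput::Create(&output, "output0");
  if (!err.IsOk()) {
    std::cerr << "create failed: " << err << std::endl;
    return 1;
  }

  // Ask for this output to be written into the pre-registered
  // shared memory region "output_data" (hypothetical name).
  err = output->SetSharedMemory("output_data", 64 * sizeof(float), 0 /* offset */);
  if (!err.IsOk()) {
    std::cerr << "set shared memory failed: " << err << std::endl;
    return 1;
  }

  // Inspect the option that was just set.
  std::string region;
  size_t byte_size = 0, offset = 0;
  if (output->SharedMemoryInfo(&region, &byte_size, &offset).IsOk()) {
    std::cout << "region=" << region << " byte_size=" << byte_size
              << " offset=" << offset << std::endl;
  }

  // Revert to receiving the output in the response body.
  output->UnsetSharedMemory();
  delete output;
  return 0;
}
```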
Public Static Functions
- static Error Create(InferRequestedOutput **infer_output, const std::string &name, const size_t class_count = 0)¶
  Create an InferRequestedOutput instance that describes a model output being requested.
  - Return
    Error object indicating success or failure.
  - Parameters
    infer_output: Returns a new InferRequestedOutput object.
    name: The name of the output being requested.
    class_count: The number of classifications to be requested. The default value is 0, which means classification results are not requested.
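A short usage sketch of Create(), assuming the Triton C++ client library headers (namespace triton::client); the tensor name "fc6_1" is an illustrative placeholder for a model output:

```cpp
#include <iostream>
#include <memory>

#include "grpc_client.h"  // Triton C++ client library (assumption)

namespace tc = triton::client;

int main()
{
  tc::InferRequestedOutput* raw = nullptr;

  // Request the "fc6_1" output, returning the top 3 classification
  // results instead of the raw tensor data.
  tc::Error err = tc::InferRequestedOutput::Create(&raw, "fc6_1", 3 /* class_count */);
  if (!err.IsOk()) {
    std::cerr << "unable to create requested output: " << err << std::endl;
    return 1;
  }
  // Manage the returned object's lifetime with a smart pointer.
  std::unique_ptr<tc::InferRequestedOutput> output(raw);

  // Name() and ClassificationCount() reflect the Create() arguments.
  std::cout << output->Name() << " class_count="
            << output->ClassificationCount() << std::endl;
  return 0;
}
```

The created object is typically appended to the outputs vector passed to the client's Infer() call.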
-
const std::string &