Defined in File multi_inference.hpp
-
struct MultiInferenceMessageInterfaceProxy
Interface proxy used to insulate Python bindings.
Public Static Functions
Create and initialize a MultiInferenceMessage object, and return a shared pointer to the result.
- Parameters
meta – Holds a data table, in practice a cudf DataFrame, with the ability to return both Python and C++ representations of the table
mess_offset – Offset into the metadata batch
mess_count – Number of messages in the metadata batch
memory – Holds the generic tensor data in cupy arrays that will be used for inference stages
offset – Message offset in inference memory instance
count – Message count in inference memory instance
- Returns
std::shared_ptr<MultiInferenceMessage>
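The two offset/count pairs are easy to confuse: mess_offset/mess_count window into the metadata table, while offset/count window into the tensor memory. The following is a minimal Python sketch using a hypothetical stand-in class (not the Morpheus API) purely to illustrate the roles of these parameters:

```python
# Hypothetical stand-in for MultiInferenceMessage. This illustrates only the
# meaning of the init() parameters, not the real Morpheus implementation.
class FakeMultiInferenceMessage:
    def __init__(self, meta, mess_offset, mess_count, memory, offset, count):
        self.meta = meta                # full metadata table (here: a list of rows)
        self.mess_offset = mess_offset  # first metadata row owned by this message
        self.mess_count = mess_count    # number of metadata rows owned
        self.memory = memory            # tensor storage (here: dict of name -> rows)
        self.offset = offset            # first tensor row owned by this message
        self.count = count              # number of tensor rows owned

    def metadata_rows(self):
        # The slice of the metadata table this message refers to
        return self.meta[self.mess_offset:self.mess_offset + self.mess_count]

meta = ["row0", "row1", "row2", "row3"]
memory = {"input_ids": [[1], [2], [3], [4]]}
msg = FakeMultiInferenceMessage(meta, 1, 2, memory, 1, 2)
```

Here the message owns metadata rows 1 and 2 and the matching tensor rows; the underlying table and tensor memory are shared, not copied.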
-
static std::shared_ptr<morpheus::InferenceMemory> memory(MultiInferenceMessage &self)
Get a shared pointer to the inference memory object.
- Parameters
self –
- Returns
std::shared_ptr<morpheus::InferenceMemory>
-
static std::size_t offset(MultiInferenceMessage &self)
Get message offset.
- Parameters
self –
- Returns
std::size_t
-
static std::size_t count(MultiInferenceMessage &self)
Get the message count.
- Parameters
self –
- Returns
std::size_t
-
static pybind11::object get_input(MultiInferenceMessage &self, const std::string &name)
Get the tensor named by name as a Python object, throwing a std::runtime_error if it does not exist.
- Parameters
self –
name – Name of the tensor to retrieve
- Returns
pybind11::object
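The lookup-and-throw behaviour can be sketched in plain Python. This is a hypothetical stand-in (including the error message), not the Morpheus binding itself; on the C++ side a missing tensor raises std::runtime_error, which pybind11 surfaces in Python as a RuntimeError:

```python
def get_input(tensors, name):
    """Return the tensor stored under `name`.

    Mirrors the documented C++ behaviour of throwing std::runtime_error
    (seen as RuntimeError from Python) when the tensor is missing.
    """
    if name not in tensors:
        # Hypothetical error message, for illustration only
        raise RuntimeError(f"Tensor '{name}' not found")
    return tensors[name]
```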
-
static std::shared_ptr<MultiInferenceMessage> get_slice(MultiInferenceMessage &self, std::size_t start, std::size_t stop)
Get a shared pointer to a slice of the batch based on the supplied offsets. Automatically calculates the correct mess_offset and mess_count.
- Parameters
self –
start – Start offset of the slice
stop – Stop offset of the slice
- Returns
std::shared_ptr<MultiInferenceMessage>
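Assuming a 1:1 mapping between tensor rows and metadata rows (the general Morpheus case resolves mess_offset/mess_count through sequence IDs, which this sketch omits), the offset arithmetic behind get_slice looks roughly like:

```python
def slice_offsets(offset, count, mess_offset, start, stop):
    """Compute the window of a sliced message, assuming each tensor row maps
    1:1 to a metadata row. `start`/`stop` are relative to this message.

    Hypothetical sketch of the arithmetic only; the real get_slice returns a
    new MultiInferenceMessage sharing the same memory and metadata.
    """
    if not (0 <= start <= stop <= count):
        raise ValueError("slice out of range")
    new_offset = offset + start            # tensor window start
    new_count = stop - start               # tensor window length
    new_mess_offset = mess_offset + start  # metadata window start
    new_mess_count = stop - start          # metadata window length
    return new_offset, new_count, new_mess_offset, new_mess_count
```

For example, slicing rows 2..5 of a message whose tensor window starts at row 4 and whose metadata window starts at row 10 yields a tensor window starting at row 6 and a metadata window starting at row 12, each 3 rows long.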