Defined in File multi_inference_nlp.hpp
-
struct MultiInferenceNLPMessageInterfaceProxy
Interface proxy, used to insulate python bindings.
Public Static Functions
Create and initialize a MultiInferenceNLPMessage, and return a shared pointer to the result.
- Parameters
meta – Holds a data table, in practice a cudf DataFrame, with the ability to return both Python and C++ representations of the table
mess_offset – Offset into the metadata batch
mess_count – Number of messages
memory – Holds the generic tensor data in cupy arrays that will be used for inference stages
offset – Message offset in inference memory object
count – Message count in inference memory object
- Returns
std::shared_ptr<MultiInferenceNLPMessage>
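As a rough illustration of how this factory might be invoked from C++, here is a minimal sketch. The factory name `init`, the parameter types (`MessageMeta`, `InferenceMemoryNLP`, `std::size_t` offsets/counts), and the include path are assumptions; only the parameter order and the return type are documented on this page.

```cpp
#include "multi_inference_nlp.hpp"  // path as given at the top of this page; the installed include path may differ
#include <memory>

// Hypothetical construction of a MultiInferenceNLPMessage through the proxy.
// The factory name `init` and the parameter types are assumptions; only the
// parameter order and the return type are documented here.
std::shared_ptr<morpheus::MultiInferenceNLPMessage> make_message(
    std::shared_ptr<morpheus::MessageMeta> meta,           // holds the cudf DataFrame
    std::size_t mess_offset,                               // offset into the metadata batch
    std::size_t mess_count,                                // number of messages
    std::shared_ptr<morpheus::InferenceMemoryNLP> memory,  // cupy tensors used for inference
    std::size_t offset,                                    // offset into the inference memory object
    std::size_t count)                                     // count in the inference memory object
{
    return morpheus::MultiInferenceNLPMessageInterfaceProxy::init(
        std::move(meta), mess_offset, mess_count, std::move(memory), offset, count);
}
```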
-
static std::shared_ptr<morpheus::InferenceMemory> memory(MultiInferenceNLPMessage &self)
Get a shared pointer to the inference memory object.
- Parameters
self –
- Returns
std::shared_ptr<morpheus::InferenceMemory>
-
static std::size_t offset(MultiInferenceNLPMessage &self)
Get message offset.
- Parameters
self –
- Returns
std::size_t
-
static std::size_t count(MultiInferenceNLPMessage &self)
Get the message count.
- Parameters
self –
- Returns
std::size_t
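The three accessors above can be read together to describe the slice of inference memory a message refers to. The sketch below uses the documented static signatures; placing the proxy and message types in the `morpheus` namespace is an assumption based on the return types shown on this page.

```cpp
#include <cstddef>
#include <memory>

// Query the inference-memory bookkeeping of an existing message through the proxy.
// `msg` is assumed to be a valid MultiInferenceNLPMessage obtained elsewhere.
void inspect(morpheus::MultiInferenceNLPMessage& msg)
{
    std::shared_ptr<morpheus::InferenceMemory> mem =
        morpheus::MultiInferenceNLPMessageInterfaceProxy::memory(msg);

    std::size_t offset = morpheus::MultiInferenceNLPMessageInterfaceProxy::offset(msg);
    std::size_t count  = morpheus::MultiInferenceNLPMessageInterfaceProxy::count(msg);

    // offset/count describe the window into `mem` that this message covers.
    (void)mem;
    (void)offset;
    (void)count;
}
```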
-
static pybind11::object input_ids(MultiInferenceNLPMessage &self)
Get the ‘input_ids’ tensor as a Python object; throws a
std::runtime_error
if it does not exist.
- Parameters
self –
- Returns
pybind11::object
-
static pybind11::object input_mask(MultiInferenceNLPMessage &self)
Get the ‘input_mask’ tensor as a Python object; throws a
std::runtime_error
if it does not exist.
- Parameters
self –
- Returns
pybind11::object
-
static pybind11::object seq_ids(MultiInferenceNLPMessage &self)
Get the ‘seq_ids’ tensor as a Python object; throws a
std::runtime_error
if it does not exist.
- Parameters
self –
- Returns
pybind11::object
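To make the documented failure mode concrete, here is a hedged sketch that calls the three tensor getters and catches the std::runtime_error they throw when a tensor is absent. The `morpheus` namespace qualification and the expectation that the caller holds the Python GIL while touching pybind11::object instances are assumptions, not statements from this page.

```cpp
#include <pybind11/pybind11.h>
#include <stdexcept>

namespace py = pybind11;

// Fetch the NLP input tensors as Python objects; each getter throws
// std::runtime_error when the corresponding tensor does not exist.
void fetch_tensors(morpheus::MultiInferenceNLPMessage& msg)
{
    try
    {
        py::object input_ids  = morpheus::MultiInferenceNLPMessageInterfaceProxy::input_ids(msg);
        py::object input_mask = morpheus::MultiInferenceNLPMessageInterfaceProxy::input_mask(msg);
        py::object seq_ids    = morpheus::MultiInferenceNLPMessageInterfaceProxy::seq_ids(msg);

        // The returned objects wrap the cupy arrays held by the inference memory.
        (void)input_ids;
        (void)input_mask;
        (void)seq_ids;
    }
    catch (const std::runtime_error& err)
    {
        // A missing tensor (e.g. memory constructed without 'input_mask') lands here.
    }
}
```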