Struct MultiInferenceNLPMessageInterfaceProxy

Base Type

struct MultiInferenceNLPMessageInterfaceProxy : public morpheus::MultiInferenceMessageInterfaceProxy

Interface proxy, used to insulate Python bindings.

Public Static Functions

static std::shared_ptr<MultiInferenceNLPMessage> init(std::shared_ptr<MessageMeta> meta, TensorIndex mess_offset, TensorIndex mess_count, std::shared_ptr<TensorMemory> memory, TensorIndex offset, TensorIndex count, std::string id_tensor_name)

Create and initialize a MultiInferenceNLPMessage, and return a shared pointer to the result.

Parameters
  • meta – Holds a data table, in practice a cudf DataFrame, with the ability to return both Python and C++ representations of the table

  • mess_offset – Offset into the metadata batch

  • mess_count – Number of messages in the metadata batch

  • memory – Holds the generic tensor data in cupy arrays that will be used for inference stages

  • offset – Message offset in inference memory object

  • count – Message count in inference memory object

  • id_tensor_name – Name of the tensor that correlates tensor rows to message IDs

Returns

std::shared_ptr<MultiInferenceNLPMessage>
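
For illustration, a minimal C++ sketch of building a message through this proxy is shown below. The helper name make_nlp_message is hypothetical, the header paths are assumptions about where these types live, and it assumes meta and memory were created elsewhere, with memory holding “input_ids”, “input_mask”, and “seq_ids” tensors.

#include <memory>

#include "morpheus/messages/memory/tensor_memory.hpp"
#include "morpheus/messages/meta.hpp"
#include "morpheus/messages/multi_inference_nlp.hpp"
#include "morpheus/types.hpp"

// Hypothetical helper: covers the full metadata batch and the full tensor
// memory object, using "seq_ids" to correlate tensor rows to message IDs.
std::shared_ptr<morpheus::MultiInferenceNLPMessage> make_nlp_message(
    std::shared_ptr<morpheus::MessageMeta> meta,
    std::shared_ptr<morpheus::TensorMemory> memory,
    morpheus::TensorIndex mess_count,
    morpheus::TensorIndex tensor_count)
{
    return morpheus::MultiInferenceNLPMessageInterfaceProxy::init(
        std::move(meta),
        /*mess_offset=*/0,
        /*mess_count=*/mess_count,
        std::move(memory),
        /*offset=*/0,
        /*count=*/tensor_count,
        /*id_tensor_name=*/"seq_ids");
}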

static pybind11::object input_ids(MultiInferenceNLPMessage &self)

Get the ‘input_ids’ tensor as a Python object.

Parameters

self – The MultiInferenceNLPMessage instance to read the tensor from

Throws

pybind11::attribute_error – When no tensor named “input_ids” exists.

Returns

pybind11::object
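
A hedged sketch of how a getter like this is typically exposed through pybind11 follows; the module name example_messages is illustrative, and the actual Morpheus binding code may differ.

#include <memory>

#include <pybind11/pybind11.h>

#include "morpheus/messages/multi_inference_nlp.hpp"

namespace py = pybind11;

PYBIND11_MODULE(example_messages, m)
{
    // Exposing the static proxy function as a read-only property makes the
    // tensor readable as `msg.input_ids` in Python; a missing "input_ids"
    // tensor surfaces as an AttributeError.
    py::class_<morpheus::MultiInferenceNLPMessage,
               std::shared_ptr<morpheus::MultiInferenceNLPMessage>>(m, "MultiInferenceNLPMessage")
        .def_property_readonly("input_ids",
                               &morpheus::MultiInferenceNLPMessageInterfaceProxy::input_ids);
}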

static pybind11::object input_mask(MultiInferenceNLPMessage &self)

Get the ‘input_mask’ tensor as a Python object.

Parameters

self – The MultiInferenceNLPMessage instance to read the tensor from

Throws

pybind11::attribute_error – When no tensor named “input_mask” exists.

Returns

pybind11::object

static pybind11::object seq_ids(MultiInferenceNLPMessage &self)

Get the ‘seq_ids’ tensor as a Python object.

Parameters

self – The MultiInferenceNLPMessage instance to read the tensor from

Throws

pybind11::attribute_error – When no tensor named “seq_ids” exists.

Returns

pybind11::object
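
The same pattern applies to all three getters. Below is a minimal sketch of calling them directly from C++; it assumes an initialized Python interpreter with the GIL held (each call returns a pybind11::object, in practice a CuPy array), and the function name inspect_tensors is hypothetical.

#include <pybind11/pybind11.h>

#include "morpheus/messages/multi_inference_nlp.hpp"

namespace py = pybind11;

// Hypothetical inspection helper; each getter throws pybind11::attribute_error
// if the corresponding tensor is absent from the message's tensor memory.
void inspect_tensors(morpheus::MultiInferenceNLPMessage& msg)
{
    py::object input_ids  = morpheus::MultiInferenceNLPMessageInterfaceProxy::input_ids(msg);
    py::object input_mask = morpheus::MultiInferenceNLPMessageInterfaceProxy::input_mask(msg);
    py::object seq_ids    = morpheus::MultiInferenceNLPMessageInterfaceProxy::seq_ids(msg);

    py::print(input_ids.attr("shape"), input_mask.attr("shape"), seq_ids.attr("shape"));
}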
