
Class MultiInferenceMessage

Base Type

class MultiInferenceMessage : public morpheus::DerivedMultiMessage<MultiInferenceMessage, MultiTensorMessage>

This is a container class that holds a pointer to an instance of the TensorMemory container along with metadata describing the data contained within it. It builds on top of the MultiMessage and MultiTensorMessage classes to add additional data for inferencing.

Public Functions

MultiInferenceMessage(const MultiInferenceMessage &other) = default

Default copy constructor.

MultiInferenceMessage(std::shared_ptr<MessageMeta> meta, TensorIndex mess_offset = 0, TensorIndex mess_count = -1, std::shared_ptr<TensorMemory> memory = nullptr, TensorIndex offset = 0, TensorIndex count = -1, std::string id_tensor_name = "seq_ids")

Construct a new Multi Inference Message object.

Parameters
  • meta – Holds a data table, in practice a cudf DataFrame, with the ability to return both Python and C++ representations of the table

  • mess_offset – Offset into the metadata batch

  • mess_count – Messages count

  • memory – Holds the generic tensor data in cupy arrays that will be used for inference stages

  • offset – Message offset in inference memory instance

  • count – Message count in inference memory instance

  • id_tensor_name – Name of the tensor that correlates tensor rows to message IDs

const TensorObject get_input(const std::string &name) const

Returns the input tensor for the given name.

Parameters

name – Name of the input tensor to retrieve

Throws

std::runtime_error – If no tensor matching name exists

Returns

const TensorObject

TensorObject get_input(const std::string &name)

Returns the input tensor for the given name.

Parameters

name – Name of the input tensor to retrieve

Throws

std::runtime_error – If no tensor matching name exists

Returns

TensorObject

void set_input(const std::string &name, const TensorObject &value)

Update the value of an input tensor. The tensor must already exist; otherwise this will halt on a fatal error.

© Copyright 2023, NVIDIA. Last updated on Apr 11, 2023.