Class MultiInferenceMessage

class MultiInferenceMessage : public morpheus::DerivedMultiMessage<MultiInferenceMessage, MultiTensorMessage>

This is a container class that holds a pointer to an instance of the InferenceMemory container and the metadata of the data contained within it. Builds on top of the MultiMessage and MultiTensorMessage classes to add additional data needed for inferencing.

Subclassed by morpheus::MultiInferenceFILMessage, morpheus::MultiInferenceNLPMessage

Public Functions

MultiInferenceMessage(const MultiInferenceMessage &other) = default

Default copy constructor.

MultiInferenceMessage(std::shared_ptr<morpheus::MessageMeta> meta, std::size_t mess_offset, std::size_t mess_count, std::shared_ptr<morpheus::InferenceMemory> memory, std::size_t offset, std::size_t count)

Construct a new Multi Inference Message object.

Parameters
  • meta – Holds a data table, in practice a cudf DataFrame, with the ability to return both Python and C++ representations of the table

  • mess_offset – Offset into the metadata batch

  • mess_count – Number of messages in the metadata batch

  • memory – Holds the generic tensor data in cupy arrays that will be used for inference stages

  • offset – Message offset in inference memory instance

  • count – Message count in inference memory instance
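The two offset/count pairs describe windows into shared data: mess_offset/mess_count select rows of the DataFrame held by meta, while offset/count select rows of the shared tensor memory, so many messages can view slices of one buffer without copying. A minimal self-contained sketch of this slicing pattern (the SharedBuffer and SliceView names are illustrative stand-ins, not the Morpheus API):

```cpp
#include <cstddef>
#include <memory>
#include <utility>
#include <vector>

// Hypothetical stand-in for a shared data buffer. In Morpheus these roles
// are played by MessageMeta (rows of a cudf DataFrame) and InferenceMemory
// (cupy tensors); this sketch only models the offset/count slicing.
struct SharedBuffer
{
    std::vector<float> rows;
};

// A message-like view: it does not copy data, it records an offset and a
// count into a buffer shared by many messages, mirroring how
// MultiInferenceMessage stores {meta, mess_offset, mess_count} and
// {memory, offset, count}.
class SliceView
{
  public:
    SliceView(std::shared_ptr<SharedBuffer> buf, std::size_t offset, std::size_t count) :
      m_buf(std::move(buf)),
      m_offset(offset),
      m_count(count)
    {}

    std::size_t count() const
    {
        return m_count;
    }

    // Row i of this view, i.e. row (offset + i) of the shared buffer.
    float row(std::size_t i) const
    {
        return m_buf->rows.at(m_offset + i);
    }

  private:
    std::shared_ptr<SharedBuffer> m_buf;
    std::size_t m_offset;
    std::size_t m_count;
};
```

Two views built over the same buffer with different offsets see disjoint slices while sharing storage, which is the point of the design.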

const TensorObject get_input(const std::string &name) const

Returns the input tensor for the given name. Will halt on a fatal error if the tensor does not exist.

Parameters

name – Name of the input tensor to retrieve

Returns

const TensorObject

TensorObject get_input(const std::string &name)

Returns the input tensor for the given name. Will halt on a fatal error if the tensor does not exist.

Parameters

name – Name of the input tensor to retrieve

Returns

TensorObject

void set_input(const std::string &name, const TensorObject &value)

Update the value of an input tensor. The tensor must already exist, otherwise this will halt on a fatal error.
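Both get_input and set_input use name-based lookup with halt-on-missing semantics: a missing tensor is a fatal error, not an exception or a silently created entry. A self-contained sketch of that contract, using a std::map and std::abort in place of the real TensorObject and fatal-logging machinery (all names here are illustrative, not the Morpheus API):

```cpp
#include <cstdlib>
#include <iostream>
#include <map>
#include <string>
#include <utility>
#include <vector>

// Illustrative stand-in for TensorObject; the real type wraps device memory.
using FakeTensor = std::vector<float>;

class InputTable
{
  public:
    // Mirrors get_input: returns the tensor for `name`, halting on a fatal
    // error (here: abort) if no tensor with that name exists.
    const FakeTensor& get_input(const std::string& name) const
    {
        auto it = m_tensors.find(name);
        if (it == m_tensors.end())
        {
            std::cerr << "Fatal: no input tensor named '" << name << "'\n";
            std::abort();
        }
        return it->second;
    }

    // Mirrors set_input: updates an existing tensor only. Creating a new
    // entry is deliberately not allowed, matching the documented contract.
    void set_input(const std::string& name, const FakeTensor& value)
    {
        auto it = m_tensors.find(name);
        if (it == m_tensors.end())
        {
            std::cerr << "Fatal: cannot set unknown tensor '" << name << "'\n";
            std::abort();
        }
        it->second = value;
    }

    // Helper for building the table up front; the real class receives its
    // tensors via the InferenceMemory passed to the constructor.
    void add(const std::string& name, FakeTensor value)
    {
        m_tensors[name] = std::move(value);
    }

  private:
    std::map<std::string, FakeTensor> m_tensors;
};
```

The fail-fast behavior means a misspelled tensor name surfaces immediately at the access site rather than propagating a null or empty tensor into downstream inference stages.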

© Copyright 2023, NVIDIA. Last updated on Feb 3, 2023.