Class MultiInferenceFILMessage

Base Type

class MultiInferenceFILMessage : public morpheus::DerivedMultiMessage<MultiInferenceFILMessage, MultiInferenceMessage>

A more strongly typed version of MultiInferenceMessage that is used for FIL workloads. Helps ensure the proper inputs are set and eases debugging.

Public Functions

MultiInferenceFILMessage(std::shared_ptr<MessageMeta> meta, TensorIndex mess_offset = 0, TensorIndex mess_count = -1, std::shared_ptr<TensorMemory> memory = nullptr, TensorIndex offset = 0, TensorIndex count = -1, std::string id_tensor_name = "seq_ids")

Construct a new Multi Inference FIL Message object.

Parameters
  • meta – Holds a data table, in practice a cudf DataFrame, with the ability to return both Python and C++ representations of the table

  • mess_offset – Offset into the metadata batch

  • mess_count – Number of messages in the metadata batch

  • memory – Holds the generic tensor data in cupy arrays that will be used for inference stages

  • offset – Message offset in inference memory object

  • count – Message count in inference memory object

  • id_tensor_name – Name of the tensor that correlates tensor rows to message IDs
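
A minimal construction sketch (not taken from this documentation) is shown below. The make_fil_message helper, the include paths, and the reading of the -1 defaults as “use the full remaining range” are assumptions; only the constructor signature above is documented.

#include <memory>

#include "morpheus/messages/meta.hpp"                  // assumed header location
#include "morpheus/messages/memory/tensor_memory.hpp"  // assumed header location
#include "morpheus/messages/multi_inference_fil.hpp"   // assumed header location

// `meta` and `memory` are produced upstream, e.g. from a cudf DataFrame and a
// block of device tensors containing "input__0" and "seq_ids".
std::shared_ptr<morpheus::MultiInferenceFILMessage> make_fil_message(
    std::shared_ptr<morpheus::MessageMeta> meta,
    std::shared_ptr<morpheus::TensorMemory> memory)
{
    return std::make_shared<morpheus::MultiInferenceFILMessage>(
        std::move(meta),
        /*mess_offset=*/0,
        /*mess_count=*/-1,  // default: assumed to span the remaining rows
        std::move(memory),
        /*offset=*/0,
        /*count=*/-1,       // default: assumed to span the remaining tensor rows
        /*id_tensor_name=*/"seq_ids");
}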

const TensorObject get_input__0() const

Returns the ‘input__0’ tensor; throws a std::runtime_error if it does not exist.

Throws

std::runtime_error – If no tensor named “input__0” exists

Returns

const TensorObject

void set_input__0(const TensorObject &input__0)

Sets a tensor named ‘input__0’.

Parameters

input__0 – The tensor to store under the name ‘input__0’
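
The sketch below (an illustration, not part of the documented API) stores a tensor under ‘input__0’ and reads it back, catching the documented std::runtime_error for the case where the tensor was never set. The roundtrip_input helper and the include paths are assumptions.

#include <iostream>
#include <stdexcept>

#include "morpheus/messages/multi_inference_fil.hpp"  // assumed header location
#include "morpheus/objects/tensor_object.hpp"         // assumed header location

// Store `features` under the name "input__0", then read it back.
void roundtrip_input(morpheus::MultiInferenceFILMessage& msg,
                     const morpheus::TensorObject& features)
{
    msg.set_input__0(features);

    try
    {
        const morpheus::TensorObject input = msg.get_input__0();
        // ... hand `input` to the FIL inference call ...
    }
    catch (const std::runtime_error& err)
    {
        // Reached only if no tensor named "input__0" has been set.
        std::cerr << "missing input__0 tensor: " << err.what() << '\n';
    }
}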

const TensorObject get_seq_ids() const

Returns the ‘seq_ids’ tensor; throws a std::runtime_error if it does not exist.

Throws

std::runtime_error – If no tensor named “seq_ids” exists

Returns

const TensorObject

void set_seq_ids(const TensorObject &seq_ids)

Sets a tensor named ‘seq_ids’.

Parameters

seq_ids – The tensor to store under the name ‘seq_ids’
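
The seq_ids accessors mirror the input__0 pair; a minimal sketch, reusing the assumed includes from the previous example (the attach_seq_ids helper is hypothetical):

// Attach the row-to-message ID mapping and read it back.
void attach_seq_ids(morpheus::MultiInferenceFILMessage& msg,
                    const morpheus::TensorObject& seq_ids)
{
    msg.set_seq_ids(seq_ids);  // stored under the name "seq_ids"

    // Throws std::runtime_error if no "seq_ids" tensor is present.
    const morpheus::TensorObject ids = msg.get_seq_ids();
    (void)ids;  // e.g. used downstream to map per-row results back to DataFrame rows
}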
