Class MultiInferenceFILMessage

Base Type

class MultiInferenceFILMessage : public morpheus::MultiInferenceMessage

A more strongly typed version of MultiInferenceMessage that is used for FIL (Forest Inference Library) workloads. Helps ensure the proper inputs are set and eases debugging.

Public Functions

MultiInferenceFILMessage(std::shared_ptr<morpheus::MessageMeta> meta, size_t mess_offset, size_t mess_count, std::shared_ptr<morpheus::InferenceMemory> memory, size_t offset, size_t count)

Construct a new Multi Inference FIL Message object.

Parameters
  • meta – Holds a data table, in practice a cudf DataFrame, with the ability to return both Python and C++ representations of the table

  • mess_offset – Offset into the metadata batch

  • mess_count – Number of messages in the metadata batch

  • memory – Holds the generic tensor data in CuPy arrays that will be used for inference stages

  • offset – Offset into the inference memory object

  • count – Number of messages in the inference memory object

const TensorObject get_input__0() const

Returns the ‘input__0’ tensor; throws a std::runtime_error if it does not exist.

Returns

The ‘input__0’ tensor as a const TensorObject

void set_input__0(const TensorObject &input__0)

Sets a tensor named ‘input__0’.

Parameters

input__0 – Tensor to store under the name ‘input__0’
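The typed accessors above follow a common pattern: a named getter forwards to a generic lookup and throws std::runtime_error when the tensor is absent, while the matching setter stores under a fixed name. A self-contained sketch of that pattern (FilMessageLike, Tensor, and TensorMap are illustrative stand-ins, not the real Morpheus types):

```cpp
#include <map>
#include <stdexcept>
#include <string>
#include <vector>

// Illustrative stand-ins for the real TensorObject and tensor storage.
using Tensor    = std::vector<float>;
using TensorMap = std::map<std::string, Tensor>;

class FilMessageLike
{
  public:
    // Typed accessors: fixed names, so callers cannot misspell the key.
    const Tensor& get_input__0() const { return get_tensor("input__0"); }
    void set_input__0(const Tensor& t) { m_tensors["input__0"] = t; }

    const Tensor& get_seq_ids() const { return get_tensor("seq_ids"); }
    void set_seq_ids(const Tensor& t) { m_tensors["seq_ids"] = t; }

  private:
    // Generic lookup: throws if the named tensor has not been set.
    const Tensor& get_tensor(const std::string& name) const
    {
        auto it = m_tensors.find(name);
        if (it == m_tensors.end())
            throw std::runtime_error("Tensor '" + name + "' does not exist");
        return it->second;
    }

    TensorMap m_tensors;
};
```

Calling get_input__0() before set_input__0() raises the runtime error, which is what makes the typed wrappers easier to debug than a raw map lookup returning a default value.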

const TensorObject get_seq_ids() const

Returns the ‘seq_ids’ tensor; throws a std::runtime_error if it does not exist.

Returns

The ‘seq_ids’ tensor as a const TensorObject

void set_seq_ids(const TensorObject &seq_ids)

Sets a tensor named ‘seq_ids’.

Parameters

seq_ids – Tensor to store under the name ‘seq_ids’

© Copyright 2023, NVIDIA. Last updated on Feb 3, 2023.