Class MultiInferenceNLPMessage

Base Type

class MultiInferenceNLPMessage : public morpheus::MultiInferenceMessage

A more strongly typed version of MultiInferenceMessage, used for NLP workloads. It helps ensure that the proper inputs are set and eases debugging.

Public Functions

MultiInferenceNLPMessage(std::shared_ptr<morpheus::MessageMeta> meta, std::size_t mess_offset, std::size_t mess_count, std::shared_ptr<morpheus::InferenceMemory> memory, std::size_t offset, std::size_t count)

Construct a new Multi Inference NLP Message object.

Parameters
  • meta – Holds a data table, in practice a cuDF DataFrame, with the ability to return both Python and C++ representations of the table

  • mess_offset – Offset into the metadata batch

  • mess_count – Number of messages in the metadata batch

  • memory – Holds the generic tensor data in CuPy arrays that will be used for inference stages

  • offset – Offset into the inference memory object

  • count – Number of messages in the inference memory object
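
The constructor takes two (offset, count) pairs that address different containers: (mess_offset, mess_count) select rows of the metadata table held by meta, while (offset, count) select rows of the tensor memory held by memory. The relationship can be sketched with a minimal self-contained stand-in (ToyBatchView is hypothetical and not part of Morpheus, which uses std::shared_ptr<MessageMeta> and std::shared_ptr<InferenceMemory> instead):

```cpp
#include <cassert>  // for the usage check below
#include <cstddef>
#include <vector>

// Hypothetical stand-in illustrating the two (offset, count) pairs of the
// MultiInferenceNLPMessage constructor. Not part of the Morpheus API.
struct ToyBatchView
{
    std::size_t mess_offset;  // first row of the metadata table in this batch
    std::size_t mess_count;   // number of metadata rows in this batch
    std::size_t offset;       // first row of the inference tensor memory
    std::size_t count;        // number of tensor rows in this batch

    // Rows of the metadata table covered by this view.
    std::vector<std::size_t> metadata_rows() const
    {
        std::vector<std::size_t> rows;
        for (std::size_t i = 0; i < mess_count; ++i)
            rows.push_back(mess_offset + i);
        return rows;
    }

    // Rows of the tensor memory covered by this view.
    std::vector<std::size_t> tensor_rows() const
    {
        std::vector<std::size_t> rows;
        for (std::size_t i = 0; i < count; ++i)
            rows.push_back(offset + i);
        return rows;
    }
};
```

For example, a view with mess_offset = 5, mess_count = 3 covers metadata rows 5..7, independently of which tensor rows its offset and count select.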

const TensorObject get_input_ids() const

Returns the ‘input_ids’ tensor. Throws a std::runtime_error if it does not exist.

Returns

const TensorObject

void set_input_ids(const TensorObject &input_ids)

Sets a tensor named ‘input_ids’.

Parameters

input_ids – the tensor to store as ‘input_ids’

const TensorObject get_input_mask() const

Returns the ‘input_mask’ tensor. Throws a std::runtime_error if it does not exist.

Returns

const TensorObject

void set_input_mask(const TensorObject &input_mask)

Sets a tensor named ‘input_mask’.

Parameters

input_mask – the tensor to store as ‘input_mask’

const TensorObject get_seq_ids() const

Returns the ‘seq_ids’ tensor. Throws a std::runtime_error if it does not exist.

Returns

const TensorObject

void set_seq_ids(const TensorObject &seq_ids)

Sets a tensor named ‘seq_ids’.

Parameters

seq_ids – the tensor to store as ‘seq_ids’
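
The contract shared by these getters and setters, namely that each setter stores a tensor under a fixed key and each getter throws a std::runtime_error when that key is absent, can be sketched with a self-contained stand-in (ToyInferenceMessage and Tensor are hypothetical; the real class stores morpheus::TensorObject values in its inference memory):

```cpp
#include <map>
#include <stdexcept>
#include <string>
#include <vector>

// Hypothetical stand-in for a tensor; the real API uses morpheus::TensorObject.
using Tensor = std::vector<int>;

// Hypothetical analog of the named-tensor accessors on MultiInferenceNLPMessage:
// setters store tensors under fixed keys, getters throw std::runtime_error
// when the requested tensor does not exist.
class ToyInferenceMessage
{
  public:
    void set_input_ids(const Tensor& t)
    {
        m_tensors["input_ids"] = t;
    }

    Tensor get_input_ids() const
    {
        return get_tensor("input_ids");
    }

  private:
    // Shared lookup helper: throws if no tensor was stored under `name`.
    Tensor get_tensor(const std::string& name) const
    {
        auto it = m_tensors.find(name);
        if (it == m_tensors.end())
        {
            throw std::runtime_error("Tensor '" + name + "' does not exist");
        }
        return it->second;
    }

    std::map<std::string, Tensor> m_tensors;
};
```

Calling get_input_ids() before set_input_ids() therefore throws, while a set followed by a get round-trips the stored tensor.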

© Copyright 2023, NVIDIA. Last updated on Feb 3, 2023.