
Struct MultiInferenceMessageInterfaceProxy

Base Type

  • public morpheus::MultiTensorMessageInterfaceProxy

Derived Types

  • morpheus::MultiInferenceFILMessageInterfaceProxy

  • morpheus::MultiInferenceNLPMessageInterfaceProxy

struct MultiInferenceMessageInterfaceProxy : public morpheus::MultiTensorMessageInterfaceProxy

Interface proxy, used to insulate the Python bindings.

Subclassed by morpheus::MultiInferenceFILMessageInterfaceProxy, morpheus::MultiInferenceNLPMessageInterfaceProxy

Public Static Functions

static std::shared_ptr<MultiInferenceMessage> init(std::shared_ptr<MessageMeta> meta, TensorIndex mess_offset, TensorIndex mess_count, std::shared_ptr<TensorMemory> memory, TensorIndex offset, TensorIndex count, std::string id_tensor_name)

Create and initialize a MultiInferenceMessage object, and return a shared pointer to the result.

Parameters
  • meta – Holds a data table, in practice a cudf DataFrame, with the ability to return both Python and C++ representations of the table

  • mess_offset – Offset into the metadata batch

  • mess_count – Number of messages

  • memory – Holds the generic tensor data in cupy arrays that will be used for inference stages

  • offset – Message offset in inference memory instance

  • count – Message count in inference memory instance

  • id_tensor_name – Name of the tensor that correlates tensor rows to message IDs

Returns

std::shared_ptr<MultiInferenceMessage>
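
Because this proxy exists to back the Python bindings, its init function is what Python code ultimately calls when constructing a MultiInferenceMessage. Below is a minimal, hypothetical sketch (not the actual Morpheus binding code) of how such a proxy is typically registered with pybind11; the module name and header path are assumptions, and the real binding also registers base classes and may supply argument defaults.

// A minimal, hypothetical sketch of registering the proxy with pybind11.
// The module name "example_messages" and the header path are assumptions;
// the actual Morpheus binding also registers the message's base classes
// and may provide defaults (e.g. for id_tensor_name).
#include <memory>

#include <pybind11/pybind11.h>

#include "morpheus/messages/multi_inference.hpp"  // assumed header path

namespace py = pybind11;
using namespace morpheus;

PYBIND11_MODULE(example_messages, m)
{
    py::class_<MultiInferenceMessage, std::shared_ptr<MultiInferenceMessage>>(m, "MultiInferenceMessage")
        // The proxy's static init() acts as the Python-visible constructor,
        // so Python code never calls the C++ constructor directly.
        .def(py::init(&MultiInferenceMessageInterfaceProxy::init),
             py::arg("meta"),
             py::arg("mess_offset"),
             py::arg("mess_count"),
             py::arg("memory"),
             py::arg("offset"),
             py::arg("count"),
             py::arg("id_tensor_name"));
}

From Python, the bound class can then be constructed with the same keyword arguments: meta, mess_offset, mess_count, memory, offset, count, and id_tensor_name.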
