morpheus.stages.inference.triton_inference_stage.TritonInferenceStage

class TritonInferenceStage(c, model_name, server_url, force_convert_inputs=False, use_shared_memory=False)[source]

Bases: morpheus.stages.inference.inference_stage.InferenceStage

Perform inference with Triton Inference Server.

This class specifies which inference implementation category (e.g., NLP or FIL) is needed for inferencing.

Parameters
c : morpheus.config.Config

Pipeline configuration instance.

model_name : str

Name of the model that handles the inference requests sent to the Triton inference server.

server_url : str

Triton server URL.

force_convert_inputs : bool, default = False

Instructs the stage to convert incoming data to the format that Triton expects. If set to False, data will only be converted when the conversion would not result in data loss.

use_shared_memory : bool, default = False, is_flag = True

Whether or not to use CUDA Shared IPC Memory for transferring data to Triton. Using CUDA IPC reduces network transfer time but requires that Morpheus and Triton are located on the same machine.
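The lossless-conversion rule behind `force_convert_inputs` can be pictured with NumPy's casting check: widening casts preserve every value, while narrowing casts may not. This is an illustrative sketch only, not Morpheus internals, and the helper name is hypothetical:

```python
import numpy as np

# Illustrative sketch (not Morpheus code): the "convert only when no data
# is lost" rule behaves like NumPy's "safe" casting check.
def conversion_is_lossless(src_dtype, dst_dtype) -> bool:
    """True when every value of src_dtype is representable in dst_dtype."""
    return np.can_cast(src_dtype, dst_dtype, casting="safe")

# Widening float32 -> float64 loses nothing; narrowing float64 -> float32 can.
print(conversion_is_lossless(np.float32, np.float64))  # True
print(conversion_is_lossless(np.float64, np.float32))  # False
```

With `force_convert_inputs=True`, the stage would perform the conversion even in the narrowing case; with the default `False`, only the safe direction is converted automatically.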

Attributes
has_multi_input_ports

Indicates if this stage has multiple input ports.

has_multi_output_ports

Indicates if this stage has multiple output ports.

input_ports

Input ports to this stage.

is_built

Indicates if this stage has been built.

name

The name of the stage.

output_ports

Output ports from this stage.

unique_name

Unique name of stage.

Methods

accepted_types()

Accepted input types to this stage.

build(builder[, do_propagate])

Build this stage.

can_build([check_ports])

Determines if all inputs have been built allowing this node to be built.

get_all_input_stages()

Get all input stages to this stage.

get_all_inputs()

Get all input senders to this stage.

get_all_output_stages()

Get all output stages from this stage.

get_all_outputs()

Get all output receivers from this stage.

join()

Waits for all inference worker threads to join.

on_start()

This function can be overridden to add use-case-specific implementation at the start of any stage in the pipeline.

start_async()

This function is called along with on_start during stage initialization.

stop()

Stops the inference workers and closes the inference queue.

supports_cpp_node()

Specifies whether this Stage is capable of creating C++ nodes.

_build(builder, in_ports_streams)[source]

This function is responsible for constructing this stage’s internal mrc.SegmentObject object. The input of this function contains the returned value from the upstream stage.

The input values are the mrc.Builder for this stage and a StreamPair tuple which contains the input mrc.SegmentObject object and the message data type.

Parameters
builder : mrc.Builder

mrc.Builder object for the pipeline. This should be used to construct/attach the internal mrc.SegmentObject.

in_ports_streams : typing.List[morpheus.pipeline.pipeline.StreamPair]

List of tuples containing the input mrc.SegmentObject object and the message data type.

Returns
typing.List[morpheus.pipeline.pipeline.StreamPair]

List of tuples containing the output mrc.SegmentObject object from this stage and the message data type.

_get_inference_worker(inf_queue)[source]

Returns the main inference worker, which may manage requests in another thread depending on the mode in which the pipeline is currently operating.

Parameters
inf_queue : morpheus.utils.producer_consumer_queue.ProducerConsumerQueue

Inference request queue.

Returns
InferenceWorker

Inference worker implementation for stage.

accepted_types()[source]

Accepted input types to this stage.

Returns
typing.Tuple

Tuple of input types.

build(builder, do_propagate=True)[source]

Build this stage.

Parameters
builder : mrc.Builder

MRC segment for this stage.

do_propagate : bool, optional

Whether to propagate to build output stages, by default True.

can_build(check_ports=False)[source]

Determines if all inputs have been built allowing this node to be built.

Parameters
check_ports : bool, optional

Check if we can build based on the input ports, by default False.

Returns
bool

True if we can build, False otherwise.

get_all_input_stages()[source]

Get all input stages to this stage.

Returns
typing.List[morpheus.pipeline.pipeline.StreamWrapper]

All input stages.

get_all_inputs()[source]

Get all input senders to this stage.

Returns
typing.List[morpheus.pipeline.pipeline.Sender]

All input senders.

get_all_output_stages()[source]

Get all output stages from this stage.

Returns
typing.List[morpheus.pipeline.pipeline.StreamWrapper]

All output stages.

get_all_outputs()[source]

Get all output receivers from this stage.

Returns
typing.List[morpheus.pipeline.pipeline.Receiver]

All output receivers.

property has_multi_input_ports: bool

Indicates if this stage has multiple input ports.

Returns
bool

True if stage has multiple input ports, False otherwise.

property has_multi_output_ports: bool

Indicates if this stage has multiple output ports.

Returns
bool

True if stage has multiple output ports, False otherwise.

property input_ports: List[morpheus.pipeline.receiver.Receiver]

Input ports to this stage.

Returns
typing.List[morpheus.pipeline.pipeline.Receiver]

Input ports to this stage.

property is_built: bool

Indicates if this stage has been built.

Returns
bool

True if stage is built, False otherwise.

async join()[source]

Waits for all inference worker threads to join.

property name: str

The name of the stage. Used in logging. Each derived class should override this property with a unique name.

Returns
str

Name of the stage.

on_start()[source]

This function can be overridden to add use-case-specific implementation at the start of any stage in the pipeline.

property output_ports: List[morpheus.pipeline.sender.Sender]

Output ports from this stage.

Returns
typing.List[morpheus.pipeline.pipeline.Sender]

Output ports from this stage.

async start_async()[source]

This function is called along with on_start during stage initialization. Allows stages to utilize the asyncio loop if needed.

stop()[source]

Stops the inference workers and closes the inference queue.

supports_cpp_node()[source]

Specifies whether this Stage is capable of creating C++ nodes. During the build phase, this value will be combined with CppConfig.get_should_use_cpp() to determine whether or not a C++ node is created. This is an instance method to allow runtime decisions and derived classes to override base implementations.

property unique_name: str

Unique name of stage. Generated by appending stage id to stage name.

Returns
str

Unique name of stage.

© Copyright 2023, NVIDIA. Last updated on Feb 3, 2023.