NVIDIA Morpheus (25.02.01)

morpheus.stages.output.http_client_sink_stage.HttpClientSinkStage

class HttpClientSinkStage(c, base_url, endpoint, static_endpoint=True, headers=None, query_params=None, method=HTTPMethod.POST, error_sleep_time=0.1, respect_retry_after_header=True, request_timeout_secs=30, accept_status_codes=(HTTPStatus.OK, HTTPStatus.CREATED, HTTPStatus.ACCEPTED), max_retries=10, max_rows_per_payload=10000, lines=False, df_to_request_kwargs_fn=None, **request_kwargs)[source]

Bases: morpheus.pipeline.execution_mode_mixins.GpuAndCpuMixin, morpheus.pipeline.pass_thru_type_mixin.PassThruTypeMixin, morpheus.pipeline.single_port_stage.SinglePortStage

Write all messages to an HTTP endpoint.

Parameters
c : morpheus.config.Config

Pipeline configuration instance.

base_url : str

Server base URL; should include the intended protocol (e.g. http:// or https://) and the port if necessary. This may or may not include a base path to which endpoint will be appended. Examples:

* “https://nvcr.io/”
* “http://localhost:8080/base/path”

endpoint : str

Endpoint to which messages will be sent. This will be appended to base_url and may include a query string. The primary difference between endpoint and base_url is that endpoint may contain named format strings, when static_endpoint is False, and thus could potentially be different for each request.

Named format fields will be replaced with the corresponding column value from the first row of the incoming dataframe; if no such column exists, a ValueError will be raised. When endpoint contains a query string, this has the potential of allowing the values of the query string to differ for each request. When query_params is not None, the values in query_params will be appended to the query string. This could potentially result in duplicate keys in the query string; some servers support this by transforming duplicate keys into an array of values (ex “?t=1&t=2” => “t=[1,2]”), while others do not.
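The duplicate-key behavior mentioned above can be seen with Python's standard query-string parser (a small illustration; this is not part of the stage itself):

```python
from urllib.parse import parse_qs

# A server that merges duplicate keys sees them as a list of values:
print(parse_qs("t=1&t=2"))  # {'t': ['1', '2']}
```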

Note: When max_rows_per_payload=1, this has the effect of producing a separate request for each row in the dataframe potentially using a unique endpoint for each request.

If additional customizations are required, df_to_request_kwargs_fn can be used to perform additional customizations of the request.

Examples:

* “api/v1/endpoint”
* “api/v1/endpoint?time={timestamp}&id={id}”
* “/{model_name}/{user}?time={timestamp}”
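The format-string substitution described above can be sketched in plain Python (a simplified illustration, not the stage's actual implementation; the row values are hypothetical):

```python
import string

def render_endpoint(endpoint: str, first_row: dict) -> str:
    """Replace named format fields with values from the first row of the
    dataframe, raising ValueError when a referenced column is missing."""
    fields = {name for _, name, _, _ in string.Formatter().parse(endpoint) if name}
    missing = fields - first_row.keys()
    if missing:
        raise ValueError(f"Missing columns for endpoint fields: {sorted(missing)}")
    return endpoint.format(**first_row)

row = {"timestamp": 1700000000, "id": "abc123"}  # hypothetical first row
print(render_endpoint("api/v1/endpoint?time={timestamp}&id={id}", row))
# api/v1/endpoint?time=1700000000&id=abc123
```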

static_endpoint : bool, default True

Setting this to True indicates that the value of endpoint does not change between requests, which enables an optimization.

headers : dict, optional

Optional set of headers to include in the request. If None, the header value will be inferred based on lines:

* {"Content-Type": "text/plain"} when lines is True
* {"Content-Type": "application/json"} when lines is False

query_params : dict, optional

Optional set of query parameters to include in the request.

method : morpheus.utils.http_utils.HTTPMethod, optional, case_sensitive = False

HTTP method to use when sending messages, by default “POST”. Currently only “POST”, “PUT” and “PATCH” are supported.

error_sleep_time : float, optional

Amount of time in seconds to sleep after the client receives an error. The client will perform an exponential backoff starting at error_sleep_time. Setting this to 0 causes the client to retry the request as fast as possible. If the server sets a Retry-After header and respect_retry_after_header is True, then that value will take precedence over error_sleep_time.
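The backoff schedule described above can be sketched as follows (a simplified illustration assuming a doubling schedule; the actual client code may differ):

```python
def backoff_delays(error_sleep_time: float, max_retries: int) -> list[float]:
    """Exponential backoff schedule starting at error_sleep_time,
    doubling after each failed attempt."""
    return [error_sleep_time * (2 ** attempt) for attempt in range(max_retries)]

print(backoff_delays(0.1, 5))  # [0.1, 0.2, 0.4, 0.8, 1.6]
```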

respect_retry_after_header : bool, optional

If True, the client will respect the Retry-After header if it is set by the server. If False, the client will perform an exponential backoff starting at error_sleep_time.

request_timeout_secs : int, optional

Number of seconds to wait for the server to send data before giving up and raising an exception.

accept_status_codes : typing.Iterable[int], optional, multiple = True

List of acceptable status codes, by default (200, 201, 202).

max_retries : int, default 10

Maximum number of times to retry if the request fails, receives a redirect, or returns a status in the retry_status_codes list. Setting this to 0 disables this feature, and setting this to a negative number will raise a ValueError.

max_rows_per_payload : int, optional

Maximum number of rows to include in a single payload, by default 10000. Setting this to 1 will send each row as a separate request.

lines : bool, default False

If False, dataframes will be serialized to a JSON array of objects. If True, the dataframes will be serialized to a string of JSON objects separated by end-of-line characters.
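The two serialization modes can be illustrated with pandas (a sketch of the described behavior; the stage's internal serialization code may differ):

```python
import pandas as pd

df = pd.DataFrame({"id": [1, 2], "name": ["a", "b"]})

# lines=False: a single JSON array of objects
array_body = df.to_json(orient="records")

# lines=True: one JSON object per line (newline-delimited JSON)
lines_body = df.to_json(orient="records", lines=True)

print(array_body)
print(lines_body)
```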

df_to_request_kwargs_fn : typing.Callable[[str, str, DataFrameType], dict], optional

Optional function to perform additional customizations of the request. This function will be called for each DataFrame (according to max_rows_per_payload) before the request is sent. The function will be called with the following arguments:

* base_url : str
* endpoint : str
* df : DataFrameType

The function should return a dict containing any keyword argument expected by the requests.Session.request function: https://requests.readthedocs.io/en/v2.9.1/api/#requests.Session.request

Specifically, this function is responsible for serializing the DataFrame to either a POST/PUT body or a query string. This method has the potential of returning a value for url overriding the value of endpoint and base_url, even when static_endpoint is True.
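A minimal sketch of such a callable (the DataFrame contents and header choice here are illustrative; the only requirement is that the returned dict holds keyword arguments accepted by requests.Session.request):

```python
import pandas as pd

def df_to_request_kwargs(base_url: str, endpoint: str, df: pd.DataFrame) -> dict:
    """Serialize the DataFrame to a JSON body; the returned keys are
    forwarded to requests.Session.request."""
    return {
        "data": df.to_json(orient="records"),
        "headers": {"Content-Type": "application/json"},
    }

df = pd.DataFrame({"id": [1], "score": [0.9]})  # hypothetical payload
kwargs = df_to_request_kwargs("http://localhost:8080", "api/v1/endpoint", df)
print(kwargs["data"])
```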

**request_kwargs : dict

Additional arguments to pass to the requests.Session.request function. These values are potentially overridden by the results of df_to_request_kwargs_fn if it is not None; otherwise the value of data will be overwritten, as will url when static_endpoint is False.

Attributes
df_type_str

Returns the DataFrame module that should be used for the given execution mode.

has_multi_input_ports

Indicates if this stage has multiple input ports.

has_multi_output_ports

Indicates if this stage has multiple output ports.

input_ports

Input ports to this stage.

is_built

Indicates if this stage has been built.

is_pre_built

Indicates if this stage has been pre-built.

name

Unique name of the stage.

output_ports

Output ports from this stage.

unique_name

Unique name of stage.

Methods

accepted_types() Returns accepted input types for this stage.
build(builder[, do_propagate]) Build this stage.
can_build([check_ports]) Determines if all inputs have been built allowing this node to be built.
can_pre_build([check_ports]) Determines if all inputs have been built allowing this node to be built.
get_all_input_stages() Get all input stages to this stage.
get_all_inputs() Get all input senders to this stage.
get_all_output_stages() Get all output stages from this stage.
get_all_outputs() Get all output receivers from this stage.
get_df_class() Returns the DataFrame class that should be used for the given execution mode.
get_df_pkg() Returns the DataFrame package that should be used for the given execution mode.
get_needed_columns() Stages which need to have columns inserted into the dataframe should populate the self._needed_columns dictionary with a mapping of column names to morpheus.common.TypeId.
join() Awaitable method that stages can implement to perform cleanup steps when the pipeline is stopped.
start_async() This function is called along with on_start during stage initialization.
stop() Stages can implement this to perform cleanup steps when pipeline is stopped.
supported_execution_modes() Returns a tuple of supported execution modes of this stage.
supports_cpp_node() Indicates whether this stage supports CPP nodes.

compute_schema

_build(builder, input_nodes)[source]

This function is responsible for constructing this stage’s internal mrc.SegmentObject object. The input of this function contains the returned value from the upstream stage.

The input values are the mrc.Builder for this stage and a list of parent nodes.

Parameters
builder : mrc.Builder

mrc.Builder object for the pipeline. This should be used to construct/attach the internal mrc.SegmentObject.

input_nodes : list[mrc.SegmentObject]

List containing the input mrc.SegmentObject objects.

Returns
list[mrc.SegmentObject]

List containing the output mrc.SegmentObject objects from this stage.

accepted_types()[source]

Returns accepted input types for this stage.

Returns
typing.Tuple(morpheus.pipeline.messages.MessageMeta, )

Accepted input types.

build(builder, do_propagate=True)[source]

Build this stage.

Parameters
builder : mrc.Builder

MRC segment for this stage.

do_propagate : bool, optional

Whether to propagate to build output stages, by default True.

can_build(check_ports=False)[source]

Determines if all inputs have been built allowing this node to be built.

Parameters
check_ports : bool, optional

Check if we can build based on the input ports, by default False.

Returns
bool

True if we can build, False otherwise.

can_pre_build(check_ports=False)[source]

Determines if all inputs have been built allowing this node to be built.

Parameters
check_ports : bool, optional

Check if we can build based on the input ports, by default False.

Returns
bool

True if we can build, False otherwise.

compute_schema(schema)[source]

Compute the schema for this stage based on the incoming schema from upstream stages.

Incoming schema and type information from upstream stages is available via the schema.input_schemas and schema.input_types properties.

Derived classes need to override this method and can set the output type(s) on schema by calling set_type for all output ports. For example, a simple pass-thru stage might perform the following:


>>> for (port_idx, port_schema) in enumerate(schema.input_schemas):
...     schema.output_schemas[port_idx].set_type(port_schema.get_type())

If the port types in upstream_schema are incompatible the stage should raise a RuntimeError.

property df_type_str: Literal['cudf', 'pandas']

Returns the DataFrame module that should be used for the given execution mode.

get_all_input_stages()[source]

Get all input stages to this stage.

Returns
list[morpheus.pipeline.pipeline.StageBase]

All input stages.

get_all_inputs()[source]

Get all input senders to this stage.

Returns
list[morpheus.pipeline.pipeline.Sender]

All input senders.

get_all_output_stages()[source]

Get all output stages from this stage.

Returns
list[morpheus.pipeline.pipeline.StageBase]

All output stages.

get_all_outputs()[source]

Get all output receivers from this stage.

Returns
list[morpheus.pipeline.pipeline.Receiver]

All output receivers.

get_df_class()[source]

Returns the DataFrame class that should be used for the given execution mode.

get_df_pkg()[source]

Returns the DataFrame package that should be used for the given execution mode.

get_needed_columns()[source]

Stages which need to have columns inserted into the dataframe should populate the self._needed_columns dictionary with a mapping of column names to morpheus.common.TypeId. This will ensure that the columns are allocated and populated with null values.

property has_multi_input_ports: bool

Indicates if this stage has multiple input ports.

Returns
bool

True if stage has multiple input ports, False otherwise.

property has_multi_output_ports: bool

Indicates if this stage has multiple output ports.

Returns
bool

True if stage has multiple output ports, False otherwise.

property input_ports: list[morpheus.pipeline.receiver.Receiver]

Input ports to this stage.

Returns
list[morpheus.pipeline.pipeline.Receiver]

Input ports to this stage.

property is_built: bool

Indicates if this stage has been built.

Returns
bool

True if stage is built, False otherwise.

property is_pre_built: bool

Indicates if this stage has been pre-built.

Returns
bool

True if stage has been pre-built, False otherwise.

async join()[source]

Awaitable method that stages can implement to perform cleanup steps when the pipeline is stopped. Typically this is called after stop during a graceful shutdown, but may not be called if the pipeline is terminated.

property name: str

Unique name of the stage.

property output_ports: list[morpheus.pipeline.sender.Sender]

Output ports from this stage.

Returns
list[morpheus.pipeline.pipeline.Sender]

Output ports from this stage.

async start_async()[source]

This function is called along with on_start during stage initialization. Allows stages to utilize the asyncio loop if needed.

stop()[source]

Stages can implement this to perform cleanup steps when pipeline is stopped.

supported_execution_modes()[source]

Returns a tuple of supported execution modes of this stage.

supports_cpp_node()[source]

Indicates whether this stage supports CPP nodes.

property unique_name: str

Unique name of stage. Generated by appending stage id to stage name.

Returns
str

Unique name of stage.

© Copyright 2024, NVIDIA. Last updated on Mar 3, 2025.