morpheus.stages.output.http_client_sink_stage.HttpClientSinkStage#

class HttpClientSinkStage(
    c,
    base_url,
    endpoint,
    static_endpoint=True,
    headers=None,
    query_params=None,
    method=HTTPMethod.POST,
    error_sleep_time=0.1,
    respect_retry_after_header=True,
    request_timeout_secs=30,
    accept_status_codes=(HTTPStatus.OK, HTTPStatus.CREATED, HTTPStatus.ACCEPTED),
    max_retries=10,
    max_rows_per_payload=10000,
    lines=False,
    df_to_request_kwargs_fn=None,
    **request_kwargs,
)
Bases: `GpuAndCpuMixin`, `PassThruTypeMixin`, `SinglePortStage`

Write all messages to an HTTP endpoint.
- Parameters:
  - c : morpheus.config.Config
    Pipeline configuration instance.
  - base_url : str
    Server base URL; should include the intended protocol (e.g. http:// or https://) and the port if necessary. It may also include a base path to which `endpoint` will be appended. Examples: "https://nvcr.io/", "http://localhost:8080/base/path"
  - endpoint : str
    Endpoint to which messages will be sent. This is appended to `base_url` and may include a query string. The primary difference between `endpoint` and `base_url` is that `endpoint` may contain named format strings when `static_endpoint` is `False`, and thus can differ for each request. Format strings are replaced with the corresponding column value from the first row of the incoming dataframe; if no such column exists, a `ValueError` is raised. When `endpoint` contains a query string, the values of that query string can likewise differ for each request. When `query_params` is not `None`, the values in `query_params` are appended to the query string. This can result in duplicate keys in the query string; some servers transform duplicate keys into an array of values (e.g. "?t=1&t=2" => "t=[1,2]"), while others do not.
    Note: when `max_rows_per_payload=1`, a separate request is produced for each row in the dataframe, potentially using a unique endpoint for each request. If further customization is required, `df_to_request_kwargs_fn` can be used to customize the request.
    Examples: "api/v1/endpoint", "api/v1/endpoint?time={timestamp}&id={id}", "/{model_name}/{user}?time={timestamp}"
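The substitution of named format strings in a non-static `endpoint` can be illustrated with plain Python. This is a sketch of the documented behavior, not Morpheus code; the stage reads the values from the first row of the incoming dataframe, and the row below is a hypothetical stand-in:

```python
# Plain-Python illustration of endpoint format-string substitution: the
# stage fills each {name} from the first row of the incoming dataframe.
# The dict below is a hypothetical stand-in for that first row.
first_row = {"model_name": "fraud", "user": "alice", "timestamp": 1700000000}

endpoint = "/{model_name}/{user}?time={timestamp}"
resolved = endpoint.format(**first_row)
# A missing column would raise KeyError here; the stage raises ValueError.
print(resolved)  # /fraud/alice?time=1700000000
```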
  - static_endpoint : bool, default True
    Setting this to `True` indicates that the value of `endpoint` does not change between requests, which enables an optimization.
  - headers : dict, optional
    Optional set of headers to include in the request. If `None`, the header value is inferred from `lines`: `{"Content-Type": "text/plain"}` when `lines` is `True`, `{"Content-Type": "application/json"}` when `lines` is `False`.
  - query_params : dict, optional
    Optional set of query parameters to include in the request.
  - method : morpheus.utils.http_utils.HTTPMethod, optional
    HTTP method to use when sending messages, by default "POST". Currently only "POST", "PUT" and "PATCH" are supported; values are matched case-insensitively.
  - error_sleep_time : float, optional
    Amount of time in seconds to sleep after the client receives an error. The client performs an exponential backoff starting at `error_sleep_time`. Setting this to 0 causes the client to retry the request as fast as possible. If the server sets a `Retry-After` header and `respect_retry_after_header` is `True`, that value takes precedence over `error_sleep_time`.
  - respect_retry_after_header : bool, optional
    If `True`, the client respects the `Retry-After` header when the server sets it. If `False`, the client performs an exponential backoff starting at `error_sleep_time`.
  - request_timeout_secs : int, optional
    Number of seconds to wait for the server to send data before giving up and raising an exception.
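The interaction between `error_sleep_time` and `respect_retry_after_header` can be sketched as follows. This is an illustrative model of the documented behavior, not the library's actual retry code:

```python
from typing import Optional

def retry_delay(attempt: int,
                error_sleep_time: float,
                retry_after: Optional[float] = None,
                respect_retry_after_header: bool = True) -> float:
    """Illustrative model: seconds to sleep before retry number `attempt`."""
    # A server-provided Retry-After value takes precedence when respected.
    if respect_retry_after_header and retry_after is not None:
        return retry_after
    # Otherwise: exponential backoff starting at error_sleep_time.
    # error_sleep_time == 0 means retrying as fast as possible.
    return error_sleep_time * (2 ** attempt)
```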
  - accept_status_codes : typing.Iterable[int], optional
    List of acceptable status codes, by default (200, 201, 202).
  - max_retries : int, default 10
    Maximum number of times to retry when the request fails, receives a redirect, or returns a status in the `retry_status_codes` list. Setting this to 0 disables retries; setting it to a negative number raises a `ValueError`.
  - max_rows_per_payload : int, optional
    Maximum number of rows to include in a single payload, by default 10000. Setting this to 1 will send each row as a separate request.
  - lines : bool, default False
    If `False`, dataframes are serialized to a JSON array of objects. If `True`, dataframes are serialized to a string of JSON objects separated by end-of-line characters (JSON Lines).
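The two serialization shapes controlled by `lines` can be reproduced with the standard library. This is a sketch of the documented formats, not the stage's internal serializer; the rows below are hypothetical:

```python
import json

rows = [{"id": 1, "msg": "a"}, {"id": 2, "msg": "b"}]

# lines=False: a single JSON array of objects (Content-Type: application/json)
as_array = json.dumps(rows)

# lines=True: one JSON object per line, i.e. JSON Lines (Content-Type: text/plain)
as_lines = "\n".join(json.dumps(row) for row in rows)

print(as_array)  # [{"id": 1, "msg": "a"}, {"id": 2, "msg": "b"}]
print(as_lines)
```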
  - df_to_request_kwargs_fn : typing.Callable[[str, str, DataFrameType], dict], optional
    Optional function to perform additional customization of the request. This function is called once per outgoing DataFrame (batched according to `max_rows_per_payload`) before the request is sent, with the following arguments:
    * base_url : str
    * endpoint : str
    * df : DataFrameType
    The function should return a dict containing any keyword arguments expected by the `requests.Session.request` function: https://requests.readthedocs.io/en/v2.9.1/api/#requests.Session.request
    Specifically, this function is responsible for serializing the DataFrame to either a POST/PUT body or a query string. It may also return a value for `url`, overriding the values of `endpoint` and `base_url`, even when `static_endpoint` is `True`.
  - **request_kwargs : dict
    Additional arguments to pass to the `requests.Session.request` function. When `df_to_request_kwargs_fn` is not `None`, these values are potentially overridden by its results; otherwise the value of `data` will be overwritten, as will `url` when `static_endpoint` is `False`.
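As an example of the callback's shape, the hypothetical function below serializes the batch to a JSON POST body and adds a custom header. The minimal `FakeDataFrame` class is only a stand-in for the real DataFrame's `to_dict` method, and the header name is invented for illustration:

```python
import json

class FakeDataFrame:
    """Hypothetical stand-in exposing the to_dict(orient="records") shape."""
    def __init__(self, records):
        self._records = records

    def to_dict(self, orient="records"):
        return list(self._records)

def df_to_request_kwargs(base_url: str, endpoint: str, df) -> dict:
    # Return keyword arguments for requests.Session.request: here a JSON
    # body plus a hypothetical per-batch row-count header.
    records = df.to_dict(orient="records")
    return {
        "data": json.dumps(records),
        "headers": {"Content-Type": "application/json",
                    "X-Row-Count": str(len(records))},
    }

kwargs = df_to_request_kwargs("http://localhost:8080", "api/v1/endpoint",
                              FakeDataFrame([{"id": 1}, {"id": 2}]))
```

Returning a `url` key from such a function would override `base_url` and `endpoint`, as described above.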
- Attributes:
  - df_type_str
    Returns the DataFrame module that should be used for the given execution mode.
  - has_multi_input_ports
    Indicates if this stage has multiple input ports.
  - has_multi_output_ports
    Indicates if this stage has multiple output ports.
  - input_ports
    Input ports to this stage.
  - is_built
    Indicates if this stage has been built.
  - is_pre_built
    Indicates if this stage has been pre-built.
  - name
    Unique name of the stage.
  - output_ports
    Output ports from this stage.
  - unique_name
    Unique name of stage.
Methods

- accepted_types()
  Returns accepted input types for this stage.
- build(builder[, do_propagate])
  Build this stage.
- can_build([check_ports])
  Determines if all inputs have been built allowing this node to be built.
- can_pre_build([check_ports])
  Determines if all inputs have been built allowing this node to be built.
- get_all_input_stages()
  Get all input stages to this stage.
- get_all_inputs()
  Get all input senders to this stage.
- get_all_output_stages()
  Get all output stages from this stage.
- get_all_outputs()
  Get all output receivers from this stage.
- get_df_class()
  Returns the DataFrame class that should be used for the given execution mode.
- get_df_pkg()
  Returns the DataFrame package that should be used for the given execution mode.
- get_needed_columns()
  Stages which need to have columns inserted into the dataframe should populate the `self._needed_columns` dictionary with a mapping of column names to `morpheus.common.TypeId`.
- join()
  Awaitable method that stages can implement to perform cleanup steps when the pipeline is stopped.
- start_async()
  This function is called along with on_start during stage initialization.
- stop()
  Stages can implement this to perform cleanup steps when the pipeline is stopped.
- supported_execution_modes()
  Returns a tuple of supported execution modes of this stage.
- supports_cpp_node()
  Indicates whether this stage supports CPP nodes.
- compute_schema(schema)
  Compute the schema for this stage based on the incoming schema from upstream stages.
- _build(builder, input_nodes)[source]#
  This function is responsible for constructing this stage's internal `mrc.SegmentObject`. The input of this function contains the returned value from the upstream stage. The input values are the `mrc.Builder` for this stage and a list of parent nodes.
  - Parameters:
    - builder : mrc.Builder
      `mrc.Builder` object for the pipeline. This should be used to construct/attach the internal `mrc.SegmentObject`.
    - input_nodes : list[mrc.SegmentObject]
      List containing the input `mrc.SegmentObject` objects.
  - Returns:
    - list[mrc.SegmentObject]
      List containing the output `mrc.SegmentObject` objects from this stage.
- accepted_types()[source]#
  Returns accepted input types for this stage.
  - Returns:
    - typing.Tuple(morpheus.pipeline.messages.MessageMeta, )
      Accepted input types.
- build(builder, do_propagate=True)[source]#
  Build this stage.
  - Parameters:
    - builder : mrc.Builder
      MRC segment for this stage.
    - do_propagate : bool, optional
      Whether to propagate to build output stages, by default True.
- can_build(check_ports=False)[source]#
Determines if all inputs have been built allowing this node to be built.
- Parameters:
    - check_ports : bool, optional
Check if we can build based on the input ports, by default False.
- Returns:
- bool
True if we can build, False otherwise.
- can_pre_build(check_ports=False)[source]#
Determines if all inputs have been built allowing this node to be built.
- Parameters:
    - check_ports : bool, optional
Check if we can build based on the input ports, by default False.
- Returns:
- bool
True if we can build, False otherwise.
- compute_schema(schema)[source]#
  Compute the schema for this stage based on the incoming schema from upstream stages.
  Incoming schema and type information from upstream stages is available via the `schema.input_schemas` and `schema.input_types` properties.
  Derived classes need to override this method and can set the output type(s) on `schema` by calling `set_type` for all output ports. For example, a simple pass-thru stage might perform the following:

  >>> for (port_idx, port_schema) in enumerate(schema.input_schemas):
  ...     schema.output_schemas[port_idx].set_type(port_schema.get_type())
  >>>

  If the port types in `upstream_schema` are incompatible, the stage should raise a `RuntimeError`.
- property df_type_str: Literal['cudf', 'pandas']#
Returns the DataFrame module that should be used for the given execution mode.
- get_all_input_stages()[source]#
Get all input stages to this stage.
- Returns:
    - list[morpheus.pipeline.pipeline.StageBase]
      All input stages.
- get_all_inputs()[source]#
Get all input senders to this stage.
- Returns:
    - list[morpheus.pipeline.pipeline.Sender]
      All input senders.
- get_all_output_stages()[source]#
Get all output stages from this stage.
- Returns:
    - list[morpheus.pipeline.pipeline.StageBase]
      All output stages.
- get_all_outputs()[source]#
Get all output receivers from this stage.
- Returns:
    - list[morpheus.pipeline.pipeline.Receiver]
      All output receivers.
- get_df_class()[source]#
Returns the DataFrame class that should be used for the given execution mode.
- get_df_pkg()[source]#
Returns the DataFrame package that should be used for the given execution mode.
- get_needed_columns()[source]#
  Stages which need to have columns inserted into the dataframe should populate the `self._needed_columns` dictionary with a mapping of column names to `morpheus.common.TypeId`. This will ensure that the columns are allocated and populated with null values.
- property has_multi_input_ports: bool#
Indicates if this stage has multiple input ports.
- Returns:
- bool
True if stage has multiple input ports, False otherwise.
- property has_multi_output_ports: bool#
Indicates if this stage has multiple output ports.
- Returns:
- bool
True if stage has multiple output ports, False otherwise.
- property input_ports: list[Receiver]#
Input ports to this stage.
- Returns:
    - list[morpheus.pipeline.pipeline.Receiver]
      Input ports to this stage.
- property is_built: bool#
Indicates if this stage has been built.
- Returns:
- bool
True if stage is built, False otherwise.
- property is_pre_built: bool#
  Indicates if this stage has been pre-built.
  - Returns:
    - bool
      True if stage is pre-built, False otherwise.
- async join()[source]#
  Awaitable method that stages can implement to perform cleanup steps when the pipeline is stopped. Typically this is called after `stop` during a graceful shutdown, but it may not be called if the pipeline is terminated.
- property output_ports: list[Sender]#
Output ports from this stage.
- Returns:
    - list[morpheus.pipeline.pipeline.Sender]
      Output ports from this stage.