morpheus.stages.input.http_client_source_stage.HttpClientSourceStage
- class HttpClientSourceStage(config, url, query_params=None, headers=None, method=HTTPMethod.GET, sleep_time=0.1, error_sleep_time=0.1, respect_retry_after_header=True, request_timeout_secs=30, accept_status_codes=(HTTPStatus.OK,), max_retries=10, lines=False, stop_after=0, payload_to_df_fn=None, message_type=SupportedMessageTypes.MESSAGE_META, task_type=None, task_payload=None, **request_kwargs)[source]
Bases: morpheus.pipeline.execution_mode_mixins.GpuAndCpuMixin, morpheus.pipeline.preallocator_mixin.PreallocatorMixin, morpheus.pipeline.configurable_output_source.ConfigurableOutputSource
Source stage that polls a remote HTTP server for incoming data.
- Parameters
  - config : morpheus.config.Config
    Pipeline configuration instance.
  - url : str
    Remote URL to poll for data, e.g. https://catalog.ngc.nvidia.com/api/collections. This should include the protocol prefix (e.g. "http://", "https://") and port if necessary. If the protocol is omitted, http:// will be used.
  - query_params : dict or callable, default None
    Query parameters to pass to the remote URL. Can be either a dictionary of key-value pairs or a callable that returns a dictionary of key-value pairs. If a callable is provided, it is called with no arguments.
  - headers : dict, default None
    Headers sent with the request.
  - method : morpheus.utils.http_utils.HTTPMethod, optional, case_sensitive = False
    HTTP method to use.
  - sleep_time : float, default 0.1
    Amount of time in seconds to sleep between successive requests. Setting this to 0 disables this feature.
  - error_sleep_time : float, default 0.1
    Amount of time in seconds to sleep after the client receives an error. The client performs an exponential backoff starting at error_sleep_time. Setting this to 0 causes the client to retry the request as fast as possible. If the server sets a Retry-After header and respect_retry_after_header is True, that value takes precedence over error_sleep_time.
  - respect_retry_after_header : bool, default True
    If True, the client will respect the Retry-After header if it is set by the server. If False, the client will perform an exponential backoff starting at error_sleep_time.
  - request_timeout_secs : int, optional
    Number of seconds to wait for the server to send data before giving up and raising an exception.
  - max_errors : int, default 10
    Maximum number of consecutive errors to receive before raising an error.
  - accept_status_codes : typing.Iterable[int], optional, multiple = True
    List of status codes to accept. If the response status code is not in this collection, the request is considered an error.
  - max_retries : int, default 10
    Maximum number of times to retry a request that fails, receives a redirect, or returns a status in the retry_status_codes list. Setting this to 0 disables this feature; setting this to a negative number raises a ValueError.
  - lines : bool, default False
    If False, the response payloads are expected to be a JSON array of objects. If True, the payloads are expected to contain JSON objects separated by end-of-line characters.
  - stop_after : int, default 0
    Stops ingesting after emitting stop_after records (rows in the dataframe). Useful for testing. Disabled if 0.
  - payload_to_df_fn : callable, default None
    A callable that takes the HTTP payload bytes as the first argument and the lines parameter as the second argument, and returns a DataFrame. If unset, cudf.read_json is used in GPU mode and pandas.read_json in CPU mode.
  - message_type : SupportedMessageTypes, case_sensitive = False
    The type of message to emit.
  - task_type : str, default None
    If specified, adds the specified task to the ControlMessage. This parameter is only valid when message_type is set to CONTROL_MESSAGE. If not None, task_payload must also be specified.
  - task_payload : dict, default None
    If specified, adds the specified task to the ControlMessage. This parameter is only valid when message_type is set to CONTROL_MESSAGE. If not None, task_type must also be specified.
  - **request_kwargs : dict
    Additional arguments to pass to the requests.request function.
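Two of the callable parameters above lend themselves to a short sketch. The snippet below is illustrative only, not part of the stage's API: payload_to_df follows the documented payload_to_df_fn signature (payload bytes first, the lines flag second) using pandas as a CPU-mode parser, and next_page_params is a hypothetical query_params callable that pages through results.

```python
import itertools
import json

import pandas as pd


def payload_to_df(payload: bytes, lines: bool) -> pd.DataFrame:
    """Parse an HTTP payload into a DataFrame, honoring the stage's `lines` flag."""
    text = payload.decode("utf-8")
    if lines:
        # JSON-lines mode: one JSON object per line.
        records = [json.loads(line) for line in text.splitlines() if line.strip()]
    else:
        # Default mode: a single JSON array of objects.
        records = json.loads(text)
    return pd.DataFrame(records)


# query_params may also be a callable invoked with no arguments, which is
# handy for pagination between successive polls.
_page = itertools.count(1)


def next_page_params() -> dict:
    return {"page": str(next(_page))}


df = payload_to_df(b'{"id": 1}\n{"id": 2}\n', lines=True)
```

Passing functions like these as payload_to_df_fn and query_params lets the stage handle non-default payload formats and stateful query parameters without subclassing.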
- Attributes
df_type_str
Returns the DataFrame module that should be used for the given execution mode.
has_multi_input_ports
Indicates if this stage has multiple input ports.
has_multi_output_ports
Indicates if this stage has multiple output ports.
input_count
Returns None when there is no max input count.
input_ports
Input ports to this stage.
is_built
Indicates if this stage has been built.
is_pre_built
Indicates if this stage has been pre-built.
name
Unique name of the stage
output_ports
Output ports from this stage.
unique_name
Unique name of stage.
Methods

build(builder[, do_propagate])
Build this stage.
can_build([check_ports])
Determines if all inputs have been built allowing this node to be built.
can_pre_build([check_ports])
Determines if all inputs have been built allowing this node to be built.
compute_schema(schema)
Compute the schema for this stage based on the incoming schema from upstream stages.
get_all_input_stages()
Get all input stages to this stage.
get_all_inputs()
Get all input senders to this stage.
get_all_output_stages()
Get all output stages from this stage.
get_all_outputs()
Get all output receivers from this stage.
get_df_class()
Returns the DataFrame class that should be used for the given execution mode.
get_df_pkg()
Returns the DataFrame package that should be used for the given execution mode.
get_needed_columns()
Stages which need to have columns inserted into the dataframe should populate the self._needed_columns dictionary with a mapping of column names to morpheus.common.TypeId.
is_stop_requested()
Returns True if a stop has been requested.
join()
Awaitable method that stages can implement to perform cleanup steps when the pipeline is stopped.
request_stop()
Request the source to stop processing data.
set_needed_columns(needed_columns)
Sets the columns needed to perform preallocation.
start_async()
This function is called along with on_start during stage initialization.
stop()
This method is invoked by the pipeline whenever there is an unexpected shutdown.
supported_execution_modes()
Returns a tuple of supported execution modes of this stage.
supports_cpp_node()
Indicates whether or not this stage supports a C++ implementation.
- _build(builder, input_nodes)[source]
This function is responsible for constructing this stage’s internal mrc.SegmentObject object. The input of this function contains the returned value from the upstream stage. The input values are the mrc.Builder for this stage and a list of parent nodes.
- Parameters
  - builder : mrc.Builder
    mrc.Builder object for the pipeline. This should be used to construct/attach the internal mrc.SegmentObject.
  - input_nodes : list[mrc.SegmentObject]
    List containing the input mrc.SegmentObject objects.
- Returns
  - list[mrc.SegmentObject]
    List containing the output mrc.SegmentObject objects from this stage.
- _build_source(builder)[source]
  Abstract method all derived Source classes should implement. Returns the same value as build.
  - Returns
    - mrc.SegmentObject
      The MRC node for this stage.
- _build_sources(builder)[source]
  Abstract method all derived Source classes should implement. Returns the same value as build.
  - Returns
    - mrc.SegmentObject
      The MRC nodes for this stage.
- build(builder, do_propagate=True)[source]
  Build this stage.
  - Parameters
    - builder : mrc.Builder
      MRC segment for this stage.
    - do_propagate : bool, optional
      Whether to propagate to build output stages, by default True.
- can_build(check_ports=False)[source]
  Determines if all inputs have been built allowing this node to be built.
  - Parameters
    - check_ports : bool, optional
      Check if we can build based on the input ports, by default False.
  - Returns
    - bool
      True if we can build, False otherwise.
- can_pre_build(check_ports=False)[source]
  Determines if all inputs have been built allowing this node to be built.
  - Parameters
    - check_ports : bool, optional
      Check if we can build based on the input ports, by default False.
  - Returns
    - bool
      True if we can build, False otherwise.
- compute_schema(schema)[source]
  Compute the schema for this stage based on the incoming schema from upstream stages.
  Incoming schema and type information from upstream stages is available via the schema.input_schemas and schema.input_types properties.
  Derived classes need to override this method and can set the output type(s) on schema by calling set_type for all output ports. For example, a simple pass-through stage might perform the following:
  >>> for (port_idx, port_schema) in enumerate(schema.input_schemas):
  ...     schema.output_schemas[port_idx].set_type(port_schema.get_type())
  If the port types in upstream_schema are incompatible, the stage should raise a RuntimeError.
- property df_type_str: Literal['cudf', 'pandas']
Returns the DataFrame module that should be used for the given execution mode.
- get_all_input_stages()[source]
  Get all input stages to this stage.
  - Returns
    - list[morpheus.pipeline.pipeline.StageBase]
      All input stages.
- get_all_inputs()[source]
  Get all input senders to this stage.
  - Returns
    - list[morpheus.pipeline.pipeline.Sender]
      All input senders.
- get_all_output_stages()[source]
  Get all output stages from this stage.
  - Returns
    - list[morpheus.pipeline.pipeline.StageBase]
      All output stages.
- get_all_outputs()[source]
  Get all output receivers from this stage.
  - Returns
    - list[morpheus.pipeline.pipeline.Receiver]
      All output receivers.
- get_df_class()[source]
Returns the DataFrame class that should be used for the given execution mode.
- get_df_pkg()[source]
Returns the DataFrame package that should be used for the given execution mode.
- get_needed_columns()[source]
  Stages which need to have columns inserted into the dataframe should populate the self._needed_columns dictionary with a mapping of column names to morpheus.common.TypeId. This will ensure that the columns are allocated and populated with null values.
- property has_multi_input_ports: bool
Indicates if this stage has multiple input ports.
- Returns
- bool
True if stage has multiple input ports, False otherwise.
- property has_multi_output_ports: bool
Indicates if this stage has multiple output ports.
- Returns
- bool
True if stage has multiple output ports, False otherwise.
- property input_count: int
  Returns None when there is no max input count.
- property input_ports: list[morpheus.pipeline.receiver.Receiver]
  Input ports to this stage.
  - Returns
    - list[morpheus.pipeline.pipeline.Receiver]
      Input ports to this stage.
- property is_built: bool
Indicates if this stage has been built.
- Returns
- bool
True if stage is built, False otherwise.
- property is_pre_built: bool
  Indicates if this stage has been pre-built.
  - Returns
    - bool
      True if stage is pre-built, False otherwise.
- is_stop_requested()[source]
  Returns True if a stop has been requested.
  - Returns
    - bool
      True if a stop has been requested, False otherwise.
- async join()[source]
  Awaitable method that stages can implement to perform cleanup steps when the pipeline is stopped. Typically this is called after stop during a graceful shutdown, but may not be called if the pipeline terminates on its own.
- property name: str
  Unique name of the stage.
- property output_ports: list[morpheus.pipeline.sender.Sender]
  Output ports from this stage.
  - Returns
    - list[morpheus.pipeline.pipeline.Sender]
      Output ports from this stage.
- request_stop()[source]
Request the source to stop processing data.
- set_needed_columns(needed_columns)[source]
  Sets the columns needed to perform preallocation. This should only be called by the Pipeline at build time. The needed_columns should contain the entire set of columns needed by any other stage in this segment.
- async start_async()[source]
This function is called along with on_start during stage initialization. Allows stages to utilize the asyncio loop if needed.
- stop()[source]
This method is invoked by the pipeline whenever there is an unexpected shutdown. Subclasses should override this method to perform any necessary cleanup operations.
- supported_execution_modes()[source]
Returns a tuple of supported execution modes of this stage.
- supports_cpp_node()[source]
  Indicates whether or not this stage supports a C++ implementation.
- property unique_name: str
Unique name of stage. Generated by appending stage id to stage name.
- Returns
- str
Unique name of stage.