
morpheus.stages.input.appshield_source_stage.AppShieldSourceStage

class AppShieldSourceStage(c, input_glob, plugins_include, cols_include, cols_exclude=['SHA256'], watch_directory=False, max_files=-1, sort_glob=False, recursive=True, queue_max_size=128, batch_timeout=5.0, encoding='latin1')[source]

Bases: morpheus.pipeline.preallocator_mixin.PreallocatorMixin, morpheus.pipeline.single_output_source.SingleOutputSource

This source stage loads AppShield messages from one or more plugins into a dataframe. It normalizes nested JSON messages and arranges them into a dataframe by snapshot and source.

Parameters
c : morpheus.config.Config

Pipeline configuration instance.

input_glob : str

Input glob pattern to match files to read. For example, /input_dir/<source>/snapshot-*/*.json would read all files with the 'json' extension under the input_dir directory.

plugins_include : List[str], default = None

AppShield plugins from which to extract data.

cols_include : List[str], default = None

Raw features to extract from the AppShield plugin data.

cols_exclude : List[str], default = ['SHA256']

Non-essential columns to exclude from the dataframe.

watch_directory : bool, default = False

When enabled, this stage will not shut down once all files matching 'input_glob' have been read. Instead, it continues to watch the directory and processes any newly added files that match the glob.

max_files : int, default = -1

Max number of files to read. Useful for debugging to limit startup time. Default value of -1 is unlimited.

sort_glob : bool, default = False

If true, the list of files matching input_glob will be processed in sorted order.

recursive : bool, default = True

If true, events will be emitted for files in subdirectories matching input_glob.

queue_max_size : int, default = 128

Maximum queue size to hold the file paths to be processed that match input_glob.

batch_timeout : float, default = 5.0

Timeout to retrieve batch messages from the queue.

encoding : str, default = 'latin1'

Encoding used to read the files.
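To illustrate how the file-matching parameters above interact, here is a minimal stdlib sketch (not Morpheus internals): glob expansion with optional recursion, optional sorting, and an optional cap on the number of files. The function name is hypothetical.

```python
import glob


def match_input_files(input_glob, recursive=True, sort_glob=False, max_files=-1):
    """Hypothetical sketch of the documented file-matching behavior:
    expand the glob, optionally sort, then cap at max_files (-1 = unlimited)."""
    files = glob.glob(input_glob, recursive=recursive)
    if sort_glob:
        files = sorted(files)
    if max_files > 0:
        files = files[:max_files]
    return files
```

With `watch_directory=True`, the real stage additionally keeps polling for new matches after this initial pass rather than shutting down.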

Attributes
has_multi_input_ports

Indicates if this stage has multiple input ports.

has_multi_output_ports

Indicates if this stage has multiple output ports.

input_count

Return None for no max input count.

input_ports

Input ports to this stage.

is_built

Indicates if this stage has been built.

name

The name of the stage.

output_ports

Output ports from this stage.

unique_name

Unique name of stage.

Methods

batch_source_split(x, source)

Combines plugin dataframes from multiple snapshots and splits the combined dataframe per source.

build(builder[, do_propagate])

Build this stage.

can_build([check_ports])

Determines if all inputs have been built allowing this node to be built.

files_to_dfs(x, cols_include, cols_exclude, ...)

Load plugin files into a dataframe, then segment the dataframe by source.

fill_interested_cols(plugin_df, cols_include)

Fill in missing columns of interest in the plugin dataframe.

get_all_input_stages()

Get all input stages to this stage.

get_all_inputs()

Get all input senders to this stage.

get_all_output_stages()

Get all output stages from this stage.

get_all_outputs()

Get all output receivers from this stage.

get_needed_columns()

Stages which need to have columns inserted into the dataframe should populate the self._needed_columns dictionary with a mapping of column names to morpheus.common.TypeId.

join()

Awaitable method that stages can implement to perform cleanup steps when the pipeline is stopped.

load_df(filepath, cols_exclude, encoding)

Reads a file into a dataframe.

load_meta_cols(filepath_split, plugin, plugin_df)

Loads meta columns into the dataframe.

read_file_to_df(file, cols_exclude)

Read file content into a dataframe.

set_needed_columns(needed_columns)

Sets the columns needed to perform preallocation.

stop()

Stages can implement this to perform cleanup steps when pipeline is stopped.

supports_cpp_node()

Specifies whether this Stage is capable of creating C++ nodes.

_build(builder, in_ports_streams)[source]

This function is responsible for constructing this stage’s internal mrc.SegmentObject object. The input of this function contains the returned value from the upstream stage.

The input values are the mrc.Builder for this stage and a StreamPair tuple which contains the input mrc.SegmentObject object and the message data type.

Parameters
builder : mrc.Builder

mrc.Builder object for the pipeline. This should be used to construct/attach the internal mrc.SegmentObject.

in_ports_streams : morpheus.pipeline.pipeline.StreamPair

List of tuples containing the input mrc.SegmentObject object and the message data type.

Returns
typing.List[morpheus.pipeline.pipeline.StreamPair]

List of tuples containing the output mrc.SegmentObject object from this stage and the message data type.

_build_source(builder)[source]

Abstract method all derived Source classes should implement. Returns the same value as build.

Returns
morpheus.pipeline.pipeline.StreamPair:

A tuple containing the output mrc.SegmentObject object from this stage and the message data type.

static batch_source_split(x, source)[source]

Combines plugin dataframes from multiple snapshots and splits the combined dataframe per source.

Parameters
x : typing.List[pd.DataFrame]

Dataframes from multiple sources.

source : str

Name of the source column to group by.

Returns
typing.Dict[str, pandas.DataFrame]

Grouped dataframes by source.
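The grouping semantics of batch_source_split can be sketched with plain Python, using lists of dicts as stand-ins for pandas dataframes (the function name and row layout here are illustrative, not the Morpheus implementation):

```python
from collections import defaultdict


def batch_source_split_sketch(frames, source="source"):
    """Illustrative stand-in for batch_source_split: combine rows from many
    snapshot 'dataframes' (lists of dicts) and regroup them per source value."""
    grouped = defaultdict(list)
    for frame in frames:  # one frame per snapshot/plugin file
        for row in frame:
            grouped[row[source]].append(row)
    return dict(grouped)
```

The real method performs the same combine-then-split operation on pandas dataframes, returning a dict keyed by source.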

build(builder, do_propagate=True)[source]

Build this stage.

Parameters
builder : mrc.Builder

MRC segment for this stage.

do_propagate : bool, optional

Whether to propagate to build output stages, by default True.

can_build(check_ports=False)[source]

Determines if all inputs have been built allowing this node to be built.

Parameters
check_ports : bool, optional

Check if we can build based on the input ports, by default False.

Returns
bool

True if we can build, False otherwise.

static files_to_dfs(x, cols_include, cols_exclude, plugins_include, encoding)[source]

Load plugin files into a dataframe, then segment the dataframe by source.

Parameters
x : typing.List[str]

List of file paths.

cols_include : typing.List[str]

Columns to include.

cols_exclude : typing.List[str]

Columns to exclude.

encoding : str

Encoding to read a file.

Returns
typing.Dict[str, pandas.DataFrame]

Grouped dataframes by source.

static fill_interested_cols(plugin_df, cols_include)[source]

Fill in missing columns of interest in the plugin dataframe.

Parameters
plugin_df : pandas.DataFrame

Snapshot plugin dataframe.

cols_include : typing.List[str]

Columns to be included.

Returns
pandas.DataFrame

The dataframe with the columns added.
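The intent of fill_interested_cols can be sketched with plain dicts instead of a pandas dataframe. This is an illustrative stand-in, and the assumption that the output is restricted to the requested column set is ours, not stated by the source:

```python
def fill_interested_cols_sketch(rows, cols_include):
    """Sketch of fill_interested_cols semantics: ensure every requested
    column exists on each row, padding missing ones with None, and keep
    only the requested columns (an assumption about the final layout)."""
    return [{col: row.get(col) for col in cols_include} for row in rows]
```

In pandas terms, this corresponds to reindexing the dataframe's columns against cols_include so downstream stages see a consistent schema.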

get_all_input_stages()[source]

Get all input stages to this stage.

Returns
typing.List[morpheus.pipeline.pipeline.StreamWrapper]

All input stages.

get_all_inputs()[source]

Get all input senders to this stage.

Returns
typing.List[morpheus.pipeline.pipeline.Sender]

All input senders.

get_all_output_stages()[source]

Get all output stages from this stage.

Returns
typing.List[morpheus.pipeline.pipeline.StreamWrapper]

All output stages.

get_all_outputs()[source]

Get all output receivers from this stage.

Returns
typing.List[morpheus.pipeline.pipeline.Receiver]

All output receivers.

get_needed_columns()[source]

Stages which need to have columns inserted into the dataframe should populate the self._needed_columns dictionary with a mapping of column names to morpheus.common.TypeId. This will ensure that the columns are allocated and populated with null values.

property has_multi_input_ports: bool

Indicates if this stage has multiple input ports.

Returns
bool

True if stage has multiple input ports, False otherwise.

property has_multi_output_ports: bool

Indicates if this stage has multiple output ports.

Returns
bool

True if stage has multiple output ports, False otherwise.

property input_count: int

Return None for no max input count.

property input_ports: List[morpheus.pipeline.receiver.Receiver]

Input ports to this stage.

Returns
typing.List[morpheus.pipeline.pipeline.Receiver]

Input ports to this stage.

property is_built: bool

Indicates if this stage has been built.

Returns
bool

True if stage is built, False otherwise.

async join()[source]

Awaitable method that stages can implement to perform cleanup steps when the pipeline is stopped. Typically this is called after stop during a graceful shutdown, but may not be called if the pipeline is terminated on its own.

static load_df(filepath, cols_exclude, encoding)[source]

Reads a file into a dataframe.

Parameters
filepath : str

Path to a file.

cols_exclude : typing.List[str]

Columns to exclude.

encoding : str

Encoding to read a file.

Returns
pandas.DataFrame

The parsed dataframe.

Raises
JSONDecodeError

If unable to decode the JSON file.
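The behavior documented for load_df (read with a given encoding, drop excluded columns, raise json.JSONDecodeError on malformed input) can be sketched as follows. This is a hypothetical flat-record version; the real AppShield files contain nested JSON that the stage normalizes:

```python
import json


def load_df_sketch(filepath, cols_exclude, encoding="latin1"):
    """Hypothetical sketch: parse a JSON plugin file with the given encoding
    and drop excluded columns. json.load raises json.JSONDecodeError on
    malformed input, mirroring the documented Raises behavior."""
    with open(filepath, encoding=encoding) as f:
        records = json.load(f)  # assumed here to be a list of flat records
    return [
        {k: v for k, v in rec.items() if k not in cols_exclude}
        for rec in records
    ]
```

Dropping SHA256 by default (per cols_exclude) keeps large, non-essential hash strings out of the working dataframe.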

static load_meta_cols(filepath_split, plugin, plugin_df)[source]

Loads meta columns to dataframe.

Parameters
filepath_split : typing.List[str]

Components of the split file path.

plugin : str

Plugin name to which the data belongs.

plugin_df : pandas.DataFrame

Snapshot plugin dataframe to which the meta columns are added.

Returns
pandas.DataFrame

The parsed dataframe.
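Given the documented path layout input_dir/<source>/snapshot-*/*.json, load_meta_cols plausibly derives metadata from the path components. The following is a sketch under that assumption; the output column names (source, snapshot_id, plugin) are ours, not confirmed by this page:

```python
def load_meta_cols_sketch(filepath_split, plugin, rows):
    """Sketch of attaching metadata columns, assuming the documented layout
    input_dir/<source>/snapshot-<id>/<file>.json, so the source and snapshot
    directories are the 3rd- and 2nd-to-last path components."""
    source = filepath_split[-3]
    snapshot = filepath_split[-2]
    for row in rows:  # rows stand in for pandas dataframe rows
        row["source"] = source
        row["snapshot_id"] = snapshot
        row["plugin"] = plugin
    return rows
```

These derived columns are what later allow batch_source_split to regroup rows per source.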

property name: str

The name of the stage. Used in logging. Each derived class should override this property with a unique name.

Returns
str

Name of a stage.

property output_ports: List[morpheus.pipeline.sender.Sender]

Output ports from this stage.

Returns
typing.List[morpheus.pipeline.pipeline.Sender]

Output ports from this stage.

static read_file_to_df(file, cols_exclude)[source]

Read file content into a dataframe.

Parameters
file : io.TextIOWrapper

Input file object.

cols_exclude : typing.List[str]

Columns to drop from the dataframe.

Returns
pandas.DataFrame

The dataframe with the columns added.

set_needed_columns(needed_columns)[source]

Sets the columns needed to perform preallocation. This should only be called by the Pipeline at build time. The needed_columns should contain the entire set of columns needed by any other stage in this segment.

stop()[source]

Stages can implement this to perform cleanup steps when pipeline is stopped.

supports_cpp_node()[source]

Specifies whether this Stage is capable of creating C++ nodes. During the build phase, this value will be combined with CppConfig.get_should_use_cpp() to determine whether or not a C++ node is created. This is an instance method to allow runtime decisions and derived classes to override base implementations.

property unique_name: str

Unique name of stage. Generated by appending stage id to stage name.

Returns
str

Unique name of stage.

© Copyright 2023, NVIDIA. Last updated on Apr 11, 2023.