PluginV2
Adds a node with custom functionality to a TensorRT network.
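Working with a plugin follows the same pattern regardless of the plugin: look up the plugin's creator in the plugin registry, build a PluginFieldCollection describing the plugin's fields, create the plugin, and attach it to the network with add_plugin_v2. A minimal sketch of that pattern is shown below; the plugin name "MyPlugin" and its "scale" field are hypothetical placeholders, not a plugin shipped with TensorRT.

import numpy as np
import tensorrt as trt

def add_plugin_layer(network, inputs):
    # 1. Find the creator registered under the (hypothetical) plugin name.
    for creator in trt.get_plugin_registry().plugin_creator_list:
        if creator.name == "MyPlugin":
            # 2. Describe the plugin's fields ("scale" is a hypothetical field) and create the plugin.
            scale = trt.PluginField("scale", np.array([2.0], dtype=np.float32), type=trt.PluginFieldType.FLOAT32)
            plugin = creator.create_plugin(name="MyPlugin", field_collection=trt.PluginFieldCollection([scale]))
            # 3. Add the plugin as a layer of the network.
            return network.add_plugin_v2(inputs=inputs, plugin=plugin)
    raise RuntimeError("MyPlugin not found in the plugin registry")

The fields a given plugin accepts are defined by its creator; each creator exposes them through its field_names attribute.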
Attributes
Plugin-specific attributes.
Inputs
Plugin-specific inputs.
Outputs
Plugin-specific outputs.
Data Types
Plugin-specific data types.
Shape Information
Plugin-specific shapes.
Volume Limits
Plugin-specific limits.
DLA Support
Plugin-specific restrictions.
Examples
Plugin
import math

import numpy as np
import tensorrt as trt

# `network`, `inputs`, `outputs`, and `expected` are assumed to be provided by the
# surrounding example harness, as in the other operator examples.

def init_gelu_plugin(bias):
    # Search the global plugin registry for the GELU plugin creator.
    for plugin_creator in trt.get_plugin_registry().plugin_creator_list:
        if plugin_creator.name == "CustomGeluPluginDynamic":
            # type_id selects the plugin's compute precision (0 = FP32); the field
            # data is int32, so the field type is INT32.
            gelu_data_type = trt.PluginField("type_id", np.array([0], dtype=np.int32), type=trt.PluginFieldType.INT32)
            gelu_bias = trt.PluginField("bias", bias, type=trt.PluginFieldType.FLOAT32)
            field_collection = trt.PluginFieldCollection([gelu_data_type, gelu_bias])
            return plugin_creator.create_plugin(name="CustomGeluPluginDynamic", field_collection=field_collection)
    raise Exception("CustomGeluPluginDynamic plugin not found")

bias = np.array([[[1, -0.5, 0, 0, 0.7]]], dtype=np.float32)
input_layer = network.add_input(name="input_layer", dtype=trt.float32, shape=(1, 1, 5))
gelu = network.add_plugin_v2(inputs=[input_layer], plugin=init_gelu_plugin(bias))
network.mark_output(gelu.get_output(0))

inputs[input_layer.name] = np.array(
    [[
        [3.0, -4.3, 22.8, -2.97, 143.2],
    ]]
)

outputs[gelu.get_output(0).name] = gelu.get_output(0).shape

# The plugin adds the bias and applies the tanh approximation of GELU.
input_with_bias = inputs[input_layer.name] + bias
expected_output = 0.5 * input_with_bias * (1 + np.tanh(np.sqrt(2 / math.pi) * (input_with_bias + 0.044715 * np.power(input_with_bias, 3))))
expected[gelu.get_output(0).name] = expected_output
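For reference, the expected output above is the standard tanh approximation of GELU, applied elementwise to the biased input x = input + bias:

GELU(x) ≈ 0.5 · x · (1 + tanh(√(2/π) · (x + 0.044715 · x³)))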
C++ API
For more information about the C++ IPluginV2Layer operator, refer to the C++ IPluginV2Layer documentation.
Python API
For more information about the Python IPluginV2Layer operator, refer to the Python IPluginV2Layer documentation.