Activation¶
Apply an activation function to an input tensor A and produce an output tensor B with the same dimensions.
See also
PRelu, SoftMax
Attributes¶
type: the activation function to apply. It can be one of the following:
RELU: \(output = \max(0, input)\)
SIGMOID: \(output = \frac{1}{1 + e^{-input}}\)
TANH: \(output = \frac{1 - e^{-2 \cdot input}}{1 + e^{-2 \cdot input}}\)
LEAKY_RELU: \(output = input \text{ if } input \geq 0 \text{ else } \alpha \cdot input\)
ELU: \(output = input \text{ if } input \geq 0 \text{ else } \alpha \cdot (e^{input} - 1)\)
SELU: \(output = \beta \cdot input \text{ if } input \geq 0 \text{ else } \beta \cdot (\alpha \cdot e^{input} - \alpha)\)
SOFTSIGN: \(output = \frac{input}{1 + |input|}\)
SOFTPLUS: \(output = \alpha \cdot \log(e^{\beta \cdot input} + 1)\)
CLIP: \(output = \max(\alpha, \min(\beta, input))\)
HARD_SIGMOID: \(output = \max(0, \min(1, \alpha \cdot input + \beta))\)
SCALED_TANH: \(output = \alpha \cdot \tanh(\beta \cdot input)\)
THRESHOLDED_RELU: \(output = \max(0, input - \alpha)\)
alpha: parameter used when the activation function is one of: LEAKY_RELU, ELU, SELU, SOFTPLUS, CLIP, HARD_SIGMOID, SCALED_TANH, THRESHOLDED_RELU
beta: parameter used when the activation function is one of: SELU, SOFTPLUS, CLIP, HARD_SIGMOID, SCALED_TANH
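In the Python API, alpha and beta are set as attributes on the layer object after it is created. A minimal sketch, assuming an existing network definition named network; the 0.1 slope is an illustrative value, not a default:

# Sketch: LEAKY_RELU with an illustrative negative-side slope of 0.1.
# Assumes `network` is an existing trt.INetworkDefinition.
in1 = network.add_input("input1", dtype=trt.float32, shape=(2, 3))
layer = network.add_activation(in1, type=trt.ActivationType.LEAKY_RELU)
layer.alpha = 0.1  # output = input if input >= 0 else 0.1 * input
network.mark_output(layer.get_output(0))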
Inputs¶
input: tensor of type T1
Outputs¶
output: tensor of type T1
Data Types¶
T1: int8, float16, float32, bfloat16
Note:
int32 and int64 are also supported, but only for RELU.
Shape Information¶
The output has the same shape as the input.
DLA Support¶
DLA FP16 and DLA INT8 are supported.
DLA supports the following activation types:
CLIP, where \(\alpha = 0\) and \(\beta \leq 127\)
RELU
SIGMOID
TANH
LEAKY_RELU
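Offloading to DLA is selected on the builder configuration rather than on the layer itself. A minimal sketch, assuming a builder configuration object named config created with builder.create_builder_config(); the core index 0 is an assumption of this example:

# Sketch: prefer DLA with FP16 precision, falling back to the GPU for any
# layer DLA cannot run. Assumes `config` is a trt.IBuilderConfig.
config.default_device_type = trt.DeviceType.DLA
config.DLA_core = 0  # illustrative core index
config.set_flag(trt.BuilderFlag.FP16)
config.set_flag(trt.BuilderFlag.GPU_FALLBACK)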
Examples¶
Activation
# Build a network with a single activation layer applying RELU elementwise.
in1 = network.add_input("input1", dtype=trt.float32, shape=(2, 3))
layer = network.add_activation(in1, type=trt.ActivationType.RELU)
network.mark_output(layer.get_output(0))

# RELU zeroes the negative entries and passes non-negative entries through.
inputs[in1.name] = np.array([[-3.0, -2.0, -1.0], [0.0, 1.0, 2.0]])
outputs[layer.get_output(0).name] = layer.get_output(0).shape
expected[layer.get_output(0).name] = np.array([[0.0, 0.0, 0.0], [0.0, 1.0, 2.0]])
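The expected values can be cross-checked against the RELU formula above with plain NumPy, independently of TensorRT:

import numpy as np

# RELU is max(0, input) applied elementwise.
x = np.array([[-3.0, -2.0, -1.0], [0.0, 1.0, 2.0]])
print(np.maximum(0.0, x))
# [[0. 0. 0.]
#  [0. 1. 2.]]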
C++ API¶
For more information about the C++ IActivationLayer operator, refer to the C++ IActivationLayer documentation.
Python API¶
For more information about the Python IActivationLayer operator, refer to the Python IActivationLayer documentation.