TensorRT 7.1.3.0
An Activation layer in a network definition.
#include <NvInfer.h>
Public Member Functions

virtual void setActivationType(ActivationType type) = 0
    Set the type of activation to be performed.

virtual ActivationType getActivationType() const = 0
    Get the type of activation to be performed.

virtual void setAlpha(float alpha) = 0
    Set the alpha parameter (must be finite).

virtual void setBeta(float beta) = 0
    Set the beta parameter (must be finite).

virtual float getAlpha() const = 0
    Get the alpha parameter.

virtual float getBeta() const = 0
    Get the beta parameter.
Public Member Functions inherited from nvinfer1::ILayer

virtual LayerType getType() const = 0
    Return the type of a layer.

virtual void setName(const char *name) = 0
    Set the name of a layer.

virtual const char *getName() const = 0
    Return the name of a layer.

virtual int getNbInputs() const = 0
    Get the number of inputs of a layer.

virtual ITensor *getInput(int index) const = 0
    Get the layer input corresponding to the given index.

virtual int getNbOutputs() const = 0
    Get the number of outputs of a layer.

virtual ITensor *getOutput(int index) const = 0
    Get the layer output corresponding to the given index.

virtual void setInput(int index, ITensor &tensor) = 0
    Replace an input of this layer with a specific tensor.

virtual void setPrecision(DataType dataType) = 0
    Set the computational precision of this layer.

virtual DataType getPrecision() const = 0
    Get the computational precision of this layer.

virtual bool precisionIsSet() const = 0
    Return whether the computational precision has been set for this layer.

virtual void resetPrecision() = 0
    Reset the computational precision for this layer.

virtual void setOutputType(int index, DataType dataType) = 0
    Set the output type of this layer.

virtual DataType getOutputType(int index) const = 0
    Get the output type of this layer.

virtual bool outputTypeIsSet(int index) const = 0
    Return whether the output type has been set for this layer.

virtual void resetOutputType(int index) = 0
    Reset the output type for this layer.
An Activation layer in a network definition.
This layer applies a per-element activation function to its input.
The output has the same shape as the input.
getActivationType() [pure virtual]
    Get the type of activation to be performed.
getAlpha() [pure virtual]
    Get the alpha parameter.
getBeta() [pure virtual]
    Get the beta parameter.
setActivationType(ActivationType type) [pure virtual]
    Set the type of activation to be performed.
    On the DLA, the valid activation types are kRELU, kSIGMOID, kTANH, and kCLIP.
setAlpha(float alpha) [pure virtual]
    Set the alpha parameter (must be finite).
    This parameter is used by the following activations: LeakyRelu, Elu, Selu, Softplus, Clip, HardSigmoid, ScaledTanh, ThresholdedRelu. It is ignored by the other activations.
setBeta(float beta) [pure virtual]
    Set the beta parameter (must be finite).
    This parameter is used by the following activations: Selu, Softplus, Clip, HardSigmoid, ScaledTanh. It is ignored by the other activations.