TensorRT 8.2.5
nvinfer1::IActivationLayer Class Reference

An Activation layer in a network definition. More...

#include <NvInfer.h>

Inheritance diagram for nvinfer1::IActivationLayer:
nvinfer1::IActivationLayer derives from nvinfer1::ILayer, which derives from nvinfer1::INoCopy.

Public Member Functions

void setActivationType (ActivationType type) noexcept
 Set the type of activation to be performed.
 
ActivationType getActivationType () const noexcept
 Get the type of activation to be performed.
 
void setAlpha (float alpha) noexcept
 Set the alpha parameter (must be finite).
 
void setBeta (float beta) noexcept
 Set the beta parameter (must be finite).
 
float getAlpha () const noexcept
 Get the alpha parameter.
 
float getBeta () const noexcept
 Get the beta parameter.
 
- Public Member Functions inherited from nvinfer1::ILayer
LayerType getType () const noexcept
 Return the type of a layer.
 
void setName (const char *name) noexcept
 Set the name of a layer.
 
const char * getName () const noexcept
 Return the name of a layer.
 
int32_t getNbInputs () const noexcept
 Get the number of inputs of a layer.
 
ITensor * getInput (int32_t index) const noexcept
 Get the layer input corresponding to the given index.
 
int32_t getNbOutputs () const noexcept
 Get the number of outputs of a layer.
 
ITensor * getOutput (int32_t index) const noexcept
 Get the layer output corresponding to the given index.
 
void setInput (int32_t index, ITensor &tensor) noexcept
 Replace an input of this layer with a specific tensor.
 
void setPrecision (DataType dataType) noexcept
 Set the computational precision of this layer.
 
DataType getPrecision () const noexcept
 Get the computational precision of this layer.
 
bool precisionIsSet () const noexcept
 Return whether the computational precision has been set for this layer.
 
void resetPrecision () noexcept
 Reset the computational precision for this layer.
 
void setOutputType (int32_t index, DataType dataType) noexcept
 Set the output type of this layer.
 
DataType getOutputType (int32_t index) const noexcept
 Get the output type of this layer.
 
bool outputTypeIsSet (int32_t index) const noexcept
 Return whether the output type has been set for this layer.
 
void resetOutputType (int32_t index) noexcept
 Reset the output type for this layer.
 

Protected Attributes

apiv::VActivationLayer * mImpl
 
- Protected Attributes inherited from nvinfer1::ILayer
apiv::VLayer * mLayer
 

Additional Inherited Members

- Protected Member Functions inherited from nvinfer1::INoCopy
 INoCopy (const INoCopy &other)=delete
 
INoCopy & operator= (const INoCopy &other)=delete
 
 INoCopy (INoCopy &&other)=delete
 
INoCopy & operator= (INoCopy &&other)=delete
 

Detailed Description

An Activation layer in a network definition.

This layer applies a per-element activation function to its input.

The output has the same shape as the input.

Warning
Do not inherit from this class, as doing so will break forward-compatibility of the API and ABI.
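
Typical usage is to add the layer to a network via INetworkDefinition::addActivation() and then configure its parameters. A minimal sketch, assuming `network` is an existing INetworkDefinition* and `input` an ITensor* (error handling omitted):

```cpp
// Clip the input element-wise to the range [0, 6] (a "ReLU6").
// For kCLIP, alpha is the lower clip bound and beta is the upper clip bound.
nvinfer1::IActivationLayer* act =
    network->addActivation(*input, nvinfer1::ActivationType::kCLIP);
act->setAlpha(0.0f);  // lower bound
act->setBeta(6.0f);   // upper bound
act->getOutput(0)->setName("relu6_out");
```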

Member Function Documentation

◆ getActivationType()

ActivationType nvinfer1::IActivationLayer::getActivationType ( ) const
inline noexcept

Get the type of activation to be performed.

See also
setActivationType(), ActivationType

◆ getAlpha()

float nvinfer1::IActivationLayer::getAlpha ( ) const
inline noexcept

Get the alpha parameter.

See also
getBeta(), setAlpha()

◆ getBeta()

float nvinfer1::IActivationLayer::getBeta ( ) const
inline noexcept

Get the beta parameter.

See also
getAlpha(), setBeta()

◆ setActivationType()

void nvinfer1::IActivationLayer::setActivationType ( ActivationType  type)
inline noexcept

Set the type of activation to be performed.

On the DLA, the valid activation types are kRELU, kSIGMOID, kTANH, and kCLIP.

See also
getActivationType(), ActivationType

◆ setAlpha()

void nvinfer1::IActivationLayer::setAlpha ( float  alpha)
inline noexcept

Set the alpha parameter (must be finite).

This parameter is used by the following activations: LeakyRelu, Elu, Selu, Softplus, Clip, HardSigmoid, ScaledTanh, ThresholdedRelu.

It is ignored by the other activations.

See also
getAlpha(), setBeta()

◆ setBeta()

void nvinfer1::IActivationLayer::setBeta ( float  beta)
inline noexcept

Set the beta parameter (must be finite).

This parameter is used by the following activations: Selu, Softplus, Clip, HardSigmoid, ScaledTanh.

It is ignored by the other activations.

See also
getBeta(), setAlpha()

The documentation for this class was generated from the following file:
NvInfer.h