core.activations#
Module Contents#
Functions#
| Function | Description |
|---|---|
| `squared_relu` | Squared ReLU activation |
| `quick_gelu` | Quick GELU activation |
| `fast_gelu` | Fast GELU activation |
API#
- core.activations.squared_relu(x: torch.Tensor) → torch.Tensor#
Squared ReLU activation
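The page lists only the signature. A minimal sketch, assuming the standard definition of squared ReLU, `relu(x) ** 2` (the module's actual implementation may differ):

```python
import torch

def squared_relu(x: torch.Tensor) -> torch.Tensor:
    """Squared ReLU: relu(x) ** 2 — zero for x <= 0, x**2 otherwise."""
    return torch.relu(x) ** 2

# squared_relu(torch.tensor([-1.0, 2.0])) -> tensor([0., 4.])
```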
- core.activations.quick_gelu(x: torch.Tensor) → torch.Tensor#
Quick GELU activation
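No implementation is shown here. A plausible sketch, assuming the widely used "quick GELU" approximation `x * sigmoid(1.702 * x)`; this is an assumption about the definition, not confirmed by the source:

```python
import torch

def quick_gelu(x: torch.Tensor) -> torch.Tensor:
    """Quick GELU: x * sigmoid(1.702 * x), a cheap sigmoid-based GELU approximation."""
    return x * torch.sigmoid(1.702 * x)
```

The constant 1.702 is the usual choice that makes the sigmoid closely track the Gaussian CDF used by exact GELU.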
- core.activations.fast_gelu(x: torch.Tensor) → torch.Tensor#
Fast GELU activation
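Again only the signature is documented. A minimal sketch, assuming the common tanh-based GELU approximation (Hendrycks & Gimpel, 2016) that implementations typically label "fast GELU"; the module's exact formula may differ:

```python
import math
import torch

def fast_gelu(x: torch.Tensor) -> torch.Tensor:
    """Fast GELU: the tanh approximation
    0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x**3)))."""
    return 0.5 * x * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x.pow(3))))
```

This approximation stays within about 1e-3 of exact GELU while avoiding the error-function evaluation.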