Enum RnnActivation
Options for activation functions to apply in a recurrent layer.
Syntax
public enum RnnActivation
Fields
| Name | Description | Value |
| --- | --- | --- |
| Relu | Use Relu activation: f(x) = max(0, x). | 0 |
| Tanh | Use Tanh activation: f(x) = (1 - e^{-2x}) / (1 + e^{-2x}). | 1 |
| Sigmoid | Use Sigmoid activation: f(x) = 1 / (1 + e^{-x}). | 2 |
| Affine | Use Affine activation: f(x) = alpha * x + beta. | 3 |
| LeakyRelu | Use LeakyRelu activation: f(x) = x if x >= 0, otherwise f(x) = alpha * x. | 4 |
| ThresholdedRelu | Use ThresholdedRelu activation: f(x) = x if x >= alpha, otherwise f(x) = 0. | 5 |
| ScaledTanh | Use ScaledTanh activation: f(x) = alpha * tanh(beta * x). | 6 |
| HardSigmoid | Use HardSigmoid activation: f(x) = clamp(alpha * x + beta, 0, 1). | 7 |
| Elu | Use Elu activation: f(x) = x if x >= 0, otherwise f(x) = alpha * (e^x - 1). | 8 |
| Softsign | Use Softsign activation: f(x) = x / (1 + \|x\|). | 9 |
| Softplus | Use Softplus activation: f(x) = log(1 + e^x). | 10 |
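The formulas in the table can be checked numerically. A minimal Python sketch of each activation follows; the alpha and beta defaults used here are illustrative assumptions, not the library's defaults, and this is not the library's implementation:

```python
import math

# Each function mirrors the formula given in the table above.
def relu(x): return max(0.0, x)
def tanh_(x): return (1 - math.exp(-2 * x)) / (1 + math.exp(-2 * x))
def sigmoid(x): return 1 / (1 + math.exp(-x))
def affine(x, alpha=1.0, beta=0.0): return alpha * x + beta
def leaky_relu(x, alpha=0.01): return x if x >= 0 else alpha * x
def thresholded_relu(x, alpha=1.0): return x if x >= alpha else 0.0
def scaled_tanh(x, alpha=1.0, beta=1.0): return alpha * math.tanh(beta * x)
def hard_sigmoid(x, alpha=0.2, beta=0.5):
    # clamp(alpha * x + beta, 0, 1)
    return min(1.0, max(0.0, alpha * x + beta))
def elu(x, alpha=1.0): return x if x >= 0 else alpha * (math.exp(x) - 1)
def softsign(x): return x / (1 + abs(x))
def softplus(x): return math.log(1 + math.exp(x))

print(relu(-2.0))     # 0.0
print(sigmoid(0.0))   # 0.5
print(softsign(3.0))  # 0.75
```

Note that the Tanh formula (1 - e^{-2x}) / (1 + e^{-2x}) is algebraically equal to the standard hyperbolic tangent, so `tanh_` agrees with `math.tanh`.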