Enum RnnActivation
Options for activation functions to apply in a recurrent layer.
Assembly: Unity.Sentis.dll
Syntax
public enum RnnActivation
Fields
| Name | Description |
| --- | --- |
| Affine | Use Affine activation: f(x) = alpha * x + beta. |
| Elu | Use Elu activation: f(x) = x if x >= 0, otherwise f(x) = alpha * (e^x - 1). |
| HardSigmoid | Use HardSigmoid activation: f(x) = clamp(alpha * x + beta, 0, 1). |
| LeakyRelu | Use LeakyRelu activation: f(x) = x if x >= 0, otherwise f(x) = alpha * x. |
| Relu | Use Relu activation: f(x) = max(0, x). |
| ScaledTanh | Use ScaledTanh activation: f(x) = alpha * tanh(beta * x). |
| Sigmoid | Use Sigmoid activation: f(x) = 1 / (1 + e^{-x}). |
| Softplus | Use Softplus activation: f(x) = log(1 + e^x). |
| Softsign | Use Softsign activation: f(x) = x / (1 + \|x\|). |
| Tanh | Use Tanh activation: f(x) = tanh(x) = (1 - e^{-2x}) / (1 + e^{-2x}). |
| ThresholdedRelu | Use ThresholdedRelu activation: f(x) = x if x >= alpha, otherwise f(x) = 0. |
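For reference, a minimal C# sketch that evaluates each formula from the table above on a single value. This is illustrative only: the `Apply` helper, its `alpha`/`beta` defaults, and the `using Unity.Sentis;` directive are assumptions (adjust the namespace to the package version you use), not part of the Sentis API.

```csharp
using System;
using Unity.Sentis; // assumed namespace for RnnActivation; adjust if needed

// Illustrative reference implementation of the formulas documented above.
public static class RnnActivationReference
{
    // alpha and beta defaults are placeholders; real layers supply their own values.
    public static float Apply(RnnActivation activation, float x, float alpha = 1f, float beta = 0f)
    {
        switch (activation)
        {
            case RnnActivation.Affine:          return alpha * x + beta;
            case RnnActivation.Elu:             return x >= 0f ? x : alpha * (MathF.Exp(x) - 1f);
            case RnnActivation.HardSigmoid:     return Math.Clamp(alpha * x + beta, 0f, 1f);
            case RnnActivation.LeakyRelu:       return x >= 0f ? x : alpha * x;
            case RnnActivation.Relu:            return MathF.Max(0f, x);
            case RnnActivation.ScaledTanh:      return alpha * MathF.Tanh(beta * x);
            case RnnActivation.Sigmoid:         return 1f / (1f + MathF.Exp(-x));
            case RnnActivation.Softplus:        return MathF.Log(1f + MathF.Exp(x));
            case RnnActivation.Softsign:        return x / (1f + MathF.Abs(x));
            case RnnActivation.Tanh:            return MathF.Tanh(x);
            case RnnActivation.ThresholdedRelu: return x >= alpha ? x : 0f;
            default: throw new ArgumentOutOfRangeException(nameof(activation));
        }
    }
}
```

For example, `RnnActivationReference.Apply(RnnActivation.Relu, -2f)` returns 0, matching f(x) = max(0, x).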