Activation functions¶
Common activation functions for neural networks.
Most of these are thin wrappers around standard DyNet operations
(e.g. rectify -> relu).
- dynn.activations.identity(x)¶
  The identity function
  \(y=x\)
  Parameters: x (dynet.Expression) – Input expression
  Returns: \(x\)
  Return type: dynet.Expression
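In plain-Python terms (a sketch only: the real function operates on dynet.Expression objects), identity is a no-op, typically useful as the default when a layer takes a configurable activation argument:

```python
def identity(x):
    """Identity activation: y = x.

    Handy as a "no nonlinearity" default where an activation
    function is required by a layer's interface.
    """
    return x

print(identity(3.5))  # -> 3.5
```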
- dynn.activations.relu(x)¶
  The Rectified Linear Unit (ReLU)
  \(y=\max(0,x)\)
  Parameters: x (dynet.Expression) – Input expression
  Returns: \(\max(0,x)\)
  Return type: dynet.Expression
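The elementwise behaviour can be sketched in pure Python (dynn applies the same operation to every element of a dynet.Expression; the list-based version below is illustrative, not the library's implementation):

```python
def relu(values):
    """ReLU applied elementwise: negative inputs are clamped to 0,
    non-negative inputs pass through unchanged."""
    return [max(0.0, v) for v in values]

print(relu([-2.0, 0.0, 3.0]))  # -> [0.0, 0.0, 3.0]
```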
- dynn.activations.sigmoid(x)¶
  The sigmoid function
  \(y=\frac{1}{1+e^{-x}}\)
  Parameters: x (dynet.Expression) – Input expression
  Returns: \(\frac{1}{1+e^{-x}}\)
  Return type: dynet.Expression
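A scalar reference implementation of the same formula, written in the numerically careful two-branch form so that `exp` never overflows for large \(|x|\) (again a sketch; the library version works on dynet.Expression objects):

```python
import math

def sigmoid(x):
    """Logistic sigmoid 1 / (1 + e^-x), squashing inputs into (0, 1).

    Branching on the sign of x keeps the argument of exp() non-positive,
    avoiding overflow for large-magnitude inputs.
    """
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    z = math.exp(x)
    return z / (1.0 + z)

print(sigmoid(0.0))  # -> 0.5
```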
- dynn.activations.tanh(x)¶
  The hyperbolic tangent function
  \(y=\tanh(x)\)
  Parameters: x (dynet.Expression) – Input expression
  Returns: \(\tanh(x)\)
  Return type: dynet.Expression
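For reference, tanh expands to \(\frac{e^x - e^{-x}}{e^x + e^{-x}}\), which squashes inputs into \((-1, 1)\) rather than the sigmoid's \((0, 1)\). A scalar sketch checking the definition against the standard library (illustrative only; the library version works on dynet.Expression objects):

```python
import math

def tanh_from_definition(x):
    """Hyperbolic tangent via its definition (e^x - e^-x) / (e^x + e^-x)."""
    a, b = math.exp(x), math.exp(-x)
    return (a - b) / (a + b)

# The definitional form agrees with math.tanh for moderate inputs.
print(abs(tanh_from_definition(1.0) - math.tanh(1.0)) < 1e-12)  # -> True
```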