Activation functions

Common activation functions for neural networks.

Most of these are thin wrappers around standard dynet operations (e.g. dynet.rectify -> relu).

dynn.activations.identity(x)

The identity function

\(y=x\)

Parameters: x (dynet.Expression) – Input expression
Returns: \(x\)
Return type: dynet.Expression
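
A minimal usage sketch; the surrounding DyNet setup (dy.renew_cg, dy.inputVector) and the input values are illustrative assumptions, not part of dynn:

    import dynet as dy
    from dynn.activations import identity

    dy.renew_cg()  # fresh computation graph (standard DyNet call)
    x = dy.inputVector([-1.0, 0.5, 2.0])
    y = identity(x)
    print(y.npvalue())  # the input values, unchanged
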
dynn.activations.relu(x)

The Rectified Linear Unit (ReLU)

\(y=\max(0,x)\)

Parameters: x (dynet.Expression) – Input expression
Returns: \(\max(0,x)\)
Return type: dynet.Expression
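
A short sketch under the same assumptions (dy.renew_cg, dy.inputVector and the values are illustrative), showing that negative entries are clipped to zero:

    import dynet as dy
    from dynn.activations import relu

    dy.renew_cg()  # fresh computation graph
    x = dy.inputVector([-1.0, 0.5, 2.0])
    y = relu(x)
    print(y.npvalue())  # approx [0.0, 0.5, 2.0] -- negatives clipped to 0
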
dynn.activations.sigmoid(x)

The sigmoid function

\(y=\frac{1}{1+e^{-x}}\)

Parameters: x (dynet.Expression) – Input expression
Returns: \(\frac{1}{1+e^{-x}}\)
Return type: dynet.Expression
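
A short sketch under the same assumptions, showing that outputs are squashed into \((0, 1)\) with \(\mathrm{sigmoid}(0) = 0.5\):

    import dynet as dy
    from dynn.activations import sigmoid

    dy.renew_cg()  # fresh computation graph
    x = dy.inputVector([-1.0, 0.0, 1.0])
    y = sigmoid(x)
    print(y.npvalue())  # approx [0.269, 0.5, 0.731] -- all in (0, 1)
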
dynn.activations.tanh(x)

The hyperbolic tangent function

\(y=\tanh(x)\)

Parameters: x (dynet.Expression) – Input expression
Returns: \(\tanh(x)\)
Return type: dynet.Expression
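
A short sketch under the same assumptions, showing that outputs lie in \((-1, 1)\) with \(\tanh(0) = 0\):

    import dynet as dy
    from dynn.activations import tanh

    dy.renew_cg()  # fresh computation graph
    x = dy.inputVector([-1.0, 0.0, 1.0])
    y = tanh(x)
    print(y.npvalue())  # approx [-0.762, 0.0, 0.762] -- all in (-1, 1)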