Densely connected layers

class dynn.layers.dense_layers.Affine(pc, input_dim, output_dim, activation=<function identity>, dropout=0.0, nobias=False, W=None, b=None)

Bases: dynn.layers.base_layers.ParametrizedLayer

Densely connected layer

\(y=f(Wx+b)\)

Parameters:
- pc (dynet.ParameterCollection) – Parameter collection to hold the parameters
- input_dim (int) – Input dimension
- output_dim (int) – Output dimension
- activation (function, optional) – Activation function (default: identity)
- dropout (float, optional) – Dropout rate (default: 0)
- nobias (bool, optional) – Omit the bias (default: False)
__call__(x)

Forward pass.

Parameters: x (dynet.Expression) – Input expression (a vector)
Returns: \(y=f(Wx+b)\)
Return type: dynet.Expression
__init__(pc, input_dim, output_dim, activation=<function identity>, dropout=0.0, nobias=False, W=None, b=None)

Creates a subcollection for this layer with a custom name.
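To make the forward pass concrete, here is a minimal NumPy sketch of the computation \(y=f(Wx+b)\) that Affine performs. This illustrates the math only, not the dynn/DyNet API; the function name `affine_forward` and the toy weights are invented for this example.

```python
import numpy as np

def affine_forward(W, b, x, activation=lambda v: v):
    """Reference computation for y = f(Wx + b).

    activation defaults to the identity, matching Affine's default."""
    return activation(W @ x + b)

# Toy example: map a 3-dimensional input to 2 outputs.
W = np.array([[1.0, 0.0, -1.0],
              [0.5, 0.5, 0.5]])
b = np.array([0.0, 1.0])
x = np.array([2.0, 3.0, 1.0])

y = affine_forward(W, b, x)
# W @ x = [1.0, 3.0], plus b gives y = [1.0, 4.0]
```

Passing `activation=np.tanh` would correspond to constructing the layer with a tanh nonlinearity instead of the identity.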
class dynn.layers.dense_layers.GatedLayer(pc, input_dim, output_dim, activation=<built-in function tanh>, dropout=0.0, Wo=None, bo=None, Wg=None, bg=None)

Bases: dynn.layers.base_layers.ParametrizedLayer

Gated linear layer:

\(y=(W_ox+b_o)\circ \sigma(W_gx+b_g)\)

Parameters:
- pc (dynet.ParameterCollection) – Parameter collection to hold the parameters
- input_dim (int) – Input dimension
- output_dim (int) – Output dimension
- activation (function, optional) – Activation function (default: dynet.tanh)
- dropout (float, optional) – Dropout rate (default: 0)
__call__(x)

Forward pass.

Parameters: x (dynet.Expression) – Input expression (a vector)
Returns: \(y=(W_ox+b_o)\circ \sigma(W_gx+b_g)\)
Return type: dynet.Expression
__init__(pc, input_dim, output_dim, activation=<built-in function tanh>, dropout=0.0, Wo=None, bo=None, Wg=None, bg=None)

Creates a subcollection for this layer with a custom name.
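The gating formula above can also be sketched in NumPy. The sigmoid gate \(\sigma(W_gx+b_g)\) produces values in \((0, 1)\) that scale the linear output \(W_ox+b_o\) elementwise (\(\circ\) is the Hadamard product). Again this is a sketch of the documented formula, not the dynn API; `gated_forward` and the toy parameters are invented here.

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def gated_forward(Wo, bo, Wg, bg, x):
    """Reference computation for y = (Wo x + bo) ∘ σ(Wg x + bg)."""
    return (Wo @ x + bo) * sigmoid(Wg @ x + bg)

# Toy example with the gate saturated open:
Wo = np.array([[1.0, 1.0]])
bo = np.array([0.0])
Wg = np.array([[0.0, 0.0]])
bg = np.array([100.0])   # sigmoid(100) ≈ 1, so the gate passes everything

x = np.array([2.0, 3.0])
y = gated_forward(Wo, bo, Wg, bg, x)   # ≈ [5.0]
```

With `bg` pushed strongly negative instead, the gate would squash the output toward zero, which is the mechanism that lets gated layers learn to suppress or pass features.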