Base layer

class dynn.layers.base_layers.BaseLayer(name)

Bases: object

Base layer interface

__call__(*args, **kwargs)

Execute forward pass

__init__(name)

Initialize self. See help(type(self)) for accurate signature.

__weakref__

list of weak references to the object (if defined)

init(test=True, update=False)

Initialize the layer before performing computation

For example, set up dropout, freeze some parameters, etc.

init_layer(test=True, update=False)

Initializes only this layer’s parameters (not recursive). This needs to be implemented for each layer.

sublayers

Returns all attributes of the layer which are layers themselves
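
A minimal sketch of the interface (the MyDropout layer and its name are made up for illustration; only dynet and the methods documented here are assumed): a parameter-free layer implements __call__ for the forward pass and can use init_layer to react to the test flag.

    import dynet as dy
    from dynn.layers.base_layers import BaseLayer


    class MyDropout(BaseLayer):
        """Hypothetical parameter-free layer (illustration only)."""

        def __init__(self, p=0.5):
            super().__init__("my-dropout")
            self.p = p
            self.test_mode = True

        def init_layer(self, test=True, update=False):
            # Remember whether we are at test time (dropout disabled)
            self.test_mode = test

        def __call__(self, x):
            # Forward pass: apply dropout only during training
            return x if self.test_mode else dy.dropout(x, self.p)


    # Usage: initialize the layer before computing
    drop = MyDropout(p=0.5)
    dy.renew_cg()
    drop.init(test=False)  # training mode: dropout is active
    y = drop(dy.inputVector([1.0, 2.0, 3.0]))

Layers stored as attributes of another layer are what the sublayers property picks up, which is how init can reach nested layers while init_layer only touches the layer itself.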

class dynn.layers.base_layers.ParametrizedLayer(pc, name)

Bases: dynn.layers.base_layers.BaseLayer

This is the base class for layers with trainable parameters

When implementing a ParametrizedLayer, use self.add_parameters / self.add_lookup_parameters to add parameters to the layer.
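
A rough sketch of this pattern (the MyAffine layer, its name and its dimensions are made up for illustration): register parameters in __init__, then use the resulting self.[name] attributes in __call__ once the layer has been initialized.

    import dynet as dy
    from dynn.layers.base_layers import ParametrizedLayer


    class MyAffine(ParametrizedLayer):
        """Hypothetical affine layer y = Wx + b (illustration only)."""

        def __init__(self, pc, input_dim, output_dim):
            # Creates a subcollection named "my-affine" inside pc
            super().__init__(pc, "my-affine")
            # Each call adds an attribute (self.W, self.b) to be used in __call__
            self.add_parameters("W", (output_dim, input_dim))
            self.add_parameters("b", (output_dim,))

        def __call__(self, x):
            # self.W and self.b are set up by init()/init_layer()
            return dy.affine_transform([self.b, self.W, x])


    # Typical per-batch flow
    pc = dy.ParameterCollection()
    layer = MyAffine(pc, input_dim=10, output_dim=5)
    dy.renew_cg()
    layer.init(test=False, update=True)  # training mode, parameters will be updated
    y = layer(dy.inputVector([0.0] * 10))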

__init__(pc, name)

Creates a subcollection for this layer with a custom name

add_lookup_parameters(name, dim, lookup_param=None, init=None, device='', scale=1.0, mean=0.0, std=1.0)

This adds a lookup parameter to this layer’s ParameterCollection.

The layer will have one new attribute: self.[name], which will contain the lookup parameter object (which you should use in __call__).

You can provide an existing lookup parameter with the lookup_param argument, in which case this parameter will be reused.

The other arguments are the same as dynet.ParameterCollection.add_lookup_parameters
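
For example, a hypothetical embedding layer (names and dimensions are illustrative) could register a lookup parameter and use it for lookups in __call__:

    import dynet as dy
    from dynn.layers.base_layers import ParametrizedLayer


    class MyEmbeddings(ParametrizedLayer):
        """Hypothetical embedding layer (illustration only)."""

        def __init__(self, pc, vocab_size, embed_dim):
            super().__init__(pc, "my-embeddings")
            # Creates self.E holding the lookup parameter object
            self.add_lookup_parameters("E", (vocab_size, embed_dim))

        def __call__(self, word_idx):
            # Retrieve the embedding for one word index
            return dy.lookup(self.E, word_idx)


    pc = dy.ParameterCollection()
    embed = MyEmbeddings(pc, vocab_size=100, embed_dim=16)
    dy.renew_cg()
    embed.init(test=True, update=False)
    vec = embed(42)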

add_parameters(name, dim, param=None, init=None, device='', scale=1.0, mean=0.0, std=1.0)

This adds a parameter to this layer’s ParameterCollection.

The layer will have one new attribute: self.[name], which will contain the expression for this parameter (which you should use in __call__).

You can provide an existing parameter with the param argument, in which case this parameter will be reused.

The other arguments are the same as dynet.ParameterCollection.add_parameters
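
Parameter reuse (e.g. weight tying) works by passing an existing parameter through the param argument. The sketch below assumes dim is still supplied and matches the existing parameter’s shape; the TiedAffine layer is made up for illustration.

    import dynet as dy
    from dynn.layers.base_layers import ParametrizedLayer


    class TiedAffine(ParametrizedLayer):
        """Hypothetical layer that can reuse an externally created weight matrix."""

        def __init__(self, pc, input_dim, output_dim, W=None):
            super().__init__(pc, "tied-affine")
            # If W is given, that parameter is reused instead of allocating
            # a new one; self.W still ends up holding its expression
            self.add_parameters("W", (output_dim, input_dim), param=W)
            self.add_parameters("b", (output_dim,))

        def __call__(self, x):
            return dy.affine_transform([self.b, self.W, x])


    pc = dy.ParameterCollection()
    shared_W = pc.add_parameters((5, 10))
    layer_a = TiedAffine(pc, 10, 5, W=shared_W)
    layer_b = TiedAffine(pc, 10, 5, W=shared_W)  # both layers share the same W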

init_layer(test=True, update=False)

Initializes only this layer’s parameters (not recursive). This needs to be implemented for each layer.

lookup_parameters

Return all lookup parameters specific to this layer

parameters

Return all parameters specific to this layer
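
These properties are convenient for inspection or regularization. A small sketch, assuming they return lists of dynet parameter objects (the Tiny layer is made up for illustration):

    import dynet as dy
    from dynn.layers.base_layers import ParametrizedLayer


    class Tiny(ParametrizedLayer):
        """Hypothetical layer with one parameter of each kind (illustration only)."""

        def __init__(self, pc):
            super().__init__(pc, "tiny")
            self.add_parameters("W", (3, 3))
            self.add_lookup_parameters("E", (10, 3))


    pc = dy.ParameterCollection()
    layer = Tiny(pc)
    for p in layer.parameters:          # added with add_parameters
        print(p.shape())
    for lp in layer.lookup_parameters:  # added with add_lookup_parameters
        print(lp.shape())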