Embedding layers¶
For embedding discrete inputs (such as words, characters).
class dynn.layers.embedding_layers.Embeddings(pc, dictionary, embed_dim, init=None, pad_mask=None, E=None)¶

Bases: dynn.layers.base_layers.ParametrizedLayer

Layer for embedding elements of a dictionary.
Example:

```python
# Dictionary
dic = dynn.data.dictionary.Dictionary(symbols=["a", "b"])
# Parameter collection
pc = dy.ParameterCollection()
# Embedding layer of dimension 10
embed = Embeddings(pc, dic, 10)
# Initialize
dy.renew_cg()
embed.init()
# Return a batch of 2 10-dimensional vectors
vectors = embed([dic.index("b"), dic.index("a")])
```
Parameters:
- pc (dynet.ParameterCollection) – Parameter collection to hold the parameters
- dictionary (dynn.data.dictionary.Dictionary) – Mapping from symbols to indices
- embed_dim (int) – Embedding dimension
- init (dynet.PyInitializer, optional) – How to initialize the parameters. By default this will initialize to \(\mathcal{N}(0, \frac{1}{\sqrt{\texttt{embed_dim}}})\)
- pad_mask (float, optional) – If provided, embeddings of the dictionary.pad_idx index will be masked with this value
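To make the pad_mask behavior concrete, here is a minimal pure-numpy sketch of the masking semantics described above. The table `E`, the `pad_idx` value, and the `embed_with_mask` helper are illustrative assumptions, not part of dynn; the real layer operates on dynet expressions.

```python
import numpy as np

embed_dim, pad_idx, pad_mask = 4, 0, 0.0
# Illustrative embedding table: 3 symbols, all-ones rows
E = np.ones((3, embed_dim))

def embed_with_mask(idxs):
    # Look up embeddings, then overwrite rows at the padding index
    idxs = np.asarray(idxs)
    out = E[idxs].copy()
    out[idxs == pad_idx] = pad_mask
    return out

# Row 0 (the padding index) comes back masked with 0.0;
# the other rows keep their looked-up values.
print(embed_with_mask([0, 1, 2]))
```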
__call__(idxs, length_dim=0)¶

Returns the input's embedding.

If idxs is a list this returns a batch of embeddings. If it's a numpy array of shape N x b it returns a batch of b N x embed_dim matrices.

Parameters: idxs (list, int) – Index or list of indices to embed
Returns: Batch of embeddings
Return type: dynet.Expression
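To make the shape behavior concrete, here is a minimal pure-numpy sketch of the lookup semantics described above. The table `E`, the vocabulary size, and the `embed` helper are illustrative assumptions, not dynn's implementation; the real `__call__` returns a dynet.Expression.

```python
import numpy as np

vocab_size, embed_dim = 5, 10
rng = np.random.default_rng(0)
# Illustrative embedding table, N(0, 1/sqrt(embed_dim)) initialized
E = rng.normal(0.0, 1.0 / np.sqrt(embed_dim), size=(vocab_size, embed_dim))

def embed(idxs):
    # Fancy indexing mirrors the shape behavior of Embeddings.__call__
    return E[np.asarray(idxs)]

# A list of b indices -> a batch of b embed_dim-dimensional vectors
print(embed([1, 0]).shape)  # (2, 10)

# An N x b array -> b sequences of N x embed_dim matrices
print(embed(np.array([[1, 0], [2, 3], [4, 1]])).shape)  # (3, 2, 10)
```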
__init__(pc, dictionary, embed_dim, init=None, pad_mask=None, E=None)¶

Creates a subcollection for this layer with a custom name.
weights¶

Numpy array containing the embeddings.

The first dimension is the lookup dimension.