Keras Embedding — where is the "weights" argument?
Question

I saw code like the following:
embed_word = Embedding(params['word_voc_size'], params['embed_dim'],
                       weights=[word_embed_matrix],
                       input_length=params['word_max_size'],
                       trainable=False, mask_zero=True)
When I look up the layer in the Keras documentation [https://faroit.github.io/keras-docs/2.1.5/layers/embeddings/][1], I don't see a weights argument:
keras.layers.Embedding(input_dim, output_dim, embeddings_initializer='uniform', embeddings_regularizer=None, activity_regularizer=None, embeddings_constraint=None, mask_zero=False, input_length=None)
So I am confused: why can we use the weights argument when it is not defined in the Keras documentation?
My Keras version is 2.1.5. I hope someone can help me.
Answer
Keras' Embedding layer subclasses the Layer class (every Keras layer does this). The weights attribute is implemented in this base class, so every subclass lets you set this attribute through a weights argument. This is also why you won't find it in the documentation or in the implementation of the Embedding layer itself.
You can check the base Layer implementation here (Ctrl+F for 'weight').
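To illustrate the mechanism the answer describes, here is a minimal, self-contained sketch of the kwargs-forwarding pattern. This is not the real Keras source; the class and attribute names below are simplified stand-ins, but the idea is the same: the base class pops `weights` out of `**kwargs`, so a subclass like `Embedding` never has to declare it in its own signature.

```python
import numpy as np

class Layer:
    """Simplified stand-in for the Keras base Layer class."""
    def __init__(self, **kwargs):
        # The base class consumes `weights` from kwargs on behalf
        # of every subclass, which is why it does not appear in
        # the subclass's documented signature.
        self._initial_weights = kwargs.pop("weights", None)

    def build(self):
        # If initial weights were supplied, install them.
        if self._initial_weights is not None:
            self.set_weights(self._initial_weights)

    def set_weights(self, weights):
        self.embeddings = weights[0]

class Embedding(Layer):
    # Note: no `weights` parameter here; it is forwarded through
    # **kwargs and handled entirely by Layer.__init__.
    def __init__(self, input_dim, output_dim, **kwargs):
        super().__init__(**kwargs)
        self.input_dim = input_dim
        self.output_dim = output_dim

# Passing `weights` works even though Embedding never declares it.
word_embed_matrix = np.random.rand(100, 8)
embed = Embedding(100, 8, weights=[word_embed_matrix])
embed.build()
print(np.array_equal(embed.embeddings, word_embed_matrix))  # True
```

In the real library the same trick means the argument is documented (if at all) on the base class, not on each subclass.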