Multiple embedding layers in keras


Problem description


With pretrained embeddings, we can specify them as weights in Keras' Embedding layer. To use multiple embeddings, would specifying multiple embedding layers be suitable? i.e.

embedding_layer1 = Embedding(len(word_index) + 1,
                             EMBEDDING_DIM,
                             weights=[embedding_matrix_1],
                             input_length=MAX_SEQUENCE_LENGTH,
                             trainable=False)

embedding_layer2 = Embedding(len(word_index) + 1,
                             EMBEDDING_DIM,
                             weights=[embedding_matrix_2],
                             input_length=MAX_SEQUENCE_LENGTH,
                             trainable=False)

model.add(embedding_layer1)
model.add(embedding_layer2)


This suggests summing them up and representing them as a single layer, which is not what I am after.

Recommended answer


Here is an example of using multiple embedding layers through multiple inputs by leveraging Keras' functional API. It comes from a Kaggle competition, so you'll have to read through the code. They feed the network a dictionary with a key for each data input. It's quite clever, and I was able to build a separate, well-performing model using this framework.

deep-learning-support-9663
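The approach described above can be sketched as follows. This is a minimal illustration, not the competition code: the sizes, the matrices, and the input names (`words_1`, `words_2`) are stand-ins, and `Constant` initialization is used as an equivalent of `weights=[matrix]` for frozen pretrained embeddings.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Input, Embedding, Concatenate, Flatten, Dense
from tensorflow.keras.models import Model

# Stand-in sizes; in the question these come from len(word_index) + 1 etc.
vocab_size = 100
EMBEDDING_DIM = 8
MAX_SEQUENCE_LENGTH = 10

# Stand-ins for the two pretrained matrices (embedding_matrix_1 / _2)
embedding_matrix_1 = np.random.rand(vocab_size, EMBEDDING_DIM)
embedding_matrix_2 = np.random.rand(vocab_size, EMBEDDING_DIM)

# One named Input per embedding, so the model can be fed a dict of arrays
input_1 = Input(shape=(MAX_SEQUENCE_LENGTH,), name="words_1")
input_2 = Input(shape=(MAX_SEQUENCE_LENGTH,), name="words_2")

# Frozen embeddings initialized from the pretrained matrices
emb_1 = Embedding(vocab_size, EMBEDDING_DIM,
                  embeddings_initializer=tf.keras.initializers.Constant(embedding_matrix_1),
                  trainable=False)(input_1)
emb_2 = Embedding(vocab_size, EMBEDDING_DIM,
                  embeddings_initializer=tf.keras.initializers.Constant(embedding_matrix_2),
                  trainable=False)(input_2)

# Concatenate along the feature axis instead of stacking layers sequentially
merged = Concatenate()([emb_1, emb_2])
output = Dense(1, activation="sigmoid")(Flatten()(merged))

model = Model(inputs=[input_1, input_2], outputs=output)

# Feed the network a dictionary with a key for each data input
batch = {"words_1": np.zeros((2, MAX_SEQUENCE_LENGTH), dtype="int32"),
         "words_2": np.zeros((2, MAX_SEQUENCE_LENGTH), dtype="int32")}
preds = model.predict(batch, verbose=0)
```

Because each embedding gets its own input and branch, the two lookup tables stay independent and their outputs are merged explicitly, rather than one embedding layer receiving another's output as in the sequential attempt from the question.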

