What does tf.nn.embedding_lookup function do?


Question

tf.nn.embedding_lookup(params, ids, partition_strategy='mod', name=None)

I cannot understand what this function does. Is it like a lookup table, i.e. does it return the parameters corresponding to each id (in ids)?

For instance, in the skip-gram model, if we use tf.nn.embedding_lookup(embeddings, train_inputs), does it find the corresponding embedding for each train_input?

Answer

The embedding_lookup function retrieves rows of the params tensor. The behavior is similar to using indexing with arrays in numpy, e.g.

import numpy as np

matrix = np.random.random([1024, 64])  # 1024 rows of 64-dimensional embeddings
ids = np.array([0, 5, 17, 33])
print(matrix[ids])  # prints a matrix of shape [4, 64]
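
The same lookup can be expressed directly in TensorFlow. A minimal sketch, assuming the TensorFlow 1.x API that matches the signature quoted in the question:

import numpy as np
import tensorflow as tf

embeddings = tf.constant(np.random.random([1024, 64]))  # the params tensor
ids = tf.constant([0, 5, 17, 33])
lookup = tf.nn.embedding_lookup(embeddings, ids)  # gathers rows 0, 5, 17, 33

with tf.Session() as sess:
    print(sess.run(lookup).shape)  # (4, 64)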

The params argument can also be a list of tensors, in which case the ids will be distributed among the tensors. For example, given a list of 3 tensors of shape [2, 64], the default behavior is that they will represent the ids [0, 3], [1, 4], [2, 5].
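
A short sketch of that partitioned case (again assuming TensorFlow 1.x; the three [2, 64] tensors stand in for the list described above):

import tensorflow as tf

# three partitions, each holding 2 of the 6 embedding rows
parts = [tf.random_normal([2, 64]) for _ in range(3)]

# with the default 'mod' strategy, id i is stored in partition i % 3:
# partition 0 -> ids 0, 3; partition 1 -> ids 1, 4; partition 2 -> ids 2, 5
lookup = tf.nn.embedding_lookup(parts, tf.constant([0, 1, 2, 3, 4, 5]))

with tf.Session() as sess:
    print(sess.run(lookup).shape)  # (6, 64)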

partition_strategy controls how the ids are distributed among the list. Partitioning is useful for larger-scale problems, when the matrix might be too large to keep in one piece.
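
The other value that the TF 1.x signature accepts is 'div', which assigns contiguous blocks of ids to each partition instead. A minimal sketch of the contrast:

import tensorflow as tf

parts = [tf.random_normal([2, 64]) for _ in range(3)]
ids = tf.constant([0, 1, 2, 3, 4, 5])

# 'div' splits the id range into contiguous blocks:
# partition 0 -> ids 0, 1; partition 1 -> ids 2, 3; partition 2 -> ids 4, 5
lookup = tf.nn.embedding_lookup(parts, ids, partition_strategy='div')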
