Regularization for LSTM in tensorflow


Problem Description


Tensorflow offers a nice LSTM wrapper.

rnn_cell.BasicLSTMCell(num_units, forget_bias=1.0, input_size=None,
                       state_is_tuple=False, activation=tanh)


I would like to use regularization, say L2 regularization. However, I don't have direct access to the different weight matrices used in the LSTM cell, so I cannot explicitly do something like

loss = something + beta * tf.reduce_sum(tf.nn.l2_loss(weights))


Is there a way to access the matrices or use regularization somehow with LSTM?

Answer


tf.trainable_variables gives you a list of Variable objects that you can use to add the L2 regularization term. Note that this adds regularization for all variables in your model. If you want to restrict the L2 term to only a subset of the weights, you can use name_scope to give your variables specific name prefixes, and later use those prefixes to filter the list returned by tf.trainable_variables.
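The filtering approach described above might look like the following sketch. This is not from the original answer: the scope name `lstm`, the stand-in variables, and the coefficient `beta` are illustrative assumptions, and the snippet is written against the `tf.compat.v1` graph API so it also runs under TensorFlow 2.x (the question's era used these names directly under `tf.`).

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()
tf.compat.v1.reset_default_graph()

# Stand-in variables: a real LSTM cell built inside this scope would create
# its kernel and bias the same way, with names prefixed "lstm/".
with tf.compat.v1.variable_scope("lstm"):
    kernel = tf.compat.v1.get_variable("kernel", [32 + 64, 4 * 64])
    bias = tf.compat.v1.get_variable(
        "bias", [4 * 64], initializer=tf.compat.v1.zeros_initializer())
with tf.compat.v1.variable_scope("output"):
    proj = tf.compat.v1.get_variable("proj", [64, 10])

beta = 1e-4  # arbitrary regularization strength (an assumption)

# Filter the global trainable-variable list by the scope prefix; here the
# biases are also skipped, since L2 is usually applied to weights only.
lstm_weights = [v for v in tf.compat.v1.trainable_variables()
                if v.name.startswith("lstm/") and "bias" not in v.name]

# Scalar L2 penalty to add to the task loss:
l2_term = beta * tf.add_n([tf.nn.l2_loss(w) for w in lstm_weights])
# total_loss = task_loss + l2_term
```

The `output/proj` variable and the bias are left out of `lstm_weights`, so only the kernel under the `lstm/` scope contributes to the penalty.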
