How to add regularizations in TensorFlow?
Question
In many neural network codebases implemented with TensorFlow, I have found that regularization terms are often implemented by manually adding an additional term to the loss value.
My questions are:
- Is there a more elegant or recommended way of regularization than doing it manually?
- I also find that get_variable has an argument regularizer. How should it be used? According to my observation, if we pass a regularizer to it (such as tf.contrib.layers.l2_regularizer), a tensor representing the regularization term will be computed and added to a graph collection named tf.GraphKeys.REGULARIZATION_LOSSES. Will that collection be automatically used by TensorFlow (e.g. by optimizers when training), or am I expected to use that collection myself?
As you say in the second point, using the regularizer argument is the recommended way. You can use it in get_variable, or set it once in your variable_scope and have all your variables regularized.
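As a minimal sketch of the scope-level approach (assuming TF 1.x-style graph APIs, reached through tf.compat.v1 on TF 2.x; tf.keras.regularizers.l2 stands in here for tf.contrib.layers.l2_regularizer, which no longer ships with TF 2.x, and the scope name "hidden" is illustrative):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # collections are a graph-mode concept

# Any callable mapping a tensor to a scalar penalty works as a regularizer.
regularizer = tf.keras.regularizers.l2(0.01)

with tf.variable_scope("hidden", regularizer=regularizer):
    w = tf.get_variable("w", shape=[3, 2])  # inherits the scope's regularizer
    b = tf.get_variable("b", shape=[2])     # this one too

# One penalty tensor per variable was added to the collection:
reg_losses = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)
print(len(reg_losses))
```

Every get_variable call inside the scope that does not pass its own regularizer picks up the scope's, so you set it once instead of repeating it per variable.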
The losses are collected in the graph, and you need to manually add them to your cost function like this:
reg_losses = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)
reg_constant = 0.01 # Choose an appropriate one.
loss = my_normal_loss + reg_constant * sum(reg_losses)
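Numerically, that last line is just a weighted sum. A framework-free sketch in plain Python (the weights, losses, and constant below are made-up illustrative numbers, not values from the answer):

```python
def l2_penalty(weights):
    # 0.5 * sum of squares, matching what tf.nn.l2_loss computes for one tensor
    return 0.5 * sum(w * w for w in weights)

my_normal_loss = 1.25                                      # illustrative data loss
reg_losses = [l2_penalty([0.3, -0.4]), l2_penalty([1.0])]  # one entry per regularized variable
reg_constant = 0.01

loss = my_normal_loss + reg_constant * sum(reg_losses)
print(round(loss, 5))  # 1.25625
```

The 0.5 factor mirrors tf.nn.l2_loss, which the stock L2 regularizer builds on, so reg_constant effectively scales half the sum of squared weights.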
Hope that helps!