How to add regularizations in TensorFlow?
Question
I found that in many available neural network implementations using TensorFlow, regularization terms are often implemented by manually adding an extra term to the loss value.
My questions are:

Is there a more elegant or recommended way of regularization than doing it manually?
I also found that get_variable has an argument regularizer. How should it be used? According to my observation, if we pass a regularizer to it (such as tf.contrib.layers.l2_regularizer), a tensor representing the regularization term will be computed and added to a graph collection named tf.GraphKeys.REGULARIZATION_LOSSES. Will that collection be used automatically by TensorFlow (e.g. by optimizers when training)? Or am I expected to use that collection myself?
Answer
As you say in your second point, using the regularizer argument is the recommended way. You can pass it to get_variable directly, or set it once on your variable_scope to have all variables created in that scope regularized.
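To make the mechanism concrete, here is a plain-Python sketch (not real TensorFlow code; the names REGULARIZATION_LOSSES, l2_regularizer, and get_variable below are simplified stand-ins) of what happens when a regularizer is passed: the variable-creation call evaluates the regularizer on the new variable and appends the result to a shared collection as a side effect.

```python
# Stand-in for the graph collection tf.GraphKeys.REGULARIZATION_LOSSES.
REGULARIZATION_LOSSES = []

def l2_regularizer(scale):
    # Mirrors the formula tf.contrib.layers.l2_regularizer uses:
    # scale * sum(w**2) / 2 (i.e. scale * l2_loss(w)).
    def reg(weights):
        return scale * sum(w * w for w in weights) / 2.0
    return reg

def get_variable(name, initial, regularizer=None):
    # Simplified stand-in for tf.get_variable: when a regularizer is
    # given, its value is recorded in the collection automatically.
    if regularizer is not None:
        REGULARIZATION_LOSSES.append(regularizer(initial))
    return initial

w1 = get_variable("w1", [1.0, 2.0], regularizer=l2_regularizer(0.1))
w2 = get_variable("w2", [3.0], regularizer=l2_regularizer(0.1))
print(REGULARIZATION_LOSSES)  # one penalty term per regularized variable
```

Nothing in this sketch (or in TensorFlow) consumes the collection on its own; that is why the losses must be added to the cost function explicitly, as shown below.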
The losses are collected in the graph, but they are not applied automatically; you need to add them to your cost function manually, like this:
reg_losses = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)
reg_constant = 0.01 # Choose an appropriate one.
loss = my_normal_loss + reg_constant * sum(reg_losses)
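As a quick arithmetic check of the snippet above, here it is with plain Python floats (my_normal_loss and the two collection entries are made-up numbers for illustration):

```python
# Hypothetical values standing in for the tensors in the snippet above.
my_normal_loss = 1.5
reg_losses = [0.25, 0.45]  # as returned by tf.get_collection(...)
reg_constant = 0.01        # choose an appropriate one

loss = my_normal_loss + reg_constant * sum(reg_losses)
print(loss)  # 1.5 + 0.01 * 0.7 = 1.507
```

Note that sum() here is Python's built-in, which combines the tensors with +; tf.add_n(reg_losses) is the more idiomatic TensorFlow way to add a list of tensors.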
Hope that helps!