What is regularization loss in tensorflow?


Question

When training an object detection DNN with TensorFlow's Object Detection API, its visualization platform TensorBoard plots a scalar named regularization_loss_1.

What is this? I know what regularization is (making the network better at generalizing, through various methods like dropout), but it is not clear to me what this displayed loss could be.

Thanks!

Answer

TL;DR: it's just the additional loss generated by the regularization function. Add that to the network's loss and optimize over the sum of the two.

As you correctly state, regularization methods are used to help an optimization method generalize better. One way to achieve this is to add a regularization term to the loss function. This term is a generic function that modifies the "global" loss (that is, the sum of the network loss and the regularization loss) in order to drive the optimization algorithm in the desired direction.
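To make that concrete, here is a minimal sketch (assuming TF 2.x / tf.keras rather than the Object Detection API itself; the model, the dummy data and the 1e-4 scale are made up for illustration). A kernel regularizer attached to a layer produces an extra loss tensor, which Keras collects in `model.losses`; that regularization loss is added to the network's own loss and the optimizer minimizes the sum:

```python
import tensorflow as tf

# Illustrative model: the L2 kernel regularizer on the first layer
# generates an additional loss term.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu",
                          kernel_regularizer=tf.keras.regularizers.l2(1e-4)),
    tf.keras.layers.Dense(10),
])
optimizer = tf.keras.optimizers.Adam()
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

x = tf.random.normal([32, 128])                         # dummy batch
y = tf.random.uniform([32], maxval=10, dtype=tf.int32)  # dummy labels

with tf.GradientTape() as tape:
    logits = model(x, training=True)
    network_loss = loss_fn(y, logits)               # loss from the task itself
    regularization_loss = tf.add_n(model.losses)    # extra loss from the regularizer
    total_loss = network_loss + regularization_loss # optimize over the sum of the two

grads = tape.gradient(total_loss, model.trainable_variables)
optimizer.apply_gradients(zip(grads, model.trainable_variables))
```

The scalar TensorBoard shows corresponds to the `regularization_loss` part of such a sum.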

Let's say, for example, that for whatever reason I want to encourage solutions to the optimization that have weights as close to zero as possible. One approach, then, is to add to the loss produced by the network a function of the network weights (for example, a scaled-down sum of all the absolute values of the weights). Since the optimization algorithm minimizes the global loss, my regularization term (which is high when the weights are far from zero) will push the optimization towards solutions that have weights close to zero.
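As a rough sketch of that example (the helper name, the `weights` list and the scale factor are assumptions for illustration, not part of the Object Detection API), the penalty is just a scaled-down sum of the absolute values of the weights added to the network's loss:

```python
import tensorflow as tf

def total_loss(network_loss, weights, reg_scale=1e-4):
    # Regularization term: high when the weights are far from zero,
    # so minimizing the sum pushes the weights towards zero.
    regularization_loss = reg_scale * tf.add_n(
        [tf.reduce_sum(tf.abs(w)) for w in weights])
    return network_loss + regularization_loss
```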

