caffe what is SetLossWeights?
Question
I am reading caffe's Layer source code, but I have the following questions:
- What does the Layer::SetLossWeights function do? I know that inside the Layer class there is a loss_ variable, documented as: "The vector that indicates whether each top blob has a non-zero weight in the objective function." Is there some relationship between them?
- In the caffe.proto file, LayerParameter's loss_weight is only for loss layers, is that correct?
Thanks very much.
- The purpose of the loss weight is to combine the losses of multiple layers. So Layer::SetLossWeights assigns the loss weight to the loss_ variable and to the top blob's diff, which is used in Forward to compute the total loss.
- By default, layer types with the suffix Loss have a loss weight of 1 and all other layers have a loss weight of 0. However, any layer that is able to backpropagate can be given a non-zero loss_weight.
For more information, see the caffe loss tutorial.
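To illustrate the second point, here is a hedged prototxt sketch (the layer and blob names are made up for illustration): an auxiliary loss layer with a non-default loss_weight, so the total objective becomes the main loss plus 0.3 times this layer's loss:

```
layer {
  name: "aux_loss"      # hypothetical auxiliary loss branch
  type: "SoftmaxWithLoss"
  bottom: "fc_aux"      # made-up blob names
  bottom: "label"
  top: "aux_loss"
  loss_weight: 0.3      # scales this layer's contribution to the total loss
}
```

Omitting loss_weight here would leave the default of 1 for a Loss-type layer.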
Edit:
The loss weight will only be changed if the loss layer's top is fed as input to another layer that does backprop, which is not intended by the authors. As they said for the Accuracy layer in this pull request, it will break. The purpose of the diff in a loss layer is to store the loss weight, not a gradient. For more detail, see this discussion in the caffe-users group.