caffe: what is SetLossWeights?


Question

I am reading caffe's Layer source code, but I have the following questions:

  1. What is the Layer::SetLossWeights function doing? I know that inside the Layer class there is a loss_ variable, whose documentation says:

> The vector that indicates whether each top blob has a non-zero weight in the objective function.

Is there some relationship between them?

  2. Inside the caffe.proto file, the LayerParameter field loss_weight is only for loss layers, is that correct?

Thanks very much.

Solution

  1. The purpose of a loss weight is to combine the losses from multiple layers. Layer::SetLossWeights assigns the loss weight both to the loss_ variable and to the top blob's diff, which the forward pass then uses to compute the total loss; see the sketch after this list.
  2. By default, layers whose type ends in the suffix Loss have a loss weight of 1 and all other layers have a loss weight of 0. However, any layer that is able to backpropagate can be given a non-zero loss_weight (see the prototxt fragment further below).
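To make the mechanics concrete, here is a minimal, self-contained C++ sketch. The Blob struct, function names, and signatures are stand-ins invented for illustration; they only mimic what Layer::SetLossWeights and the loss accumulation in Layer::Forward do in Caffe's include/caffe/layer.hpp, and are not the actual Caffe API.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Stand-in for Caffe's Blob<Dtype>: "data" holds a layer's output and
// "diff" normally holds gradients -- except on a top blob with a loss
// weight, where SetLossWeights stores the weight itself.
struct Blob {
  std::vector<float> data;
  std::vector<float> diff;
  explicit Blob(std::size_t n) : data(n, 0.f), diff(n, 0.f) {}
};

// Mimics Layer::SetLossWeights: record each non-zero loss_weight from the
// layer parameter in the loss_ vector, and fill the matching top blob's
// diff with that same weight.
void SetLossWeights(const std::vector<float>& loss_weight_param,
                    std::vector<Blob*>& top, std::vector<float>& loss_) {
  loss_.assign(top.size(), 0.f);
  for (std::size_t id = 0; id < top.size(); ++id) {
    const float w = loss_weight_param[id];
    if (w == 0.f) continue;
    loss_[id] = w;                                             // loss_ variable
    std::fill(top[id]->diff.begin(), top[id]->diff.end(), w);  // diff blob
  }
}

// Mimics the loss accumulation at the end of Layer::Forward: each weighted
// top blob contributes dot(data, diff) = loss_weight * raw_loss.
float AccumulateLoss(const std::vector<Blob*>& top,
                     const std::vector<float>& loss_) {
  float total = 0.f;
  for (std::size_t id = 0; id < top.size(); ++id) {
    if (loss_[id] == 0.f) continue;
    for (std::size_t i = 0; i < top[id]->data.size(); ++i)
      total += top[id]->data[i] * top[id]->diff[i];
  }
  return total;
}

int main() {
  Blob loss_top(1);                    // a loss layer's scalar top blob
  std::vector<Blob*> top = {&loss_top};
  std::vector<float> loss_;
  SetLossWeights({0.5f}, top, loss_);  // as if loss_weight: 0.5 in prototxt
  loss_top.data[0] = 2.0f;             // pretend Forward computed raw loss 2.0
  return AccumulateLoss(top, loss_) == 1.0f ? 0 : 1;  // 0.5 * 2.0 == 1.0
}
```

Storing the weight in the diff lets Forward compute the weighted contribution as a simple dot product between data and diff, without allocating a separate multiplier blob.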

For more detail, see the caffe loss tutorial.
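As an illustration of point 2, a prototxt fragment like the one below overrides the default weight on a loss layer and turns an ordinary layer into an extra loss term. The layer and blob names here are made up for this example:

```prototxt
layer {
  name: "aux_loss"
  type: "SoftmaxWithLoss"
  bottom: "aux_fc"
  bottom: "label"
  top: "aux_loss"
  loss_weight: 0.3   # overrides the default of 1 for a *Loss layer
}
layer {
  name: "feat_penalty"
  type: "Reduction"
  bottom: "feat"
  top: "feat_penalty"
  loss_weight: 0.01  # non-loss layer contributing to the objective;
                     # its top should not also feed a backprop layer
}
```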

Edit:

The loss weight will only change if the loss layer's top is fed as input to another layer that does backprop, which is not intended by the authors. As they said about the Accuracy layer in this pull request, doing so will break things. The purpose of the diff in a loss layer is to store the loss weight, not to store a gradient. For more detail, see this discussion in the caffe-users group.
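To see why the diff holds a weight rather than a gradient, here is another invented sketch, reusing the Blob stand-in from the block above. It mimics what a loss layer's backward pass (for example, SoftmaxWithLossLayer::Backward_cpu) does with its top diff: read the stored weight back and use it to scale the gradient written into the bottom blob.

```cpp
// Mimics a loss layer's Backward: top.diff[0] is the loss weight written
// by SetLossWeights, not an upstream gradient. local_grad stands for the
// layer's own derivative of the raw loss w.r.t. its bottom blob.
void LossBackward(const Blob& top, const std::vector<float>& local_grad,
                  Blob& bottom) {
  const float loss_weight = top.diff[0];  // read back the stored weight
  for (std::size_t i = 0; i < bottom.diff.size(); ++i)
    bottom.diff[i] = loss_weight * local_grad[i];  // scaled gradient
}
```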
