MatConvNet output of deep network's matrix is uniform valued instead of varying values?

Problem description

I'm trying to obtain a density map from a network output of dimension 20x20x1x50, where 20x20 is the output map and 50 is the batch size.

The issue is that the output X takes the same value, 0.098, across each 20x20 output matrix. Instead of a Gaussian-shaped density map I get a flat, near-constant 20x20x1x50 output. The issue is shown in the attached figure. What am I missing here? The Euclidean (L2) loss used for backpropagation is:

  case {'l2loss'}
    res = (c - X);                     % residual between target c and prediction X

    n = 1;                             % normalisation constant used in the backward pass
    if isempty(dzdy)                   % forward: mean squared error over all elements
        Y = sum(res(:).^2) / numel(res);
    else                               % backward: derivative w.r.t. X, scaled by dzdy
        Y_ = -1 .* (c - X);
        Y  = 2 * single(Y_ * (dzdy / n));
    end
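
(Not part of the original post: one quick way to sanity-check a custom loss branch like this is a finite-difference comparison between the backward output and a numerical derivative of the forward value. The sketch below mirrors the two expressions in plain double precision; X0, c0, h, fwd and bwd are illustrative names. Note that the forward branch divides by numel(res) while the backward divides by n, so the two estimates only agree when n matches that normalisation.)

  % Finite-difference check of the l2loss branch (illustrative sketch).
  X0  = randn(4,4,1,2);                          % small stand-in for the network output X
  c0  = randn(4,4,1,2);                          % stand-in for the target density map c
  n   = 1;
  fwd = @(X) sum((c0(:) - X(:)).^2) / numel(X);           % mirrors the forward branch
  bwd = @(X, dzdy) 2 .* (-1 .* (c0 - X)) .* (dzdy / n);   % mirrors the backward branch
  dX  = bwd(X0, 1);                              % analytic derivative w.r.t. X
  h   = 1e-6;  i = 1;                            % perturb a single element
  Xp  = X0;  Xp(i) = Xp(i) + h;
  fprintf('analytic %.6g  vs  numerical %.6g\n', dX(i), (fwd(Xp) - fwd(X0)) / h);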

Solution

Found the solution at https://github.com/vlfeat/matconvnet/issues/313: query conv.var(i).value to see where the values end up, then edit that layer in the conv net. In my case I had to change the biases of the conv layers:

  net2.params(8).value = 0.01 * init_bias * ones(1, 128, 'single');  % 'biases'
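
(A sketch of the inspection described above, not from the original answer: it assumes a DagNN model object net2, an input variable named 'input' and a batch images; adjust these names to your network. Keeping intermediate values lets you see at which layer the activations collapse to a constant, which is the layer whose initialisation is worth changing.)

  net2.conserveMemory = false;          % keep every intermediate variable value
  net2.eval({'input', images});         % forward pass only
  for i = 1:numel(net2.vars)
    v = net2.vars(i).value;             % use gather(v) first if running on the GPU
    fprintf('%-20s  min %.4g  mean %.4g  max %.4g\n', ...
            net2.vars(i).name, min(v(:)), mean(v(:)), max(v(:)));
  end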
