How to implement Weighted Binary CrossEntropy on theano?


Problem Description

My convolutional neural network can only predict values between 0 and 1 (sigmoid output).

I want to penalize my predictions in this way:

Basically, I want to penalize MORE when the model predicts 0 but the truth was 1.

Question: How can I create this weighted binary cross-entropy function using Theano and Lasagne?

Here is what I tried:

import theano
import theano.tensor as T
import lasagne

prediction = lasagne.layers.get_output(model)

def weighted_crossentropy(predictions, targets):

    # Copy the tensor
    tgt = targets.copy("tgt")

    # Make it a vector
    # tgt = tgt.flatten()
    # tgt = tgt.reshape(3000)
    # tgt = tgt.dimshuffle(1, 0)

    newshape = (T.shape(tgt)[0])
    tgt = T.reshape(tgt, newshape)

    # Process it so [index] < 0.5 = 0, and [index] >= 0.5 = 1

    # Make it an integer.
    tgt = T.cast(tgt, 'int32')

    # Per-label weights: label 0 -> 0.2, label 1 -> 0.4
    weights_per_label = theano.shared(lasagne.utils.floatX([0.2, 0.4]))

    weights = weights_per_label[tgt]  # returns a targets-shaped weight matrix
    loss = lasagne.objectives.aggregate(
        T.nnet.binary_crossentropy(predictions, tgt), weights=weights)

    return loss

loss_or_grads = weighted_crossentropy(prediction, self.target_var)

But I get this error:

TypeError: New shape in reshape must be a vector or a list/tuple of scalar. Got Subtensor{int64}.0 after conversion to vector.
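The likely culprit is the line newshape = (T.shape(tgt)[0]): without a trailing comma the parentheses do nothing, so T.reshape receives a single scalar (the Subtensor{int64}.0 in the message) rather than a vector or a list/tuple of scalars. A minimal sketch of that particular fix, assuming the targets are meant to be flattened to one dimension (this only addresses the reshape error, not the weighting itself):

    # Wrap the symbolic length in a tuple so T.reshape gets a list/tuple of scalars
    newshape = (T.shape(tgt)[0],)  # note the trailing comma
    tgt = T.reshape(tgt, newshape)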

Reference: https://github.com/fchollet/keras/issues/2115

Reference: https://groups.google.com/forum/#!topic/theano-users/R_Q4uG9BXp8

Answer

Thanks to the developers on the Lasagne group, I fixed this by constructing my own loss function.

# 'tensor' is theano.tensor; customized_rate > 1 makes the target_var == 1 term
# dominate, so predicting ~0 when the truth is 1 is penalized more heavily.
loss_or_grads = -(customized_rate * target_var * tensor.log(prediction)
                  + (1.0 - target_var) * tensor.log(1.0 - prediction))

loss_or_grads = loss_or_grads.mean()
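For context, a self-contained sketch of how a loss of this form could be wrapped as a function and used to train a small Lasagne network; the network architecture, the variable names, and the weight value w_pos = 2.0 are illustrative assumptions, not part of the original answer:

import theano
import theano.tensor as T
import lasagne

def weighted_binary_crossentropy(prediction, target, w_pos=2.0):
    # w_pos > 1 penalizes "target is 1 but prediction is near 0" more heavily.
    # Clip predictions away from 0 and 1 so the logs stay finite.
    eps = 1e-7
    prediction = T.clip(prediction, eps, 1.0 - eps)
    loss = -(w_pos * target * T.log(prediction)
             + (1.0 - target) * T.log(1.0 - prediction))
    return loss.mean()

# Illustrative single-output network on 100-dimensional inputs.
input_var = T.matrix('inputs')
target_var = T.vector('targets')

l_in = lasagne.layers.InputLayer((None, 100), input_var=input_var)
l_hid = lasagne.layers.DenseLayer(l_in, num_units=50,
                                  nonlinearity=lasagne.nonlinearities.rectify)
l_out = lasagne.layers.DenseLayer(l_hid, num_units=1,
                                  nonlinearity=lasagne.nonlinearities.sigmoid)

prediction = lasagne.layers.get_output(l_out).flatten()
loss = weighted_binary_crossentropy(prediction, target_var, w_pos=2.0)

params = lasagne.layers.get_all_params(l_out, trainable=True)
updates = lasagne.updates.nesterov_momentum(loss, params,
                                            learning_rate=0.01, momentum=0.9)
train_fn = theano.function([input_var, target_var], loss, updates=updates)

With w_pos > 1, the target_var == 1 term contributes more to the loss, so predicting values near 0 for positive examples is penalized more heavily, which is the behaviour asked for in the question.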

