Weighted cost function in tensorflow


Problem description

I'm trying to introduce weighting into the following cost function:

_cost = tf.reduce_mean(tf.nn.sparse_softmax_cross_entropy_with_logits(logits=_logits, labels=y))

but without having to implement the softmax cross-entropy myself. So I was thinking of splitting the cost calculation into cost1 and cost2 and feeding a modified version of my logits and y values into each one.

I want to do something like this, but I'm not sure what the correct code is:

mask = (y == 0)
y0 = tf.boolean_mask(y, mask) * y1Weight

(This gives the error that the mask cannot be a scalar.)
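The "mask cannot be a scalar" error happens because Python's `==` on a TF1 Tensor does not produce an element-wise boolean tensor; `tf.equal(y, 0)` does. A minimal numpy sketch of the intended element-wise masking (the values and the `y1Weight` name are illustrative, taken from the question):

```python
import numpy as np

y = np.array([0, 1, 0, 1, 1])
y1Weight = 0.25  # weight from the question, value hypothetical

# np.equal mirrors tf.equal: an element-wise comparison,
# unlike Python's `==` on a TF1 Tensor (which is a scalar).
mask = np.equal(y, 0)

# Mirrors tf.boolean_mask(y, mask) * y1Weight from the question.
y0 = y[mask] * y1Weight
```

In TF1 the corresponding mask would be `mask = tf.equal(y, 0)`, which `tf.boolean_mask` accepts.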

Recommended answer

The weight mask can be computed using tf.where. Here is a weighted cost example:

import tensorflow as tf

batch_size = 100
y1Weight = 0.25
y0Weight = 0.75

_logits = tf.Variable(tf.random_normal(shape=(batch_size, 2), stddev=1.))
y = tf.random_uniform(shape=(batch_size,), maxval=2, dtype=tf.int32)

# Per-example cross-entropy, shape (batch_size,)
_cost = tf.nn.sparse_softmax_cross_entropy_with_logits(logits=_logits, labels=y)

# Weight mask: y0Weight where label == 0, y1Weight where label == 1.
# tf.where picks its second argument where the condition is True,
# so label 1 (cast to True) selects y1Weight.
y_w = tf.where(tf.cast(y, tf.bool),
               tf.ones((batch_size,)) * y1Weight,
               tf.ones((batch_size,)) * y0Weight)

# New weighted cost
cost_w = tf.reduce_mean(tf.multiply(_cost, y_w))
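To make the weighting concrete, here is a small numpy sketch of the same computation (per-example sparse softmax cross-entropy, then a per-class weight mask); the logits and labels are made-up values, and the names follow the answer above:

```python
import numpy as np

batch_size = 4
y0Weight, y1Weight = 0.75, 0.25

logits = np.array([[2.0, 0.5], [0.1, 1.0], [1.5, 1.5], [0.0, 3.0]])
y = np.array([0, 1, 0, 1])

# Sparse softmax cross-entropy per example: -log softmax(logits)[label]
shifted = logits - logits.max(axis=1, keepdims=True)  # for numerical stability
log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
cost = -log_probs[np.arange(batch_size), y]

# Weight mask: y0Weight where label == 0, y1Weight where label == 1
# (numpy's np.where mirrors tf.where here)
y_w = np.where(y == 0, y0Weight, y1Weight)

# Weighted mean cost
cost_w = (cost * y_w).mean()
```

For the third example the logits are equal, so its unweighted cost is exactly log 2, which makes the arithmetic easy to check by hand.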

As suggested by @user1761806, the simpler solution would be to use tf.losses.sparse_softmax_cross_entropy(), which allows weighting of the classes.
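tf.losses.sparse_softmax_cross_entropy() takes its weighting as a per-example `weights` tensor, so per-class weights need to be gathered by label first, e.g. with `tf.gather(class_weights, y)`. A numpy sketch of that gather (the weight values are illustrative):

```python
import numpy as np

class_weights = np.array([0.75, 0.25])  # index = class label, values hypothetical
y = np.array([0, 1, 1, 0])

# numpy fancy indexing mirrors tf.gather(class_weights, y):
# one weight per example, chosen by that example's label.
per_example_weights = class_weights[y]
```

The resulting per-example tensor is what would be passed as the `weights` argument of tf.losses.sparse_softmax_cross_entropy(labels=y, logits=..., weights=...).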
