Adding a constant to Loss function in Tensorflow


Question

I have asked a similar question but got no response, so I am trying again.

I am reading a paper which suggests adding some value, calculated outside of Tensorflow, to the loss function of a neural network model in Tensorflow. I show you the quote here (I have blurred the unimportant part):

How do I add a precalculated value to the loss function when fitting a Sequential model in Tensorflow? The loss function used is BinaryCrossentropy; you can see it in equation (4) in the paper quote. The added value is shown in the quote, but I think it is not important for the question.

What my model looks like is also not important; I just want to add a constant value to my loss function in Tensorflow when fitting my model.

Thank you very much!!

Answer

It seems that you want to be able to define your own loss. Also, I am not sure whether you use plain Tensorflow or Keras. Here is a solution with Keras:

import tensorflow.keras.backend as K
from tensorflow.keras.models import Sequential

def my_custom_loss(precomputed_value):
    def loss(y_true, y_pred):
        return K.binary_crossentropy(y_true, y_pred) + precomputed_value
    return loss

my_model = Sequential()
my_model.add(...)
# Add any layers there

my_model.compile(loss=my_custom_loss(42))

Inspired by https://towardsdatascience.com/advanced-keras-constructing-complex-custom-losses-and-metrics-c07ca130a618
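As a sanity check, here is a small NumPy sketch (independent of Keras, with a hand-rolled binary cross-entropy used purely for illustration) showing that adding a constant only shifts the loss value; since the constant's gradient with respect to the predictions is zero, it does not change which direction the optimizer moves:

```python
import numpy as np

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    # Per-sample binary cross-entropy, with clipping for numerical stability
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

y_true = np.array([1.0, 0.0, 1.0])
y_pred = np.array([0.9, 0.2, 0.8])

base = binary_crossentropy(y_true, y_pred).mean()
shifted = base + 42.0  # same loss with the precomputed constant added
```

Here `shifted - base` is exactly the constant, for any predictions, which is why a purely constant term has no effect on training.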

The answer above was only for adding a constant term, but I realize that the term suggested in the paper is not constant.

I haven't read the paper, but I suppose from the cross-entropy definition that sigma is the ground truth and p is the predicted value. If there are no other dependencies, the solution can be even simpler:

def my_custom_loss(y_true, y_pred):
    # Note: Keras passes the arguments in the order (y_true, y_pred)
    norm_term = K.square(K.mean(y_true) - K.mean(y_pred))
    return K.binary_crossentropy(y_true, y_pred) + norm_term

# ...

my_model.compile(loss=my_custom_loss)

Here, I assumed the expectations are only computed on each batch. Tell me whether that is what you want. Otherwise, if you want to compute your statistics at a different scale, e.g. on the whole dataset after every epoch, you might need to use callbacks. In that case, please be more precise about your problem, adding for instance a small example for y_true and y_pred, and the expected loss.
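For the epoch-level case, a minimal sketch of the callback idea could look like the following. Everything here is an illustrative assumption rather than part of the question: the variable name `extra_term`, the tiny random dataset, and the one-layer model. The key trick is to keep the extra term in a non-trainable `tf.Variable` that the loss reads at every step and a callback reassigns after each epoch:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.callbacks import Callback

# Non-trainable variable holding the extra term; read inside the loss at every step
extra_term = tf.Variable(0.0, trainable=False)

def loss_with_extra_term(y_true, y_pred):
    return tf.keras.losses.binary_crossentropy(y_true, y_pred) + extra_term

class UpdateExtraTerm(Callback):
    def __init__(self, x, y):
        super().__init__()
        self.x, self.y = x, y

    def on_epoch_end(self, epoch, logs=None):
        # Recompute the dataset-level statistic after each epoch
        preds = self.model.predict(self.x, verbose=0)
        extra_term.assign(float(np.square(self.y.mean() - preds.mean())))

x = np.random.rand(32, 4).astype("float32")
y = np.random.randint(0, 2, size=(32, 1)).astype("float32")

model = Sequential([tf.keras.Input(shape=(4,)), Dense(1, activation="sigmoid")])
model.compile(optimizer="adam", loss=loss_with_extra_term)
model.fit(x, y, epochs=2, verbose=0, callbacks=[UpdateExtraTerm(x, y)])
```

Note that with this approach the extra term is still constant within an epoch, so it only shifts the loss value between epochs; if the paper's term must influence the gradients, it would instead have to be computed from tensors inside the loss itself, as in the batch-level version above.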
