Access layer attribute in custom loss function in Keras

Question

I want to write a custom loss function in Keras which depends on an attribute of a (custom) layer in the network.

The idea is the following:

  • I have a custom layer which modifies the input in every epoch based on a random variable
  • The output labels should be modified based on the same variable

Some example code to make it more clear:

import numpy as np
from keras import losses, layers, models

class MyLayer(layers.Layer):
    def call(self, x):
        a = np.random.rand()
        self.a = a  # <-- does this work as expected?
        return x + a

def my_loss(layer):
    def modified_loss(y_true, y_pred):
        a = layer.a
        y_true = y_true + a
        return losses.mse(y_true, y_pred)
    return modified_loss

input_layer = layers.Input(shape=(4,))      # example input shape
my_layer = MyLayer(name="my_layer")(input_layer)
output_layer = layers.Dense(4)(my_layer)
model = models.Model(inputs=input_layer, outputs=output_layer)
model.compile('adam', my_loss(model.get_layer("my_layer")))

I expect that a changes for every batch and that the same a is used in the layer and in the loss function. Right now, it is not working the way I intended. It seems like the a in the loss function is never updated (and maybe not even in the layer).

How do I change the attribute/value of a in the layer at every call and access it in the loss function?

Answer

Not quite sure I follow the purpose of this (and I am bothered by the call to np inside the call() of your custom layer - could you not use the tf.random functions instead?), but you can certainly access the a property inside your loss function.

Maybe something like this:

class MyLayer(layers.Layer):
    def call(self, x):
        a = np.random.rand()  # FIXME --> use tf.random
        self.a = a
        return x + a

input_layer = layers.Input(shape=(4,))   # example input shape
my_layer = MyLayer(name="my_layer")      # keep a handle to the layer instance
hidden = my_layer(input_layer)
output_layer = layers.Dense(4)(hidden)
model = models.Model(inputs=input_layer, outputs=output_layer)

def my_loss(y_true, y_pred):
    y_true = y_true + my_layer.a         # read the attribute from the layer instance
    return losses.mse(y_true, y_pred)

model.compile('adam', loss=my_loss)
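
Following the FIXME above, here is a minimal sketch of what the tf.random variant could look like, assuming TensorFlow 2.x and the tf.keras API; the layer name RandomShiftLayer and the non-trainable tf.Variable holding a are illustrative additions, not part of the original answer. The idea is that the random value is drawn inside the graph on every forward pass and stored in a variable, so the loss reads the same value that was applied to the batch:

import tensorflow as tf
from tensorflow.keras import layers, losses, models

class RandomShiftLayer(layers.Layer):  # illustrative name, not from the original post
    def build(self, input_shape):
        # Non-trainable variable that stores the most recent random shift,
        # so the loss function can read the same value the layer used.
        self.a = tf.Variable(0.0, trainable=False, name="a")

    def call(self, x):
        new_a = tf.random.uniform(shape=())  # drawn inside the graph, so it changes every batch
        self.a.assign(new_a)
        return x + new_a

input_layer = layers.Input(shape=(4,))       # example input shape
shift_layer = RandomShiftLayer(name="my_layer")
hidden = shift_layer(input_layer)
output_layer = layers.Dense(4)(hidden)
model = models.Model(inputs=input_layer, outputs=output_layer)

def my_loss(y_true, y_pred):
    # Shift the labels by the same value the layer added to the inputs.
    return losses.mse(y_true + shift_layer.a, y_pred)

model.compile('adam', loss=my_loss)

Within a single training step the assignment in call() runs before the loss is evaluated, so reading shift_layer.a in the loss should pick up the value used for that batch; whether this fits the exact training setup in the question still depends on the Keras/TensorFlow version in use.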
