Keras get model outputs after each batch

Problem Description

I'm using a generator to make sequential training data for a hierarchical recurrent model, which needs the outputs of the previous batch to generate the inputs for the next batch. This is a similar situation to the Keras argument stateful=True which saves the hidden states for the next batch, except it's more complicated so I can't just use that as-is.
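
A rough sketch of that kind of setup, for context (the buffer name output_ref is taken from the attempted loss hack below; the generator itself and its shapes are illustrative assumptions, not code from the question):

import numpy as np

# Shared buffer that some mechanism fills with the model's outputs after
# each batch; the generator reads it to build the next batch's inputs.
output_ref = [None, None]

def training_generator(x_data, y_data, batch_size=32):
    """Yield batches whose inputs include the previous batch's model outputs."""
    i = 0
    n = len(x_data)
    while True:
        x_batch = x_data[i:i + batch_size]
        y_batch = y_data[i:i + batch_size]
        # Use the previous batch's output as an extra input; fall back to
        # zeros for the very first batch (the shape here is illustrative).
        prev_out = output_ref[0]
        if prev_out is None or len(prev_out) != len(x_batch):
            prev_out = np.zeros((len(x_batch), 1), dtype=np.float32)
        yield [x_batch, prev_out], y_batch
        i = (i + batch_size) % n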

So far I tried putting a hack in the loss function:

from keras import backend as K

def custom_loss(y_true, y_pred):
    global output_ref
    # y_pred is a symbolic tensor here, so eval() cannot run while the graph is built
    output_ref[0] = y_pred[0].eval(session=K.get_session())
    output_ref[1] = y_pred[1].eval(session=K.get_session())

but that didn't compile and I hope there's a better way. Will Keras callbacks be of any help?

Recommended Answer

Here:

from keras.callbacks import Callback

model.compile(optimizer='adam')
# hack after compile: expose the desired layer outputs as extra "metrics"
# (this relies on model.metrics_tensors, which older standalone Keras / TF 1.x
# exposed; it is not available in TF 2.x tf.keras)
output_layers = ['gru']
s_name = 's'
model.metrics_names += [s_name]
model.metrics_tensors += [layer.output for layer in model.layers
                          if layer.name in output_layers]

class my_callback(Callback):
    def on_batch_end(self, batch, logs=None):
        # the extra "metric" shows up in `logs` under the name added above
        s_pred = logs[s_name]
        print('s_pred:', s_pred)

model.fit(..., callbacks=[my_callback()])
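
For the original use case, the callback could copy the per-batch value into the shared buffer that the generator reads. A minimal sketch, assuming (as the answer's print suggests) that logs[s_name] holds the layer's output for the batch; s_name and output_ref come from the snippets above, and the class name is illustrative:

from keras.callbacks import Callback

class OutputCapture(Callback):
    """Copy the output logged under `s_name` into the shared buffer."""
    def on_batch_end(self, batch, logs=None):
        logs = logs or {}
        if s_name in logs:
            output_ref[0] = logs[s_name]

# model.fit_generator(training_generator(x_data, y_data),
#                     steps_per_epoch=steps, callbacks=[OutputCapture()])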
