how to use tf operations in keras models


Problem description

I am trying to use tensorflow operations within a keras model, and I am quite confused about the mechanism and what Lambda layers do to tf tensors.

This works:

import keras
import tensorflow as tf

a = keras.layers.Input(shape=[1, 2], dtype='float', name='a')
s = keras.layers.Lambda(lambda x: tf.transpose(tf.transpose(x)))(a)
model = keras.models.Model(inputs=a, outputs=s)

But this does not:

a = keras.layers.Input(shape=[1, 2], dtype='float', name='a')
s = tf.transpose(tf.transpose(a))
s = keras.layers.Lambda(lambda x: x)(s)
model = keras.models.Model(inputs=a, outputs=s)

It says:

AttributeError: 'Tensor' object has no attribute '_keras_history'

So is it always necessary to wrap tf operations in a layer?

Question 2 (which is why I came up with the previous one): do we have to wrap things in a custom layer to do matrix multiplication in keras?

Thanks.

Recommended answer

Question 1: Yes, it is necessary to wrap tf operations with a layer, because keras models require certain functions/variables that aren't included with tensorflow ops. In this case, _keras_history is a property that is only produced by wrapping the op in a layer.
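As a minimal sketch of the wrapping idea (using tf.keras; the weight values and shapes here are purely illustrative), any tf op — such as tf.matmul — gains the _keras_history metadata once it runs inside a Lambda layer, so Model() can trace the graph back to the Input:

```python
import numpy as np
import tensorflow as tf

# Illustrative constant matrix to multiply by.
W = tf.constant([[1.0, 2.0], [3.0, 4.0]])

a = tf.keras.layers.Input(shape=(2,), name='a')
# Calling tf.matmul directly on `a` would reproduce the _keras_history
# error; wrapping it in a Lambda layer produces a proper keras tensor.
s = tf.keras.layers.Lambda(lambda x: tf.matmul(x, W))(a)
model = tf.keras.models.Model(inputs=a, outputs=s)

x = np.array([[1.0, 1.0]], dtype=np.float32)
out = model(x).numpy()  # equals x @ W
```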

Question 2: For the matrix multiplication, have you considered using a keras Dense layer with use_bias=False? If you want to use a constant for the weight matrix, you could set kernel_initializer={constant} and trainable=False.
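A sketch of the Dense approach (tf.keras; the weight matrix and input values are made up for illustration). Dense with use_bias=False computes x @ kernel, and setting the kernel explicitly with set_weights plus trainable=False turns it into a fixed matrix multiplication:

```python
import numpy as np
import tensorflow as tf

# Illustrative fixed weight matrix, shape (input_dim, units).
W = np.array([[1.0, 2.0], [3.0, 4.0]], dtype=np.float32)

a = tf.keras.layers.Input(shape=(2,), name='a')
# use_bias=False -> the layer computes x @ kernel only;
# trainable=False -> the kernel is frozen during training.
dense = tf.keras.layers.Dense(2, use_bias=False, trainable=False)
s = dense(a)
model = tf.keras.models.Model(inputs=a, outputs=s)

# The layer is built after being called, so we can install W directly.
dense.set_weights([W])

x = np.array([[1.0, 1.0]], dtype=np.float32)
out = model(x).numpy()  # equals x @ W
```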

