Keras Backend Modeling Issue


Problem Description

I am having an issue declaring my model. My inputs are x_input and y_input, and my outputs are predictions. As follows:

model = Model(inputs = [x_input, y_input], outputs = predictions )

My inputs (x, y) are each embedded, then multiplied together with a batch matrix product (MatMul). As follows:

# Build X Branch
x_input = Input(shape = (maxlen_x,), dtype = 'int32' )                               
x_embed = Embedding( maxvocab_x + 1, 16, input_length = maxlen_x )
XE = x_embed(x_input) 
# Result: Tensor("embedding_1/Gather:0", shape=(?, 31, 16), dtype=float32)
# Where 31 happens to be my maxlen_x

And similarly for the Y branch...

# Build Y Branch
y_input = Input(shape = (maxlen_y,), dtype = 'int32' )                               
y_embed = Embedding( maxvocab_y + 1, 16, input_length = maxlen_y )
YE = y_embed(y_input) 
# Result: Tensor("embedding_1/Gather:0", shape=(?, 13, 16), dtype=float32)
# Where 13 happens to be my maxlen_y

I then do a batch dot between the two. (Simply dotting the data from each instance)

from keras import backend as K
dot_merged = K.batch_dot(XE, YE, axes=[2,2] ) # Choose the 2nd component of both inputs to Dot, using batch_dot 
# Result: Tensor("MatMul:0", shape=(?, 31, 13), dtype=float32)
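For intuition, `batch_dot` with `axes=[2,2]` contracts the shared embedding axis of both inputs, producing one `(maxlen_x, maxlen_y)` similarity matrix per sample. A small NumPy sketch of the equivalent computation (batch size is made up; lengths are taken from the question):

```python
import numpy as np

batch, lx, ly, d = 4, 31, 13, 16   # made-up batch size; lengths/dim from the question
XE = np.random.rand(batch, lx, d)  # stands in for the X embedding output
YE = np.random.rand(batch, ly, d)  # stands in for the Y embedding output

# batch_dot(XE, YE, axes=[2, 2]) contracts axis 2 (the embedding dim) of both
# tensors, independently per sample: out[b] = XE[b] @ YE[b].T
out = np.einsum('bid,bjd->bij', XE, YE)
assert out.shape == (batch, lx, ly)  # matches shape=(?, 31, 13) above
```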

I then flattened the last two dimensions of the tensor.

import numpy as np
dim = np.prod(list(dot_merged.shape)[1:])     # 31 * 13 = 403
flattened = K.reshape(dot_merged, (-1, int(dim)))

Ultimately, I fed this flattened data into a simple logistic regressor.

predictions = Dense(1,activation='sigmoid')(flattened)

And, my predictions are, of course, my output for the model.

I will list the output of each layer by the output shape of the tensor.

Tensor("embedding_1/Gather:0", shape=(?, 31, 16), dtype=float32)
Tensor("embedding_2/Gather:0", shape=(?, 13, 16), dtype=float32)
Tensor("MatMul:0", shape=(?, 31, 13), dtype=float32)
Tensor("Reshape:0", shape=(?, 403), dtype=float32)
Tensor("dense_1/Sigmoid:0", shape=(?, 1), dtype=float32)

Specifically, I receive the following error:

Traceback (most recent call last):
  File "Model.py", line 53, in <module>
    model = Model(inputs = [dx_input, rx_input], outputs = [predictions] )
  File "/Users/jiangq/tensorflow/lib/python3.6/site-packages/keras/legacy/interfaces.py", line 88, in wrapper
    return func(*args, **kwargs)
  File "/Users/jiangq/tensorflow/lib/python3.6/site-packages/keras/engine/topology.py", line 1705, in __init__
    build_map_of_graph(x, finished_nodes, nodes_in_progress)
  File "/Users/jiangq/tensorflow/lib/python3.6/site-packages/keras/engine/topology.py", line 1695, in build_map_of_graph
    layer, node_index, tensor_index)
  File "/Users/jiangq/tensorflow/lib/python3.6/site-packages/keras/engine/topology.py", line 1665, in build_map_of_graph
    layer, node_index, tensor_index = tensor._keras_history
AttributeError: 'Tensor' object has no attribute '_keras_history'

Voila. Where did I go wrong? Thanks for any help ahead of time!

- Anthony

Answer

Did you try wrapping the backend functions in a Lambda layer? I think there are some necessary operations within a Keras layer's __call__() method for a Keras Model to be built properly, and these will not be executed if you call the backend functions directly.
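As a sketch of that suggestion: wrap the batch dot in a Lambda layer and replace the manual K.reshape with a Flatten layer, so every tensor in the graph carries a layer history. The vocabulary sizes below are placeholders, and tf.matmul with transpose_b=True is used inside the Lambda because it computes the same thing as batch_dot(..., axes=[2, 2]). Written against tf.keras; the same pattern applies to standalone Keras:

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

maxlen_x, maxlen_y = 31, 13        # lengths from the question
maxvocab_x, maxvocab_y = 100, 50   # placeholder vocabulary sizes

x_input = layers.Input(shape=(maxlen_x,), dtype='int32')
y_input = layers.Input(shape=(maxlen_y,), dtype='int32')

XE = layers.Embedding(maxvocab_x + 1, 16)(x_input)   # (batch, 31, 16)
YE = layers.Embedding(maxvocab_y + 1, 16)(y_input)   # (batch, 13, 16)

# Wrapping the backend op in a Lambda layer gives its output a proper
# _keras_history, so Model() can trace the graph back to the Inputs.
dot_merged = layers.Lambda(
    lambda t: tf.matmul(t[0], t[1], transpose_b=True)  # == batch_dot(axes=[2, 2])
)([XE, YE])                                            # (batch, 31, 13)

flattened = layers.Flatten()(dot_merged)               # (batch, 403)
predictions = layers.Dense(1, activation='sigmoid')(flattened)

model = Model(inputs=[x_input, y_input], outputs=predictions)
```

Alternatively, the built-in layers.Dot(axes=2)([XE, YE]) performs the same batched dot product without a Lambda at all.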
