Graph disconnected: cannot obtain value for tensor Input (Keras, Python)


Question

I have this code:

# Declare the layers
inp1 = Input(shape=input_shape, name="input1")
inp2 = Input(shape=input_shape, name="input2")


# 128 -> 64
conv1_inp1 = Conv2D(start_neurons * 1, 3, activation="relu", padding="same")(inp1)
conv1_inp2 = Conv2D(start_neurons * 1, 3, activation="relu", padding="same")(inp2)
conv1 = Concatenate()([conv1_inp1, conv1_inp2])
conv1 = Conv2D(start_neurons * 1, 3, activation="relu", padding="same")(conv1)
conv1 = MaxPooling2D((2, 2))(conv1)
conv1 = Dropout(0.25)(conv1)

# 64 -> 32
conv2 = Conv2D(start_neurons * 2, (3, 3), activation="relu", padding="same")(conv1)
conv2 = Conv2D(start_neurons * 2, (3, 3), activation="relu", padding="same")(conv2)
pool2 = MaxPooling2D((2, 2))(conv2)
pool2 = Dropout(0.5)(pool2)

# 32 -> 16
conv3 = Conv2D(start_neurons * 4, (3, 3), activation="relu", padding="same")(pool2)
conv3 = Conv2D(start_neurons * 4, (3, 3), activation="relu", padding="same")(conv3)
pool3 = MaxPooling2D((2, 2))(conv3)
pool3 = Dropout(0.5)(pool3)

# 16 -> 8
conv4 = Conv2D(start_neurons * 8, (3, 3), activation="relu", padding="same")(pool3)
conv4 = Conv2D(start_neurons * 8, (3, 3), activation="relu", padding="same")(conv4)
pool4 = MaxPooling2D((2, 2))(conv4)
pool4 = Dropout(0.5)(pool4)

# Middle
convm = Conv2D(start_neurons * 16, (3, 3), activation="relu", padding="same")(pool4)
convm = Conv2D(start_neurons * 16, (3, 3), activation="relu", padding="same")(convm)

# 8 -> 16
deconv4 = Conv2DTranspose(start_neurons * 8, (3, 3), strides=(2, 2), padding="same")(convm)
uconv4 = Concatenate()([deconv4, conv4])
uconv4 = Dropout(0.5)(uconv4)
uconv4 = Conv2D(start_neurons * 8, (3, 3), activation="relu", padding="same")(uconv4)
uconv4 = Conv2D(start_neurons * 8, (3, 3), activation="relu", padding="same")(uconv4)

# 16 -> 32
deconv3 = Conv2DTranspose(start_neurons * 4, (3, 3), strides=(2, 2), padding="same")(uconv4)
uconv3 = Concatenate()([deconv3, conv3])
uconv3 = Dropout(0.5)(uconv3)
uconv3 = Conv2D(start_neurons * 4, (3, 3), activation="relu", padding="same")(uconv3)
uconv3 = Conv2D(start_neurons * 4, (3, 3), activation="relu", padding="same")(uconv3)

# 32 -> 64
deconv2 = Conv2DTranspose(start_neurons * 2, (3, 3), strides=(2, 2), padding="same")(uconv3)
uconv2 = Conv2D(start_neurons * 2, (3, 3), activation="relu", padding="same")(uconv2)
uconv2 = Conv2D(start_neurons * 2, (3, 3), activation="relu", padding="same")(uconv2)

# 64 -> 128
deconv1 = Conv2DTranspose(start_neurons * 1, (3, 3), strides=(2, 2), padding="same")(uconv2)
uconv1 = Conv2D(start_neurons * 1, (3, 3), activation="relu", padding="same")(deconv1)
uconv1 = Conv2D(start_neurons * 1, (3, 3), activation="relu", padding="same")(uconv1)

uncov1 = Dropout(0.5)(uconv1)
output_layer = Conv2D(1, (1,1), padding="same", activation="sigmoid")(uconv1)



# Declare the model and add the layers
model = Model(inputs = [inp1, inp2], outputs = output_layer)

model.summary()
model.compile(optimizer='adam',loss='binary_crossentropy')

It produces this error:

Graph disconnected: cannot obtain value for tensor Tensor("input_28:0", shape=(?, 128, 128, 1), dtype=float32) at layer "input_28". The following previous layers were accessed without issue: []


The inputs have the same shape. In some forums they say the problem comes from the two inputs originating from different sources, which breaks the link you had before.
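For context, a minimal sketch of how this error arises (assuming tf.keras; the layer names and sizes below are illustrative, not taken from the model above): the functional API raises "Graph disconnected" whenever an output tensor does not trace back to every declared Input.

```python
# Minimal reproduction of the "Graph disconnected" error: the output
# tensor is built from an Input that is not declared in Model(inputs=...).
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

declared = Input(shape=(4,), name="declared")
other = Input(shape=(4,), name="other")  # a different, undeclared source
out = Dense(2)(other)                    # `out` traces back to `other`, not `declared`

try:
    Model(inputs=declared, outputs=out)  # `declared` never reaches `out`
except ValueError as e:
    print(type(e).__name__)              # the graph-disconnected error
```

Keras walks the graph backwards from `outputs`; any Input it encounters that is missing from `inputs` (or any declared Input it never reaches) triggers this ValueError.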


I don't really know how to fix that.

Can anyone help me?

Thanks.

Answer


This is where your graph is disconnected: uconv2 is used on the right-hand side before it has been defined in this block, so the Conv2D call receives a stale tensor (likely left over from a previous run) instead of deconv2:

# 32 -> 64
deconv2 = Conv2DTranspose(start_neurons * 2, (3, 3), strides=(2, 2), padding="same")(uconv3)
uconv2 = Conv2D(start_neurons * 2, (3, 3), activation="relu", padding="same")(uconv2)
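Every other decoder stage feeds the upsampled tensor into the block and (in the 8 -> 16 and 16 -> 32 stages) merges the matching encoder output back in via Concatenate. A likely fix, assuming that pattern was intended for this stage too (this completion is an inference from the surrounding blocks, not part of the original answer), is sketched below as a self-contained fragment:

```python
# Sketch of the corrected 32 -> 64 decoder block, assuming tf.keras.
# Shapes and start_neurons are illustrative, not from the question.
from tensorflow.keras.layers import (Input, Conv2D, MaxPooling2D,
                                     Conv2DTranspose, Concatenate, Dropout)
from tensorflow.keras.models import Model

start_neurons = 8
inp = Input(shape=(64, 64, 1))

# Encoder: 64 -> 32, keeping conv2 for the skip connection
conv2 = Conv2D(start_neurons * 2, (3, 3), activation="relu", padding="same")(inp)
pool2 = MaxPooling2D((2, 2))(conv2)

# Deeper feature map (stand-in for uconv3 in the question)
uconv3 = Conv2D(start_neurons * 4, (3, 3), activation="relu", padding="same")(pool2)

# Decoder: 32 -> 64 -- the fixed block. deconv2 is the tensor fed into
# the block, and conv2 is merged back in, mirroring the other stages.
deconv2 = Conv2DTranspose(start_neurons * 2, (3, 3), strides=(2, 2), padding="same")(uconv3)
uconv2 = Concatenate()([deconv2, conv2])
uconv2 = Dropout(0.5)(uconv2)
uconv2 = Conv2D(start_neurons * 2, (3, 3), activation="relu", padding="same")(uconv2)

model = Model(inputs=inp, outputs=uconv2)
print(model.output_shape)  # (None, 64, 64, 16)
```

With deconv2 actually consumed, every tensor on the path from output to input is connected, and the Model constructor no longer raises the error.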
