Integrating Keras model into TensorFlow
Question
I am trying to use a pre-trained Keras model within TensorFlow code, as described in this Keras blog post under section II: Using Keras models with TensorFlow.
I want to use the pre-trained VGG16 network available in Keras to extract convolutional feature maps from images, and add my own TensorFlow code over that. So I've done this:
import tensorflow as tf
from tensorflow.python.keras.applications.vgg16 import VGG16, preprocess_input
from tensorflow.python.keras import backend as K

# images = a NumPy array containing 8 images

model = VGG16(include_top=False, weights='imagenet')
inputs = tf.placeholder(shape=images.shape, dtype=tf.float32)
inputs = preprocess_input(inputs)
features = model(inputs)

with tf.Session() as sess:
    K.set_session(sess)
    output = sess.run(features, feed_dict={inputs: images})
    print(output.shape)
However, this gives me an error:
FailedPreconditionError: Attempting to use uninitialized value block1_conv1_2/kernel
[[Node: block1_conv1_2/kernel/read = Identity[T=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:GPU:0"](block1_conv1_2/kernel)]]
[[Node: vgg16_1/block5_pool/MaxPool/_3 = _Recv[client_terminated=false, recv_device="/job:localhost/replica:0/task:0/device:CPU:0", send_device="/job:localhost/replica:0/task:0/device:GPU:0", send_device_incarnation=1, tensor_name="edge_132_vgg16_1/block5_pool/MaxPool", tensor_type=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:CPU:0"]()]]
Instead, if I run an initializer op before running the network:
with tf.Session() as sess:
    K.set_session(sess)
    tf.global_variables_initializer().run()
    output = sess.run(features, feed_dict={inputs: images})
    print(output.shape)
Then I get the expected output:
(8, 11, 38, 512)
My question is: upon running tf.global_variables_initializer(), are the variables initialized randomly or with the ImageNet weights? I ask because the blog post referenced above does not mention that an initializer needs to be run when using pre-trained Keras models, and that makes me feel a bit uneasy.
I suspect that it does use the ImageNet weights, and that one needs to run the initializer only because TensorFlow requires all variables to be explicitly initialized. But this is just a guess.
Answer
TLDR
使用 Keras 时,
TLDR
When using Keras,
- 尽可能避免使用
Session
(本着不可知论的 Keras 精神) - 否则使用 Keras 处理的
Session
到tf.keras.backend.get_session
. - 将 Keras 的
set_session
用于高级用途(例如,当您需要分析或设备放置时)并在程序的早期使用 — 与pure"中的常见做法和良好用法相反张量流.
- Avoid using
Session
if you can (in the spirit of agnostic Keras) - Use Keras-handled
Session
throughtf.keras.backend.get_session
otherwise. - Use Keras'
set_session
for advanced uses (e.g. when you need profiling or device placement) and very early in your program — contrary to common practice and good usage in "pure" Tensorflow.
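As a minimal, self-contained sketch of the recommended pattern (hedged: it substitutes a small Dense layer for VGG16 to avoid the weight download, and is written against the TF 1.x API, reachable as tf.compat.v1 on TensorFlow 2):

```python
import numpy as np
import tensorflow.compat.v1 as tf  # on TF 1.x this is just `import tensorflow as tf`

tf.disable_eager_execution()

# Build a small Keras model -- a light stand-in for VGG16.
model = tf.keras.Sequential([tf.keras.layers.Dense(4, input_shape=(3,))])

inputs = tf.placeholder(shape=(None, 3), dtype=tf.float32)
outputs = model(inputs)

# Reuse the session Keras already set up instead of opening a new one,
# so no extra tf.global_variables_initializer() call is needed.
sess = tf.keras.backend.get_session()
result = sess.run(outputs, feed_dict={inputs: np.ones((2, 3), dtype=np.float32)})
print(result.shape)  # (2, 4)
```

The same shape-only check as in the question applies here: the output shape is fixed by the layer, while the values depend on the (random, in this sketch) weights.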
More about that

Variables must be initialized before they can be used. Actually, it's a bit more subtle than that: variables must be initialized in the session in which they are used. Let's look at this example:
import tensorflow as tf

x = tf.Variable(0.)

with tf.Session() as sess:
    tf.global_variables_initializer().run()
    # x is initialized -- no issue here
    x.eval()

with tf.Session() as sess:
    x.eval()
    # Error -- x was never initialized in this session, even though
    # it has been initialized before in another session
So it shouldn't come as a surprise that the variables from your model are not initialized, because you create your model before sess.
However, VGG16 not only creates initializer operations for the model variables (the ones you are calling with tf.global_variables_initializer), but actually does call them. The question is: within which Session?
Well, since no session existed at the time you built your model, Keras created a default one for you, which you can recover using tf.keras.backend.get_session(). Using this session works as expected, because the variables were initialized in it:
with tf.keras.backend.get_session() as sess:
    K.set_session(sess)
    output = sess.run(features, feed_dict={inputs: images})
    print(output.shape)
Note that you could also create your own Session and provide it to Keras through keras.backend.set_session, which is exactly what you did. But, as this example shows, Keras and TensorFlow have different mindsets.
A TensorFlow user would typically first construct a graph, then instantiate a Session, perhaps after freezing the graph.
Keras is framework-agnostic and does not have this built-in distinction between construction phases; in particular, we learned here that Keras may very well instantiate a Session during graph construction.
For this reason, when using Keras, I would advise against managing a tf.Session yourself; instead, rely on tf.keras.backend.get_session whenever TensorFlow-specific code requires a tf.Session.