Exporting a Keras model as a TF Estimator: trained model cannot be found
I encountered the following issue when trying to export a Keras model as a TensorFlow Estimator for the purpose of serving the model. Since the same problem also popped up in an answer to this question, I will illustrate what happens on a toy example and provide my workaround for documentation purposes. This behaviour occurs with TensorFlow 1.12.0 and Keras 2.2.4, with both standalone Keras and tf.keras.
The problem occurs when trying to export an Estimator that was created from a Keras model with tf.keras.estimator.model_to_estimator. Upon calling estimator.export_savedmodel, either a NotFoundError or a ValueError is thrown.
The code below reproduces this for a toy example.
Create a Keras model and save it:
import keras

model = keras.Sequential()
model.add(keras.layers.Dense(units=1,
                             activation='sigmoid',
                             input_shape=(10,)))
model.compile(loss='binary_crossentropy', optimizer='sgd')
model.save('./model.h5')
Next, convert the model to an estimator with tf.keras.estimator.model_to_estimator, add an input receiver function, and export it in the SavedModel format with estimator.export_savedmodel:
import tensorflow as tf

# Convert keras model to TF estimator
tf_files_path = './tf'
estimator = tf.keras.estimator.model_to_estimator(keras_model=model,
                                                  model_dir=tf_files_path)

def serving_input_receiver_fn():
    return tf.estimator.export.build_raw_serving_input_receiver_fn(
        {model.input_names[0]: tf.placeholder(tf.float32, shape=[None, 10])})

# Export the estimator
export_path = './export'
estimator.export_savedmodel(
    export_path,
    serving_input_receiver_fn=serving_input_receiver_fn())
This will throw:
ValueError: Couldn't find trained model at ./tf.
My workaround is as follows. Inspecting the ./tf folder makes clear that the call to model_to_estimator stored the necessary files in a keras subfolder, while export_savedmodel expects those files to be directly in the ./tf folder, as this is the path we specified for the model_dir argument:
$ tree ./tf
./tf
└── keras
├── checkpoint
├── keras_model.ckpt.data-00000-of-00001
├── keras_model.ckpt.index
└── keras_model.ckpt.meta
1 directory, 4 files
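The same inspection can be scripted. As a small stdlib-only sketch (the helper name list_files is my own, not part of any TensorFlow API), this lists every file below a directory and would reveal the extra keras subfolder under model_dir:

```python
import os

def list_files(root):
    """Return all file paths under root, relative to root, sorted."""
    found = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            found.append(os.path.relpath(os.path.join(dirpath, name), root))
    return sorted(found)

# On the layout above, list_files('./tf') would show paths like
# 'keras/checkpoint', exposing the extra level below model_dir.
```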
The simple workaround is to move these files up one folder. This can be done with Python:
import os
import shutil
from pathlib import Path

def up_one_dir(path):
    """Move all files in path up one folder, and delete the empty folder."""
    parent_dir = str(Path(path).parents[0])
    for f in os.listdir(path):
        shutil.move(os.path.join(path, f), parent_dir)
    shutil.rmtree(path)

up_one_dir('./tf/keras')
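If you prefer not to do this in Python, the same move can be sketched in the shell. Shown here on a scratch directory (tf_demo is a made-up name mimicking the layout above); in the real case the paths would be ./tf/keras and ./tf:

```shell
# Recreate the problematic layout on a scratch directory
mkdir -p tf_demo/keras
touch tf_demo/keras/checkpoint tf_demo/keras/keras_model.ckpt.index

# Move the checkpoint files up one level and drop the empty subfolder
mv tf_demo/keras/* tf_demo/
rmdir tf_demo/keras

ls tf_demo
```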
This will make the model_dir directory look like this:
$ tree ./tf
./tf
├── checkpoint
├── keras_model.ckpt.data-00000-of-00001
├── keras_model.ckpt.index
└── keras_model.ckpt.meta
0 directories, 4 files
Doing this manipulation in between the model_to_estimator and export_savedmodel calls allows the model to be exported as desired:
export_path = './export'
estimator.export_savedmodel(
export_path,
serving_input_receiver_fn=serving_input_receiver_fn())
INFO:tensorflow:SavedModel written to: ./export/temp-b'1549796240'/saved_model.pb