TensorFlow: how to export estimator using TensorHub module?


Problem Description

I have an estimator using a TensorHub text_embedding column, like so:

my_dataframe = pandas.DataFrame(columns=["title"])
# populate data
labels = [] 
# populate labels with 0|1
embedded_text_feature_column = hub.text_embedding_column(
    key="title" 
    ,module_spec="https://tfhub.dev/google/nnlm-en-dim128-with-normalization/1")


estimator = tf.estimator.LinearClassifier(
    feature_columns = [ embedded_text_feature_column ]
    ,optimizer=tf.train.FtrlOptimizer(
        learning_rate=0.1
        ,l1_regularization_strength=1.0
    )
    ,model_dir=model_dir
)

estimator.train(
    input_fn=tf.estimator.inputs.pandas_input_fn(
        x=my_dataframe
        ,y=labels
        ,batch_size=128
        ,num_epochs=None
        ,shuffle=True
        ,num_threads=5
    )
    ,steps=5000
)
export(estimator, "/tmp/my_model")

How can I export and serve the model so that it accepts strings as input for predictions? I have a serving_input_receiver_fn as follows, and have tried quite a few variants, but I'm confused as to what it needs to look like so that I can serve it (with saved_model_cli, say) and call it with title strings (or a simple JSON structure) as input.

def export(estimator, dir_path):
    def serving_input_receiver_fn():
        feature_spec = tf.feature_column.make_parse_example_spec([hub.text_embedding_column(
            key="title"
            ,module_spec="https://tfhub.dev/google/nnlm-en-dim128-with-normalization/1")])
        return tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)


    estimator.export_savedmodel(
        export_dir_base=dir_path
        ,serving_input_receiver_fn=serving_input_receiver_fn()
    )

Answer

If you want to feed raw strings, you might want to consider using the raw input receiver. This code:

feature_placeholder = {'title': tf.placeholder('string', [1], name='title_placeholder')}
serving_input_fn = tf.estimator.export.build_raw_serving_input_receiver_fn(feature_placeholder)

estimator.export_savedmodel(dir_path, serving_input_fn)

will give you a SavedModel with the following input specification according to the SavedModel CLI:

saved_model_cli show --dir ./ --tag_set serve --signature_def serving_default

The given SavedModel SignatureDef contains the following input(s):
  inputs['inputs'] tensor_info:
    dtype: DT_STRING
    shape: (-1)
    name: title_placeholder_1:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['classes'] tensor_info:
    dtype: DT_STRING
    shape: (-1, 2)
    name: linear/head/Tile:0
  outputs['scores'] tensor_info:
    dtype: DT_FLOAT
    shape: (-1, 2)
    name: linear/head/predictions/probabilities:0

You can provide a Python expression to the CLI to serve an input to the model to validate that it works:

saved_model_cli run --dir ./ --tag_set serve --signature_def \
serving_default --input_exprs "inputs=['this is a test sentence']"

Result for output key classes:
[[b'0' b'1']]
Result for output key scores:
[[0.5123377 0.4876624]]
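If you later serve the exported SavedModel with TensorFlow Serving, the raw input receiver means the REST predict endpoint accepts the title strings directly. A minimal sketch of building such a request body (the model name `my_model` and endpoint path are hypothetical; adjust to your deployment):

```python
import json

def build_predict_request(titles):
    """Build the JSON body for TensorFlow Serving's REST predict API,
    e.g. POST /v1/models/my_model:predict (hypothetical model name).

    With the raw serving input receiver above, each instance is just a
    raw title string under the "instances" key.
    """
    return json.dumps({"instances": list(titles)})

body = build_predict_request(["this is a test sentence"])
print(body)  # {"instances": ["this is a test sentence"]}
```

The response mirrors the signature shown by `saved_model_cli`: a list of `classes`/`scores` pairs, one per input string.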
