Serving TF model with Variable length input

Question

I am trying to serve my TF model with TF Serving. Here's the model's input I have:

raw_feature_spec = {
    'x': tf.io.VarLenFeature(tf.string),
    'y': tf.io.VarLenFeature(tf.string),
    'z': tf.io.FixedLenFeature([], tf.string)
}

Then the input gets transformed using TF Transform with Beam into an object with shapes x:(None, 20, 100), y:(None, 20, 5), z:(None, 3), which fit the initial model without the transform_func (Transform Graph) included. Then I export my model with:

import tensorflow as tf
import tensorflow_transform as tft

tf_transform_output = tft.TFTransformOutput('saved_transform_graph_folder')
estimator = tf.keras.estimator.model_to_estimator(keras_model_path='model_folder')
estimator.export_saved_model('OUTPUT_MODEL_NAME', make_serving_input_fn(tf_transform_output))

def make_serving_input_fn(tf_transform_output):
    raw_feature_spec = {
        'x': tf.io.VarLenFeature(tf.string),
        'y': tf.io.VarLenFeature(tf.string),
        'z': tf.io.FixedLenFeature([], tf.string)
    }

    def serving_input_fn():
        raw_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(raw_feature_spec)
        raw_features = raw_input_fn().features
        features = {
            'x': tf.sparse.to_dense(raw_features["x"]),
            'y': tf.sparse.to_dense(raw_features["y"]),
            'z': raw_features["z"]
        }

        # Apply the transform function that was used to generate the materialized data
        transformed_features = tf_transform_output.transform_raw_features(raw_features)
        return tf.estimator.export.ServingInputReceiver(transformed_features, features)

    return serving_input_fn

Here transform_func is a function that reshapes the input tensors into the required shapes and is included in the tf_transform_output object. So when I serve the exported model with this code using the TFS image from Docker Hub and send an HTTP GET request to /model/metadata, I get:

{
"model_spec": {
    "name": "newModel",
    "signature_name": "",
    "version": "1579786077"
},
"metadata": {
    "signature_def": {
        "signature_def": {
            "serving_default": {
                "inputs": {
                    "x": {
                        "dtype": "DT_STRING",
                        "tensor_shape": {
                            "dim": [
                                {
                                    "size": "-1",
                                    "name": ""
                                },
                                {
                                    "size": "-1",
                                    "name": ""
                                }
                            ],
                            "unknown_rank": false
                        },
                        "name": "SparseToDense_1:0"
                    },
                    "y": {
                        "dtype": "DT_STRING",
                        "tensor_shape": {
                            "dim": [
                                {
                                    "size": "-1",
                                    "name": ""
                                },
                                {
                                    "size": "-1",
                                    "name": ""
                                }
                            ],
                            "unknown_rank": false
                        },
                        "name": "SparseToDense:0"
                    },
                    "z": {
                        "dtype": "DT_STRING",
                        "tensor_shape": {
                            "dim": [
                                {
                                    "size": "-1",
                                    "name": ""
                                }
                            ],
                            "unknown_rank": false
                        },
                        "name": "ParseExample/ParseExample:6"
                    }
                },
                "outputs": {
                    "main_output": {
                        "dtype": "DT_FLOAT",
                        "tensor_shape": {
                            "dim": [
                                {
                                    "size": "-1",
                                    "name": ""
                                },
                                {
                                    "size": "20",
                                    "name": ""
                                }
                            ],
                            "unknown_rank": false
                        },
                        "name": "main_output/Softmax:0"
                    }
                },
                "method_name": "tensorflow/serving/predict"
            }
        }
    }
}
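As an aside, the variable dimensions in such a response can be checked programmatically. A small sketch (pure Python; the sample dict below is a trimmed copy of the metadata above) that pulls each input's dtype and shape out of a /metadata response:

```python
def input_shapes(metadata):
    """Map each input name to (dtype, dims) from a TF Serving /metadata
    response; a size of -1 marks an unknown/variable dimension."""
    sig = metadata["metadata"]["signature_def"]["signature_def"]["serving_default"]
    return {
        name: (spec["dtype"],
               [int(d["size"]) for d in spec["tensor_shape"]["dim"]])
        for name, spec in sig["inputs"].items()
    }

# Trimmed sample of the metadata shown above
sample = {
    "metadata": {"signature_def": {"signature_def": {"serving_default": {
        "inputs": {
            "x": {
                "dtype": "DT_STRING",
                "tensor_shape": {"dim": [{"size": "-1"}, {"size": "-1"}]},
            }
        }
    }}}}
}

print(input_shapes(sample))  # {'x': ('DT_STRING', [-1, -1])}
```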

So the inputs are correct (even though I used tf.sparse.to_dense to cast the VarLenFeature inputs while exporting). But when I send an HTTP POST request to /model:predict with body:

{ 
   "instances": 
   [
     {
        "x": ["text","text","text","text","text","text"],
        "y": ["test","test","test","test","test","test"],
        "z": "str"
     }
  ]
}

I get an error:

{
    "error": "You must feed a value for placeholder tensor \'input_example_tensor\' with dtype string and shape [?]\n\t [[{{node input_example_tensor}}]]"
}
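This error refers to the input_example_tensor placeholder that build_parsing_serving_input_receiver_fn creates, which expects serialized tf.Example protos rather than raw tensors. If one did want to feed that signature directly, binary DT_STRING values go through the REST API's "b64" envelope; a minimal sketch (the byte payload below is a placeholder, not a real serialized Example):

```python
import base64
import json

def example_request(serialized_examples):
    """Wrap already-serialized tf.Example bytes in the TF Serving REST
    API's "b64" envelope, which is required for binary string inputs."""
    return json.dumps({
        "instances": [
            {"b64": base64.b64encode(raw).decode("ascii")}
            for raw in serialized_examples
        ]
    })

body = example_request([b"\x0a\x04demo"])  # placeholder bytes, not a real Example
```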

Does anyone have an idea what I'm doing wrong, or how to create variable-length inputs correctly? I need the tensor shapes as I now have them in the metadata, so I don't need the ability to call the API with a serialized Example proto, only with raw tensors. TF version: 2.0; TF Serving and TF Transform: latest versions.

P.S. I've also tried to export the model using tf.keras.backend.placeholder with a build_raw_serving_input_receiver_fn call, so there would be no casting from sparse to dense tensors in serving_input_fn, but the result is the same.

Answer

You have a placeholder named input_example_tensor in transformed_features. TensorFlow's sparse placeholder also doesn't work here.

I solved this problem by representing each sparse tensor with three dense tensors in the receiver, namely indices, values, and shape, and then converting them into a sparse tensor in the features. For your case, you need to define serving_input_fn as follows:


def serving_input_fn():
    # On TF 2.x, graph-mode placeholders live under tf.compat.v1
    inputs = {
        "x_indices": tf.compat.v1.placeholder(tf.int64, [None, 2]),
        "x_vals": tf.compat.v1.placeholder(tf.string, [None]),  # SparseTensor values must be 1-D
        "x_shape": tf.compat.v1.placeholder(tf.int64, [2]),
        "y_indices": tf.compat.v1.placeholder(tf.int64, [None, 2]),
        "y_vals": tf.compat.v1.placeholder(tf.string, [None]),
        "y_shape": tf.compat.v1.placeholder(tf.int64, [2]),
        "z": tf.compat.v1.placeholder(tf.string, [None, 1])
    }

    fvs = {
        "x": tf.SparseTensor(
                 inputs["x_indices"],
                 inputs["x_vals"],
                 inputs["x_shape"]
             ),
        "y": tf.SparseTensor(
                 inputs["y_indices"],
                 inputs["y_vals"],
                 inputs["y_shape"]
             ),
        "z": inputs["z"]
    }

    return tf.estimator.export.ServingInputReceiver(fvs, inputs)
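With a receiver like this, the client has to build the sparse components itself. A small helper (pure Python; the function name is just illustrative) that turns a batch of variable-length string lists into the indices/values/shape triple the placeholders above expect:

```python
def to_sparse_components(batch):
    """Turn a batch of variable-length string lists into the
    indices, values, and dense shape of a rank-2 SparseTensor."""
    indices, values = [], []
    for row, items in enumerate(batch):
        for col, item in enumerate(items):
            indices.append([row, col])
            values.append(item)
    max_len = max((len(items) for items in batch), default=0)
    return indices, values, [len(batch), max_len]

idx, vals, shape = to_sparse_components([["text"] * 3, ["text"]])
# idx -> [[0, 0], [0, 1], [0, 2], [1, 0]], shape -> [2, 3]
```

Since indices, values, and shape describe the whole batch rather than one example, the request would use the REST API's columnar "inputs" format, keyed by the placeholder names, rather than the per-instance "instances" format.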

I'd also like to ask you a question: how did you manage to train a model that has a VarLenFeature with TensorFlow Keras? Could you share that part with me? Currently I train this kind of model with the TensorFlow Estimator API.
