how to input multi features for tensorflow model inference
Question
I'm testing model serving.
Right now I'm following this example: "https://www.tensorflow.org/beta/guide/saved_model"

That example works fine. But in my case, the model has multiple input features.
loaded = tf.saved_model.load(export_path)
infer = loaded.signatures["serving_default"]
print(infer.structured_input_signature)
=> ((), {'input1': TensorSpec(shape=(None, 1), dtype=tf.int32, name='input1'), 'input2': TensorSpec(shape=(None, 1), dtype=tf.int32, name='input2')})
For a single input feature, you just pass the feature directly:

infer(tf.constant(x))

In my case, with multiple input features, how do I pass the features in?

I'm using tensorflow 2.0 beta and python3.5.
Answer
I solved this problem.

In a single-input-feature model, infer._num_positional_args
is assigned 1.
But in a multi-input-feature model, infer._num_positional_args
is assigned 0. I don't know why.

I solved it like this:
infer._num_positional_args = 2
infer(tf.constant(x1), tf.constant(x2))
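An alternative that avoids poking the private `_num_positional_args` attribute: a loaded signature function can be called with keyword arguments named after the inputs reported by `structured_input_signature`. Here is a minimal self-contained sketch; the `TwoInputModel` module, the addition op, and the temporary export path are illustrative assumptions, not the original model:

```python
import os
import tempfile

import tensorflow as tf


class TwoInputModel(tf.Module):
    """Toy module with two (None, 1) int32 inputs, mirroring the signature above."""

    @tf.function(input_signature=[
        tf.TensorSpec(shape=(None, 1), dtype=tf.int32, name="input1"),
        tf.TensorSpec(shape=(None, 1), dtype=tf.int32, name="input2"),
    ])
    def __call__(self, input1, input2):
        return {"sum": input1 + input2}


model = TwoInputModel()
export_path = os.path.join(tempfile.mkdtemp(), "model")
# Export the concrete function explicitly as the serving signature.
tf.saved_model.save(
    model, export_path,
    signatures={"serving_default": model.__call__.get_concrete_function()},
)

loaded = tf.saved_model.load(export_path)
infer = loaded.signatures["serving_default"]

# Pass each feature as a keyword argument matching the signature's input names.
result = infer(input1=tf.constant([[1]], dtype=tf.int32),
               input2=tf.constant([[2]], dtype=tf.int32))
print(result["sum"].numpy())
```

Since the keyword names come straight from `structured_input_signature`, this call stays valid across models with any number of inputs.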
To call it with requests:
import json
import requests
data = json.dumps({"signature_name": "serving_default", "instances": [{'input1':[x1], 'input2':[x2]}]})
headers = {"content-type": "application/json"}
json_response = requests.post('http://localhost:8501/v1/models/model:predict', data=data, headers=headers)
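Besides the row-oriented "instances" format above, the TensorFlow Serving REST API also accepts a columnar "inputs" payload, where each input name maps to a full batch of values. A sketch of building that payload; the x1/x2 values here are placeholder numbers:

```python
import json

# Placeholder feature values for a single example.
x1, x2 = 3, 7

# Columnar request format: each input name maps to a batch of values,
# shaped like the (None, 1) tensors in the model's signature.
payload = {
    "signature_name": "serving_default",
    "inputs": {"input1": [[x1]], "input2": [[x2]]},
}
data = json.dumps(payload)
print(data)
```

Either format works for multi-input models; "instances" groups values per example, while "inputs" groups them per feature.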
For saved_model_cli:
!saved_model_cli run --dir $export_path --tag_set serve --signature_def serving_default \
--input_exprs 'input1=[[x1]];input2=[[x2]]'