Tensorflow classifier.export_savedmodel (Beginner)


Problem description

I know about the "Serving a TensorFlow Model" page:

https://www.tensorflow.org/serving/serving_basic

but those functions assume you are using tf.Session(), which the DNNClassifier tutorial does not. I then looked at the API doc for DNNClassifier; it has an export_savedmodel function (the export function is deprecated) that seems simple enough, but I am getting a "'NoneType' object is not iterable" error, which is supposed to mean I'm passing in an empty variable, but I'm unsure what I need to change. I've essentially copied and pasted the code from the get_started/tflearn page on tensorflow.org and then added

  directoryName = "temp"

  def serving_input_fn():
    # only prints, so it implicitly returns None
    print("asdf")

  classifier.export_savedmodel(
    directoryName,
    serving_input_fn
  )

just after the classifier.fit function call. The other parameters for export_savedmodel are optional, I believe. Any ideas?

Code for the tutorial: https://www.tensorflow.org/get_started/tflearn#construct_a_deep_neural_network_classifier

API doc for export_savedmodel: https://www.tensorflow.org/api_docs/python/tf/contrib/learn/DNNClassifier#export_savedmodel

Recommended answer

There are two kinds of TensorFlow applications:

  • The functions that assume you are using tf.Session() come from "low level" TensorFlow examples, and
  • the DNNClassifier tutorial is a "high level" TensorFlow application.

I'm going to explain how to export "high level" TensorFlow models (using export_savedmodel).

The function export_savedmodel requires the argument serving_input_receiver_fn, which is a function without arguments that defines the input for the model and the predictor. Therefore, you must create your own serving_input_receiver_fn, where the model input type matches the model input in the training script, and the predictor input type matches the predictor input in the testing script.

On the other hand, if you create a custom model, you must define the export_outputs, built with the function tf.estimator.export.PredictOutput, whose input is a dictionary that defines a name that has to match the name of the predictor output in the testing script.

For example:

import tensorflow as tf

def serving_input_receiver_fn():
    # Receives serialized tf.Example protos under the key "predictor_inputs"
    # and parses them into the "words" feature expected by the model.
    serialized_tf_example = tf.placeholder(dtype=tf.string, shape=[None], name='input_tensors')
    receiver_tensors      = {"predictor_inputs": serialized_tf_example}
    feature_spec          = {"words": tf.FixedLenFeature([25], tf.int64)}
    features              = tf.parse_example(serialized_tf_example, feature_spec)
    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)

def estimator_spec_for_softmax_classification(logits, labels, mode):
    predicted_classes = tf.argmax(logits, 1)
    if (mode == tf.estimator.ModeKeys.PREDICT):
        # The output keys "pred_output_classes" and "probabilities" must match
        # what the testing script reads from the predictor's output dictionary.
        export_outputs = {'predict_output': tf.estimator.export.PredictOutput(
            {"pred_output_classes": predicted_classes, 'probabilities': tf.nn.softmax(logits)})}
        return tf.estimator.EstimatorSpec(
            mode=mode,
            predictions={'class': predicted_classes, 'prob': tf.nn.softmax(logits)},
            export_outputs=export_outputs)  # IMPORTANT!!!
    onehot_labels = tf.one_hot(labels, 31, 1, 0)
    loss          = tf.losses.softmax_cross_entropy(onehot_labels=onehot_labels, logits=logits)
    if (mode == tf.estimator.ModeKeys.TRAIN):
        optimizer = tf.train.AdamOptimizer(learning_rate=0.01)
        train_op  = optimizer.minimize(loss, global_step=tf.train.get_global_step())
        return tf.estimator.EstimatorSpec(mode, loss=loss, train_op=train_op)
    eval_metric_ops = {'accuracy': tf.metrics.accuracy(labels=labels, predictions=predicted_classes)}
    return tf.estimator.EstimatorSpec(mode=mode, loss=loss, eval_metric_ops=eval_metric_ops)

def model_custom(features, labels, mode):
    bow_column           = tf.feature_column.categorical_column_with_identity("words", num_buckets=1000)
    bow_embedding_column = tf.feature_column.embedding_column(bow_column, dimension=50)   
    bow                  = tf.feature_column.input_layer(features, feature_columns=[bow_embedding_column])
    logits               = tf.layers.dense(bow, 31, activation=None)
    return estimator_spec_for_softmax_classification(logits=logits, labels=labels, mode=mode)

def main():
    # ...
    # preprocess-> features_train_set and labels_train_set
    # ...
    classifier     = tf.estimator.Estimator(model_fn = model_custom)
    train_input_fn = tf.estimator.inputs.numpy_input_fn(
        x={"words": features_train_set}, y=labels_train_set,
        batch_size=batch_size_param, num_epochs=None, shuffle=True)
    classifier.train(input_fn=train_input_fn, steps=100)
    full_model_dir = classifier.export_savedmodel(
        export_dir_base="C:/models/directory_base",
        serving_input_receiver_fn=serving_input_receiver_fn)
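
As a side note on TF 1.x behavior (not something the snippet above relies on): export_savedmodel writes the model into a timestamped subdirectory of export_dir_base and returns that path as a bytes object, so a small sketch like this can normalize it before reusing it as a string path:

# Sketch, assuming TF 1.x: the returned export path is bytes; decode it for reuse as a string.
if isinstance(full_model_dir, bytes):
    full_model_dir = full_model_dir.decode("utf-8")
print("SavedModel exported to:", full_model_dir)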

Testing script

import tensorflow as tf

def main():
    # ...
    # preprocess-> features_test_set
    # ...
    with tf.Session() as sess:
        tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], full_model_dir)
        predictor   = tf.contrib.predictor.from_saved_model(full_model_dir)
        # Serialize the test features as a tf.Example, matching the feature_spec
        # ("words", 25 int64 values) used in serving_input_receiver_fn.
        model_input = tf.train.Example(features=tf.train.Features(
            feature={"words": tf.train.Feature(int64_list=tf.train.Int64List(value=features_test_set))}))
        model_input = model_input.SerializeToString()
        # The dictionary key matches receiver_tensors; the output key matches export_outputs.
        output_dict = predictor({"predictor_inputs": [model_input]})
        y_predicted = output_dict["pred_output_classes"][0]
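
If the input or output key names ever get out of sync, a sketch like the following (assuming the TF 1.x SavedModel loader API) can print the exported signatures so they can be checked against "predictor_inputs", "pred_output_classes", and "probabilities":

import tensorflow as tf

def print_signatures(saved_model_dir):
    # Load the SavedModel into a fresh graph and list each signature's name,
    # input keys, and output keys.
    with tf.Session(graph=tf.Graph()) as sess:
        meta_graph = tf.saved_model.loader.load(
            sess, [tf.saved_model.tag_constants.SERVING], saved_model_dir)
        for name, signature in meta_graph.signature_def.items():
            print("signature:", name)
            print("  inputs: ", list(signature.inputs.keys()))
            print("  outputs:", list(signature.outputs.keys()))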

(Code tested with Python 3.6.3 and TensorFlow 1.4.0)
