How to keep tensorflow session open between predictions? Loading from SavedModel


Question

I trained a tensorflow model that I'd like to run predictions on from numpy arrays. This is for image processing within videos. I will pass the images to the model as they arrive; not every frame is passed.

I reload my SavedModel within a session like so:

def run(self):                
    with tf.Session(graph=tf.Graph()) as sess:
        tf.saved_model.loader.load(sess,
                    [tf.saved_model.tag_constants.SERVING], "model")

My code works perfectly if I pass a list of images (self.tfimages) to the prediction. Condensed:

    softmax_tensor = sess.graph.get_tensor_by_name('final_ops/softmax:0')
    predictions = sess.run(softmax_tensor, {'Placeholder:0': self.tfimages})

But I won't have all the images at once. Do I really have to reload the model from file each time (it takes 2+ minutes)?

I would like to do something like this:

class tensorflow_model:
    def __init__(self):
        with tf.Session(graph=tf.Graph()) as self.sess:
            tf.saved_model.loader.load(self.sess,
                        [tf.saved_model.tag_constants.SERVING], "model")

    def predict(self):
        # Feed the image_data as input to the graph and get first prediction
        softmax_tensor = self.sess.graph.get_tensor_by_name('final_ops/softmax:0')

        predictions = self.sess.run(softmax_tensor, {'Placeholder:0': self.tfimages})

But this produces:

builtins.RuntimeError: Attempted to use a closed Session

Is there a way to keep a session open, or perhaps to load the SavedModel independent of a session?

EDIT: I tried the first answer, creating a session in two steps:

sess=tf.Session(graph=tf.Graph())
sess
<tensorflow.python.client.session.Session object at 0x0000021ACBB62EF0>
tf.saved_model.loader.load(sess,[tf.saved_model.tag_constants.SERVING], "model")
Traceback (most recent call last):
  Debug Probe, prompt 138, line 1
  File "C:\Program Files\Python35\Lib\site-packages\tensorflow\python\saved_model\loader_impl.py", line 222, in load
    saver.restore(sess, variables_path)
  File "C:\Program Files\Python35\Lib\site-packages\tensorflow\python\training\saver.py", line 1428, in restore
    {self.saver_def.filename_tensor_name: save_path})
  File "C:\Program Files\Python35\Lib\site-packages\tensorflow\python\client\session.py", line 774, in run
    run_metadata_ptr)
  File "C:\Program Files\Python35\Lib\site-packages\tensorflow\python\client\session.py", line 905, in _run
    raise RuntimeError('The Session graph is empty.  Add operations to the '
builtins.RuntimeError: The Session graph is empty.  Add operations to the graph before calling run().

with tf.Session(graph=tf.Graph()) as sess:
    tf.saved_model.loader.load(sess,[tf.saved_model.tag_constants.SERVING], "model")

executes without error.

As for the second idea, passing sess as a variable to the class is a good one. This works:

with tf.Session(graph=tf.Graph()) as sess:
    tf.saved_model.loader.load(sess,[tf.saved_model.tag_constants.SERVING], "model")
    tensorflow_instance=tensorflow(read_from="file")
    tensorflow_instance.predict(sess)

But this does not:

sess=tf.Session(graph=tf.Graph())
tf.saved_model.loader.load(sess,[tf.saved_model.tag_constants.SERVING], "model")
tensorflow_instance=tensorflow(read_from="file")
tensorflow_instance.predict(sess)

It would be pretty awkward to wrap my whole program into the with ... as sess statement.
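One way around that (a sketch, not from the original question or answer) is `contextlib.ExitStack`, which gives you `with`-style cleanup without nesting the whole program in one block. Here `_FakeSession` is a stand-in for `tf.Session`, purely to illustrate the lifetime management:

```python
import contextlib

class _FakeSession:
    """Stand-in for tf.Session; only the open/close lifetime matters here."""
    def __init__(self):
        self.closed = False
    def close(self):
        self.closed = True

# ExitStack registers cleanups to run later, instead of at the end of a
# lexical with-block:
stack = contextlib.ExitStack()
sess = stack.enter_context(contextlib.closing(_FakeSession()))
# ... load the SavedModel once, then run predictions anywhere in the program ...
stack.close()  # closes the session whenever the program decides it is done
```

With the real `tf.Session`, `stack.enter_context(tf.Session())` would work directly, since sessions are context managers themselves.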

Full code:

import tensorflow as tf
import sys
from google.protobuf import text_format
from tensorflow.core.framework import graph_pb2
import os
import glob

class tensorflow:

    def __init__(self, read_from):

        # frames to be analyzed
        self.tfimages = []

        find_photos = glob.glob("*.jpg")

        # Read in the image_data
        if read_from == "file":
            for x in find_photos:
                image_data = tf.gfile.FastGFile(x, 'rb').read()
                self.tfimages.append(image_data)

        # Loads label file, strips off carriage return
        self.label_lines = [line.rstrip() for line in tf.gfile.GFile("dict.txt")]

    def predict(self, sess):

        # Feed the image_data as input to the graph and get first prediction
        softmax_tensor = sess.graph.get_tensor_by_name('final_ops/softmax:0')

        predictions = sess.run(softmax_tensor, {'Placeholder:0': self.tfimages})
        for prediction in predictions:
            # Sort to show labels of first prediction in order of confidence
            top_k = prediction.argsort()[-len(prediction):][::-1]

            for node_id in top_k:
                human_string = self.label_lines[node_id]
                score = prediction[node_id]
                print('%s (score = %.5f)' % (human_string, score))
            return human_string

if __name__ == "__main__":
    with tf.Session(graph=tf.Graph()) as sess:
        tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], "model")
        tensorflow_instance = tensorflow(read_from="file")
        tensorflow_instance.predict(sess)

    sess = tf.Session(graph=tf.Graph())
    tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], "model")
    tensorflow_instance = tensorflow(read_from="file")
    tensorflow_instance.predict(sess)

Answer

Others have explained why you can't put your session in a with statement in the constructor.

The reason you see different behavior when using the context manager vs. not is because tf.saved_model.loader.load has some weird interactions between the default graph and the graph that is part of the session.

The solution is simple: don't pass a graph to the session if you're not using it in a with block:

sess=tf.Session()
tf.saved_model.loader.load(sess,[tf.saved_model.tag_constants.SERVING], "model")

Here's some example code for a class that does predictions:

class Model(object):

  def __init__(self, model_path):
    # Note, if you don't want to leak this, you'll want to turn Model into
    # a context manager. In practice, you probably don't have to worry
    # about it.
    self.session = tf.Session()

    tf.saved_model.loader.load(
        self.session,
        [tf.saved_model.tag_constants.SERVING],
        model_path)

    self.softmax_tensor = self.session.graph.get_tensor_by_name('final_ops/softmax:0')

  def predict(self, images):
    predictions = self.session.run(self.softmax_tensor, {'Placeholder:0': images})
    # TODO: convert to human-friendly labels
    return predictions


images = [tf.gfile.FastGFile(f, 'rb').read() for f in glob.glob("*.jpg")]
model = Model('model_path')
print(model.predict(images))

# Alternatively (uses less memory, but has lower throughput):
for f in glob.glob("*.jpg"):
  print(model.predict([tf.gfile.FastGFile(f, 'rb').read()]))
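Following the comment in the constructor above, `Model` could also be turned into a context manager so the session is always closed deterministically. A minimal sketch of that protocol, with `_FakeSession` standing in for `tf.Session` (the real version would call `tf.saved_model.loader.load` inside `__enter__`):

```python
class _FakeSession:
    """Stand-in for tf.Session; only the open/close lifetime matters here."""
    def __init__(self):
        self.closed = False
    def close(self):
        self.closed = True

class ManagedModel:
    """Opens a session on enter, closes it on exit, even on exceptions."""
    def __init__(self, model_path):
        self.model_path = model_path
    def __enter__(self):
        self.session = _FakeSession()
        # Real version: tf.saved_model.loader.load(self.session,
        #     [tf.saved_model.tag_constants.SERVING], self.model_path)
        return self
    def __exit__(self, exc_type, exc_value, tb):
        self.session.close()
        return False  # don't swallow exceptions

with ManagedModel("model_path") as m:
    held = m.session  # predictions would run in here
```

After the with-block, `held.closed` is True, so nothing leaks if predictions raise partway through.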
