Loading a huge Keras Model into a Flask App


Problem Description

I'm building a small Flask app that uses a convolutional neural net behind the scenes to make predictions on user-uploaded images. It works if I load it like this:

@app.route("/uploader", methods=["GET","POST"])
def get_image():
    if request.method == 'POST':
        f = request.files['file']
        sfname = 'static/'+str(secure_filename(f.filename))
        f.save(sfname)
        clf = catdog.classifier()
        return render_template('result.html', pred = clf.predict(sfname), imgpath = sfname)

However, this requires that the classifier (clf) be loaded after the user adds the image. That takes a while, since it has to set up all the weights for a 200+ layer neural network from a pickle file.

What I want to do is load all the weights when the app is spawned. To do this, I've tried the following (unrelated code for HTML templates/imports/app launch cut out):

# put model into memory on spawn
clf = catdog.classifier()
# Initialize the app
app = flask.Flask(__name__)

@app.route("/uploader", methods=["GET","POST"])
def get_image():
    if request.method == 'POST':
        f = request.files['file']
        sfname = 'static/'+str(secure_filename(f.filename))
        f.save(sfname)
        return render_template('result.html', pred = clf.predict(sfname), imgpath = sfname)

When I do this, I get this traceback (skipping all the Flask-specific frames at the top):

 File "/Users/zachariahmiller/Documents/Metis/test_area/flask_catdog/flask_backend.py", line 26, in get_image
    return render_template('result.html', pred = clf.predict(sfname), imgpath = sfname)
  File "/Users/zachariahmiller/Documents/Metis/test_area/flask_catdog/catdog.py", line 56, in predict
    prediction = self.model.predict(img_to_predict, batch_size=1, verbose=1)
  File "/Users/zachariahmiller/anaconda/lib/python2.7/site-packages/keras/engine/training.py", line 1569, in predict
    self._make_predict_function()
  File "/Users/zachariahmiller/anaconda/lib/python2.7/site-packages/keras/engine/training.py", line 1037, in _make_predict_function
    **kwargs)
  File "/Users/zachariahmiller/anaconda/lib/python2.7/site-packages/keras/backend/tensorflow_backend.py", line 2095, in function
    return Function(inputs, outputs, updates=updates)
  File "/Users/zachariahmiller/anaconda/lib/python2.7/site-packages/keras/backend/tensorflow_backend.py", line 2049, in __init__
    with tf.control_dependencies(self.outputs):
  File "/Users/zachariahmiller/anaconda/lib/python2.7/site-packages/tensorflow/python/framework/ops.py", line 3583, in control_dependencies
    return get_default_graph().control_dependencies(control_inputs)
  File "/Users/zachariahmiller/anaconda/lib/python2.7/site-packages/tensorflow/python/framework/ops.py", line 3314, in control_dependencies
    c = self.as_graph_element(c)
  File "/Users/zachariahmiller/anaconda/lib/python2.7/site-packages/tensorflow/python/framework/ops.py", line 2405, in as_graph_element
    return self._as_graph_element_locked(obj, allow_tensor, allow_operation)
  File "/Users/zachariahmiller/anaconda/lib/python2.7/site-packages/tensorflow/python/framework/ops.py", line 2484, in _as_graph_element_locked
    raise ValueError("Tensor %s is not an element of this graph." % obj)
ValueError: Tensor Tensor("dense_2/Softmax:0", shape=(?, 2), dtype=float32) is not an element of this graph.

I'm not sure why loading the classifier as a global object on the app, outside of the specific call, makes it fail. It should be in memory, and I've seen other examples of people doing this with SKLearn classifiers. Any ideas on why this causes this error?

Recommended Answer

Hello, I had the same problem.

I was running my Python server with threaded=True. Removing this lets mine work:

app.run(host='0.0.0.0', port=5000, threaded=True)

---->

app.run(host='0.0.0.0', port=5000)

The debug setting didn't seem to affect anything for me.
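
If you want to keep threaded=True, a common workaround is to force Keras to build its predict function at load time and to capture the TensorFlow graph the model was built in, then enter that graph inside the request handler. The following is only a sketch, not part of the confirmed fix above, and it assumes the catdog.classifier wrapper exposes its underlying Keras model as clf.model (the traceback suggests it uses self.model internally):

# sketch: pin predictions to the graph the model was built in (Keras 2.x / TF 1.x)
import tensorflow as tf

# put model into memory on spawn, build its predict function eagerly,
# and remember which graph it belongs to
clf = catdog.classifier()
clf.model._make_predict_function()
graph = tf.get_default_graph()

# Initialize the app
app = flask.Flask(__name__)

@app.route("/uploader", methods=["GET", "POST"])
def get_image():
    if request.method == 'POST':
        f = request.files['file']
        sfname = 'static/' + str(secure_filename(f.filename))
        f.save(sfname)
        # run the prediction inside the graph the model was created in
        with graph.as_default():
            pred = clf.predict(sfname)
        return render_template('result.html', pred=pred, imgpath=sfname)

Whether this is safe still depends on how catdog.classifier manages its session, so the simpler change above (dropping threaded=True) is the one that was confirmed to work.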
