Flask send stream as response

Problem description

I'm trying to "proxy" my Flask server (I'll call it Server #01) with another server (Server #02). It works well except for one thing: when Server #01 uses send_from_directory(), I don't know how to re-send that file.

My classic "proxy":

result = requests.get(my_path_to_server01)
return Response(stream_with_context(result.iter_content()), 
                content_type = result.headers['Content-Type'])

With a file as the response, it takes hours... So I tried many things. The one that works is:

result = requests.get(my_path_to_server01, stream=True)

with open('img.png', 'wb') as out_file:
    shutil.copyfileobj(result.raw, out_file)

return send_from_directory('./', 'img.png')

I would like to "redirect" my response (the "result" variable), or send/copy a stream of my file. In any case, I don't want to use a physical file, because that doesn't seem like the proper way to do it, and I can imagine all the problems it could cause.

Solution

There should not be any problem with your "classic" proxy other than that it should use stream=True, and specify a chunk_size for response.iter_content().

By default chunk_size is 1 byte, so the streaming will be very inefficient and consequently very slow. Using a larger chunk size, e.g. 10 KB, should yield much faster transfers. Here's some code for the proxy.

import requests
from flask import Flask, Response, stream_with_context

app = Flask(__name__)

my_path_to_server01 = 'http://localhost:5000/'

@app.route("/")
def streamed_proxy():
    # stream=True keeps the upstream body out of memory; iter_content()
    # below then re-streams it to the client in 10 KB chunks.
    r = requests.get(my_path_to_server01, stream=True)
    return Response(r.iter_content(chunk_size=10*1024),
                    content_type=r.headers['Content-Type'])

if __name__ == "__main__":
    app.run(port=1234)
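
If the upstream route is actually serving a file download (the send_from_directory case from the question), you may also want to pass a download-related response header through so the client still sees the original filename. This is only a minimal sketch, not part of the original answer; it reuses app and my_path_to_server01 from the code above and assumes the upstream sets a Content-Disposition header:

@app.route("/download")
def proxied_download():
    r = requests.get(my_path_to_server01, stream=True)
    # Forward only the header that matters for the download; relying on
    # Content-Disposition being present is an assumption about the upstream.
    passthrough = {}
    if 'Content-Disposition' in r.headers:
        passthrough['Content-Disposition'] = r.headers['Content-Disposition']
    return Response(r.iter_content(chunk_size=10*1024),
                    content_type=r.headers['Content-Type'],
                    headers=passthrough)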

You don't even need to use stream_with_context() here because you don't need access to the request context within the generator returned by iter_content().
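
For contrast, if the generator did need the request context (for example, because it reads flask.request while streaming in order to build the upstream URL from the incoming path), stream_with_context() is what keeps that context alive for the duration of the response. The following is only an illustrative sketch under that assumption, not something the answer requires:

import requests
from flask import Flask, Response, request, stream_with_context

app = Flask(__name__)

UPSTREAM = 'http://localhost:5000'  # hypothetical upstream base URL

@app.route("/<path:path>")
def context_proxy(path):
    def generate():
        # flask.request is touched inside the generator, so the request
        # context has to stay available while the response is streamed.
        r = requests.get(f"{UPSTREAM}/{path}", params=request.args, stream=True)
        yield from r.iter_content(chunk_size=10*1024)

    return Response(stream_with_context(generate()))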
