tornado flask close wait
Question
When I built a web application using Tornado + Flask, I ran into a problem: when I send a request to the application it returns nothing and hangs forever. Investigating, I found many connections stuck in CLOSE_WAIT on my Linux server machine. I don't know how to resolve this; can anybody help me? Here is the code where I use Tornado:
#coding=utf-8
from tornado.wsgi import WSGIContainer
from tornado.httpserver import HTTPServer
from tornado.ioloop import IOLoop
from service import app  # app is a Flask app defined in another file: app = Flask(__name__)
from config import SERVER_CONF
from appLog import logging

def startService():
    logging.info('start web,http://%s:%s/test' % (SERVER_CONF['IP'], SERVER_CONF['PORT']))
    try:
        http_server = HTTPServer(WSGIContainer(app))
        http_server.listen(SERVER_CONF['PORT'], address=SERVER_CONF['IP'])
        IOLoop.instance().start()
    except Exception as e:
        logging.error('start failed:')
        logging.error(e)

if __name__ == '__main__':
    startService()
Answer
My understanding is that you need to use FallbackHandler, as described in this answer.
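A minimal sketch of the FallbackHandler pattern the answer refers to, assuming a Flask app named `app` (the `/test` and `/health` routes here are illustrative, not from the original post): Tornado handles its own routes natively, and any unmatched path falls through to the Flask WSGI app.

```python
from flask import Flask
from tornado.web import Application, FallbackHandler, RequestHandler
from tornado.wsgi import WSGIContainer
from tornado.ioloop import IOLoop

app = Flask(__name__)

@app.route('/test')
def test():
    return 'hello from Flask'

class HealthHandler(RequestHandler):
    """A native Tornado handler, served without going through Flask."""
    def get(self):
        self.write('ok')

# Routes are tried in order; the catch-all '.*' rule hands anything
# Tornado didn't match over to the wrapped Flask application.
tornado_app = Application([
    ('/health', HealthHandler),
    ('.*', FallbackHandler, dict(fallback=WSGIContainer(app))),
])

if __name__ == '__main__':
    tornado_app.listen(8888)
    IOLoop.current().start()
```

Note that requests routed through `WSGIContainer` are still handled synchronously, so this only restores correct request/response behavior; it does not make the Flask parts asynchronous.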
That being said, I would strongly recommend against this approach. Tornado includes an excellent microframework of its own, which integrates much better with the server and is in many ways superior to Flask. If using Flask is important, I would recommend exploring a different way to ensure concurrency (e.g. multiple instances behind an nginx load balancer), or even taking a look at Sanic, which is both asynchronous and very similar to Flask.
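For comparison, here is a sketch of the same `/test` endpoint written with Tornado's own microframework instead of wrapping Flask (the handler and port are illustrative):

```python
import tornado.ioloop
import tornado.web

class TestHandler(tornado.web.RequestHandler):
    """Native Tornado handler; runs on the IOLoop with no WSGI layer."""
    def get(self):
        self.write('hello from Tornado')

def make_app():
    # Route table maps URL patterns directly to handler classes.
    return tornado.web.Application([('/test', TestHandler)])

if __name__ == '__main__':
    make_app().listen(8888)
    tornado.ioloop.IOLoop.current().start()
```

Because the handler runs directly on Tornado's event loop, slow clients and many concurrent connections are handled without blocking, which is exactly what the WSGI wrapper gives up.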