Is concurrency possible in Tornado?

Problem description

I understand Tornado is a single-threaded, non-blocking server, hence requests are handled sequentially (except when using the event-driven approach for I/O operations).

Is there a way to process multiple requests in parallel in Tornado for normal (non-I/O) execution? I can't fork multiple processes, since I need a common memory space across requests.

If it's not possible, please suggest other Python servers that can handle parallel requests and also support WSGI.

Recommended answer

If you are truly going to be dealing with multiple simultaneous requests that are compute-bound, and you want to do it in Python, then you need a multi-process server, not a multi-threaded one. CPython has a Global Interpreter Lock (GIL) that prevents more than one thread from executing Python bytecode at the same time.
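
As an illustration of the multi-process direction, here is a minimal sketch (not part of the original answer) that keeps a single Tornado front end but hands compute-bound work to a `concurrent.futures.ProcessPoolExecutor` via `IOLoop.run_in_executor`. The handler name, pool size, port, and the `heavy_compute` function are illustrative assumptions.

```python
from concurrent.futures import ProcessPoolExecutor

import tornado.ioloop
import tornado.web


def heavy_compute(n):
    # Stand-in for a CPU-bound job; it runs in a worker process,
    # so it does not hold the GIL of the Tornado process.
    return sum(i * i for i in range(n))


# Hypothetical pool size; tune to the number of CPU cores available.
executor = ProcessPoolExecutor(max_workers=4)


class ComputeHandler(tornado.web.RequestHandler):
    async def get(self):
        loop = tornado.ioloop.IOLoop.current()
        # Awaiting the executor future keeps this single-threaded
        # process free to accept other requests in the meantime.
        result = await loop.run_in_executor(executor, heavy_compute, 10_000_000)
        self.write({"result": result})


if __name__ == "__main__":
    app = tornado.web.Application([(r"/compute", ComputeHandler)])
    app.listen(8888)
    tornado.ioloop.IOLoop.current().start()
```

Note that arguments and results cross the process boundary by pickling, so this pattern does not by itself give the workers a shared mutable memory space; shared state would have to live in the parent process or in `multiprocessing` primitives.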

Most web applications do very little computation; instead they are waiting for I/O, whether from the database, the disk, or services on other servers. Be sure you really need to handle compute-bound requests before discarding Tornado.
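
For the common I/O-bound case the answer alludes to, a coroutine handler is usually enough. The sketch below (with an illustrative upstream URL) shows a single-threaded Tornado process serving many requests concurrently while each one awaits a network call.

```python
import tornado.web
from tornado.httpclient import AsyncHTTPClient


class ProxyHandler(tornado.web.RequestHandler):
    async def get(self):
        # While this fetch waits on the network, the event loop
        # switches to other pending requests; no extra threads or
        # processes are involved.
        response = await AsyncHTTPClient().fetch("http://example.com/")
        self.write(response.body)
```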
