Why coroutines cannot be used with run_in_executor?


Question

I want to run a service that requests URLs using coroutines and multiple threads. However, I cannot pass coroutines to the workers in the executor. See the code below for a minimal example of this issue:

import time
import asyncio
import concurrent.futures

EXECUTOR = concurrent.futures.ThreadPoolExecutor(max_workers=5)

async def async_request(loop):
    await asyncio.sleep(3)

def sync_request(_):
    time.sleep(3)

async def main(loop):
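    # This call raises TypeError: run_in_executor rejects coroutine
    # functions such as async_request (see the traceback below).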
    futures = [loop.run_in_executor(EXECUTOR, async_request, loop)
               for x in range(10)]

    await asyncio.wait(futures)

loop = asyncio.get_event_loop()
loop.run_until_complete(main(loop))

This results in the following error:

Traceback (most recent call last):
  File "co_test.py", line 17, in <module>
    loop.run_until_complete(main(loop))
  File "/usr/lib/python3.5/asyncio/base_events.py", line 387, in run_until_complete
    return future.result()
  File "/usr/lib/python3.5/asyncio/futures.py", line 274, in result
    raise self._exception
  File "/usr/lib/python3.5/asyncio/tasks.py", line 239, in _step
    result = coro.send(None)
  File "co_test.py", line 10, in main
    futures = [loop.run_in_executor(EXECUTOR, req,loop) for x in range(10)]
  File "co_test.py", line 10, in <listcomp>
    futures = [loop.run_in_executor(EXECUTOR, req,loop) for x in range(10)]
  File "/usr/lib/python3.5/asyncio/base_events.py", line 541, in run_in_executor
    raise TypeError("coroutines cannot be used with run_in_executor()")
TypeError: coroutines cannot be used with run_in_executor()

I know that I could use the sync_request function instead of async_request; in that case I would get coroutines by sending the blocking function to another thread.
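For comparison, that working alternative would look roughly like this (a sketch reusing the sync_request and EXECUTOR defined above; run_in_executor accepts sync_request because it is a plain blocking function and wraps its result in an awaitable future):

async def main(loop):
    # sync_request blocks for 3 seconds inside a worker thread; the ten
    # calls overlap across the 5 threads of EXECUTOR.
    futures = [loop.run_in_executor(EXECUTOR, sync_request, None)
               for _ in range(10)]
    await asyncio.wait(futures)

loop = asyncio.get_event_loop()
loop.run_until_complete(main(loop))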

I also know I could call async_request ten times in the event loop, something like the code below:

loop = asyncio.get_event_loop()
futures = [async_request(loop) for i in range(10)]
loop.run_until_complete(asyncio.wait(futures))

But in that case I would be using a single thread.

How could I combine both scenarios, with the coroutines working across multiple threads? As you can see in the code, I am passing (but not using) the pool to async_request, in the hope that I can write something that tells the worker to create a future, send it to the pool, and asynchronously (freeing the worker) wait for the result.

The reason I want to do this is to make the application scalable. Is that an unnecessary step? Should I simply have a thread per URL and be done with it? Something like:

LEN = len(list_of_urls)
EXECUTOR = concurrent.futures.ThreadPoolExecutor(max_workers=LEN)

Would that be good enough?
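Spelled out, that thread-per-URL idea would look roughly like this (a sketch with no asyncio at all; list_of_urls and the blocking fetch helper are placeholders):

import concurrent.futures
import urllib.request

def fetch(url):
    # Plain blocking download; each call occupies one thread.
    with urllib.request.urlopen(url) as resp:
        return resp.read()

list_of_urls = ["http://example.com/"] * 10  # placeholder URLs

LEN = len(list_of_urls)
EXECUTOR = concurrent.futures.ThreadPoolExecutor(max_workers=LEN)
results = list(EXECUTOR.map(fetch, list_of_urls))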

Answer

You have to create and set a new event loop in the thread context in order to run coroutines:

import asyncio
from concurrent.futures import ThreadPoolExecutor


def run(corofn, *args):
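    # Each executor thread needs its own event loop; the loop owned by the
    # main thread cannot be used to run coroutines from a worker thread.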
    loop = asyncio.new_event_loop()
    try:
        coro = corofn(*args)
        asyncio.set_event_loop(loop)
        return loop.run_until_complete(coro)
    finally:
        loop.close()


async def main():
    loop = asyncio.get_event_loop()
    executor = ThreadPoolExecutor(max_workers=5)
    futures = [
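        # run() is a plain function, so run_in_executor accepts it;
        # asyncio.sleep(1, x) resolves to x, which gather() collects below.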
        loop.run_in_executor(executor, run, asyncio.sleep, 1, x)
        for x in range(10)]
    print(await asyncio.gather(*futures))
    # Prints: [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]


if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
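Applied to the question's async_request, the same pattern would look roughly like this (a sketch reusing the run helper and imports above; request_all is a hypothetical name, and the loop argument is dropped because each worker thread drives its own loop):

async def async_request():
    await asyncio.sleep(3)

async def request_all():
    loop = asyncio.get_event_loop()
    executor = ThreadPoolExecutor(max_workers=5)
    # run(async_request) executes in a worker thread, which creates the
    # coroutine and drives it to completion on that thread's own loop.
    futures = [loop.run_in_executor(executor, run, async_request)
               for _ in range(10)]
    await asyncio.wait(futures)

Each worker drives a single coroutine at a time on its private loop, so the parallelism here comes from the threads rather than from asyncio itself.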
