How can I wrap a synchronous function in an async coroutine?


Problem description

I'm using aiohttp to build an API server that sends TCP requests off to a separate server. The module that sends the TCP requests is synchronous and a black box for my purposes. So my problem is that these requests are blocking the entire API. I need a way to wrap the module requests in an asynchronous coroutine that won't block the rest of the API.

So, just using sleep as a simple example, is there any way to somehow wrap time-consuming synchronous code in a non-blocking coroutine, something like this:

async def sleep_async(delay):
    # After calling sleep, loop should be released until sleep is done
    yield sleep(delay)
    return 'I slept asynchronously'


Answer

Eventually I found an answer in this thread. The method I was looking for is run_in_executor. This allows a synchronous function to be run asynchronously without blocking an event loop.

In the sleep example I posted above, it might look like this:

import asyncio
from time import sleep

async def sleep_async(loop, delay):
    # None uses the default executor (ThreadPoolExecutor)
    await loop.run_in_executor(None, sleep, delay)
    return 'I slept asynchronously'
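
To see it run end to end, a small driver like this works (my own sketch, reusing the sleep_async coroutine and imports defined above):

loop = asyncio.get_event_loop()
# time.sleep runs in a worker thread via the default executor,
# so the event loop itself is never blocked
result = loop.run_until_complete(sleep_async(loop, 1))
print(result)  # -> I slept asynchronously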

Also see the following answer -> How do we call a normal function where a coroutine is expected?
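
Tying this back to the aiohttp server from the question, a request handler can dispatch the blocking call the same way. This is a minimal sketch of my own, where blocking_request is a hypothetical stand-in for the synchronous black-box TCP module:

import asyncio
from time import sleep
from aiohttp import web

def blocking_request(payload):
    # Hypothetical stand-in for the synchronous black-box TCP module
    sleep(2)
    return 'response for ' + payload

async def handle(request):
    loop = asyncio.get_event_loop()
    # The blocking call runs in the default ThreadPoolExecutor,
    # so other requests keep being served while it is in flight
    result = await loop.run_in_executor(None, blocking_request, request.query.get('q', ''))
    return web.Response(text=result)

app = web.Application()
app.router.add_get('/', handle)
web.run_app(app)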
