Rewrite a library with blocking requests into async ones
Problem description
There is a library that uses blocking requests in its core, and I would like to rewrite it into an asynchronous version, so could you please advise what the best/easiest strategy would be to do so.
The whole library, after several nested functions, calls one function:
def _send_http_request(self, url, payload, method='post', **kwargs):
    # type: (Text, Optional[Text], Text, dict) -> Response
    response = request(method=method, url=url, data=payload, **kwargs)
    return response
Just putting async in front of it won't work, since it is deeply nested in blocking functions, and rewriting everything would be way too much hassle.

I had a look at aiohttp, trio and asks and got a bit lost as to which one is better. I know about celery and dask, but I need async.
Recommended answer
You have several options:

1. Rewrite _send_http_request to be async (using, for example, aiohttp), and then rewrite all functions that use _send_http_request to be async as well. Yes, that is a lot of work, but this is how asyncio is fundamentally designed.
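As a rough illustration of option 1, here is a minimal sketch of what an async version of the inner function could look like with aiohttp. The class name and session handling are assumptions for the example, not the library's actual structure; every caller up the stack would likewise need to become async and use await:

```python
import asyncio
import inspect

import aiohttp


class AsyncClient:
    """Hypothetical async counterpart of the original class."""

    async def _send_http_request(self, url, payload, method='post', **kwargs):
        # One session per call keeps the sketch self-contained; in real
        # code you would normally create the ClientSession once and reuse it.
        async with aiohttp.ClientSession() as session:
            async with session.request(method=method, url=url,
                                       data=payload, **kwargs) as response:
                # Read the body before the session closes.
                body = await response.read()
                return response.status, body


# The method is now a coroutine function, so callers must await it:
print(inspect.iscoroutinefunction(AsyncClient._send_http_request))
```

Reusing a single ClientSession across requests (instead of one per call, as above) is the idiomatic aiohttp pattern, since it lets connections be pooled.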
2. Wrap only the top-level blocking functions (the ones doing I/O) that you need to run asynchronously with run_in_executor, as explained here. If you aren't going to make millions of requests, you won't see much performance difference from the option above, since the main bottleneck is still I/O. Otherwise, the thread overhead will be noticeable compared to a pure asyncio solution.
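A sketch of option 2, using a hypothetical top-level blocking function fetch_report standing in for whatever library entry point ultimately calls _send_http_request:

```python
import asyncio
import functools


def fetch_report(url):
    # Hypothetical blocking library entry point; internally it would
    # end up calling _send_http_request via the requests library.
    return "report from %s" % url


async def fetch_report_async(url):
    loop = asyncio.get_running_loop()
    # Run the blocking call in the default ThreadPoolExecutor so the
    # event loop stays free while the request is in flight.
    return await loop.run_in_executor(
        None, functools.partial(fetch_report, url))


async def main():
    # Several blocking calls now run concurrently in worker threads.
    results = await asyncio.gather(
        fetch_report_async("https://example.com/a"),
        fetch_report_async("https://example.com/b"),
    )
    print(results)


asyncio.run(main())
```

The library itself stays untouched; only the outermost functions you call get an async wrapper.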
3. Try another solution instead of asyncio. For example, gevent and its monkey-patching. This approach has its own pros and cons.
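A minimal illustration of the gevent approach: patch_all() replaces blocking stdlib calls (sockets, sleep, and therefore requests under the hood) with cooperative ones, so the library's code can stay unchanged. The example uses time.sleep as a stand-in for a blocking HTTP call so it runs without network access:

```python
from gevent import monkey
monkey.patch_all()  # must run before the library (and requests) are imported

import time

import gevent


def blocking_task(name, seconds):
    # After patch_all(), time.sleep is gevent's cooperative sleep, so
    # these "blocking" calls run concurrently inside greenlets.
    time.sleep(seconds)
    return name


jobs = [gevent.spawn(blocking_task, n, 0.1) for n in ("a", "b", "c")]
gevent.joinall(jobs)
print([job.value for job in jobs])  # all three finish in ~0.1 s total
```

The upside is that no code in the library changes; the downside is that monkey-patching is global and can interact badly with other parts of your process.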