Python's asyncio works synchronously


Problem description

I'm trying to leverage Python's new asyncio library to send asynchronous HTTP requests. I want to wait a few milliseconds (the timeout variable) before sending each request, but of course send them all asynchronously and not wait for a response after each request is sent.

I'm doing the following:

@asyncio.coroutine
def handle_line(self, line, destination):
    print("Inside! line {} destination {}".format(line, destination))
    response = yield from aiohttp.request('POST', destination, data=line,
                               headers=tester.headers)
    print(response.status)
    return (yield from response.read())

@asyncio.coroutine
def send_data(self, filename, timeout):
    destination='foo'
    logging.log(logging.DEBUG, 'sending_data')
    with open(filename) as log_file:
        for line in log_file:
            try:
                json_event = json.loads(line)
            except ValueError as e:
                print("Error parsing json event")
            time.sleep(timeout)
            yield from asyncio.async(self.handle_line(json.dumps(json_event), destination))


loop=asyncio.get_event_loop().run_until_complete(send_data('foo.txt', 1))

The output that I am getting (by printing the 200 responses) looks like this code is running synchronously. What am I doing wrong?

Solution

There are two problems here:

  1. You should use asyncio.sleep, not time.sleep, because the latter will block the event loop.

  2. You shouldn't be using yield from after the asyncio.async(self.handle_line(...)) call, because that will make the script block until the self.handle_line coroutine is complete, which means you don't end up doing anything concurrently; you process each line, wait for the processing to complete, then move on to the next line. Instead, you should run all the asyncio.async calls without waiting, save the Task objects returned to a list, and then use asyncio.wait to wait for them all to complete once you've started them all.
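The difference between the two scheduling styles can be seen without aiohttp at all. The sketch below (modern async/await syntax, with `asyncio.sleep` standing in for a slow HTTP request) times both approaches; the names `handle`, `sequential`, and `concurrent` are illustrative, not from the answer:

```python
import asyncio
import time

async def handle(i):
    # Simulates a slow I/O-bound operation (stands in for an HTTP request)
    await asyncio.sleep(0.1)
    return i

async def sequential():
    # Awaiting each coroutine immediately: one finishes before the next starts
    results = []
    for i in range(5):
        results.append(await handle(i))
    return results

async def concurrent():
    # Schedule all tasks first, then wait for them all together
    tasks = [asyncio.ensure_future(handle(i)) for i in range(5)]
    return await asyncio.gather(*tasks)

start = time.monotonic()
asyncio.run(sequential())
seq_elapsed = time.monotonic() - start  # roughly 5 * 0.1s

start = time.monotonic()
asyncio.run(concurrent())
conc_elapsed = time.monotonic() - start  # roughly 0.1s, all sleeps overlap

print(seq_elapsed > conc_elapsed)
```

The sequential version takes about five times as long, because each `await` blocks the coroutine until that one task is done, which is exactly the effect described in point 2.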

Putting it all together:

@asyncio.coroutine
def handle_line(self, line, destination):
    print("Inside! line {} destination {}".format(line, destination))
    response = yield from aiohttp.request('POST', destination, data=line,
                               headers=tester.headers)
    print(response.status)
    return (yield from response.read())

@asyncio.coroutine
def send_data(self, filename, timeout):
    destination='foo'
    logging.log(logging.DEBUG, 'sending_data')
    tasks = []
    with open(filename) as log_file:
        for line in log_file:
            try:
                json_event = json.loads(line)
            except ValueError as e:
                print("Error parsing json event")
                continue  # skip malformed lines; json_event would be undefined
            yield from asyncio.sleep(timeout)
            tasks.append(asyncio.async(
                 self.handle_line(json.dumps(json_event), destination)))
    yield from asyncio.wait(tasks)


asyncio.get_event_loop().run_until_complete(send_data('foo.txt', 1))
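Note that `asyncio.async` was deprecated in Python 3.4.4 in favor of `asyncio.ensure_future` (and `asyncio.create_task` in 3.7+), and the `@asyncio.coroutine`/`yield from` style has been replaced by `async`/`await`. A sketch of the same staggered-start pattern in modern syntax, with the aiohttp POST replaced by a stub coroutine so it runs standalone:

```python
import asyncio
import json
import tempfile

async def handle_line(line, destination):
    # Stub standing in for the aiohttp POST in the answer above
    await asyncio.sleep(0.05)
    return (destination, line)

async def send_data(filename, timeout):
    destination = 'foo'
    tasks = []
    with open(filename) as log_file:
        for line in log_file:
            try:
                json_event = json.loads(line)
            except ValueError:
                continue  # skip malformed lines
            await asyncio.sleep(timeout)
            # create_task schedules the coroutine without waiting for it
            tasks.append(asyncio.create_task(
                handle_line(json.dumps(json_event), destination)))
    # gather returns the results in the order the tasks were created
    return await asyncio.gather(*tasks)

# Write a small sample file so the sketch is self-contained
with tempfile.NamedTemporaryFile('w', suffix='.txt', delete=False) as f:
    f.write('{"a": 1}\n{"b": 2}\n')
    path = f.name

results = asyncio.run(send_data(path, 0.01))
print(results)  # [('foo', '{"a": 1}'), ('foo', '{"b": 2}')]
```

The structure is the same as the answer's code: sleep between starts, schedule each request as a task, and wait for all of them at the end.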

