Send multiple async POST requests with Tornado


Problem Description

There are several questions on Stack Overflow regarding Tornado, but I still haven't found an answer to mine. I have a big text file that I want to iterate over, sending each line as a POST HTTP request. I want to do this asynchronously (it needs to be fast) and then check the responses to the requests.

I have something like this:

import json
from tornado import httpclient

http_client = httpclient.AsyncHTTPClient()
with open(filename) as log_file:
    for line in log_file:
        request = httpclient.HTTPRequest(self.destination,
                                         method="POST",
                                         headers=self.headers,
                                         body=json.dumps(line))
        response = http_client.fetch(request, callback=self.handle_request)

Looking at tcpdump, this does not do anything; all I get is a series of "Future" objects. I also tried putting the fetch command behind a yield and iterating over it while using the @gen.coroutine decorator on the method. That did not help. Can anyone please tell me what I am doing wrong?

Thanks!

Recommended Answer

Here's how you'd use "fetch" in a coroutine:

from tornado import gen, httpclient, ioloop

filename = 'filename.txt'
destination = 'http://localhost:5000'
http_client = httpclient.AsyncHTTPClient()


@gen.coroutine
def post():
    with open(filename) as log_file:
        for line in log_file:
            request = httpclient.HTTPRequest(destination,
                                             body=line,
                                             method="POST")

            response = yield http_client.fetch(request)
            print(response)

ioloop.IOLoop.current().run_sync(post)
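
Note that in this version each line is only sent after the response to the previous one has come back. If you just want to fire off all the requests and then wait for the whole set of responses, a coroutine can also yield a list of Futures, which Tornado resolves in parallel. Here is a minimal sketch under that assumption; it reuses the filename, destination and http_client defined above, and post_all is just an illustrative name, not part of the original answer:

@gen.coroutine
def post_all():
    with open(filename) as log_file:
        # Start a fetch for every line without waiting on each one.
        futures = [http_client.fetch(httpclient.HTTPRequest(destination,
                                                            body=line,
                                                            method="POST"))
                   for line in log_file]
        # Yielding the list waits until every request has finished;
        # a failed fetch raises HTTPError out of this yield.
        responses = yield futures
        for response in responses:
            print(response)


ioloop.IOLoop.current().run_sync(post_all)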

You can test this with a little server that receives the lines and prints them:

from tornado import ioloop, web


class MyHandler(web.RequestHandler):
    def post(self):
        print(self.request.body.rstrip())


app = web.Application([
    web.URLSpec('/', MyHandler)
])

app.listen(port=5000)
ioloop.IOLoop.current().start()

First run the server code, and then the client.

If you want to post up to 10 log lines at a time in parallel, install Toro and do:

from tornado import gen, ioloop
from tornado.httpclient import AsyncHTTPClient, HTTPRequest
from toro import JoinableQueue

filename = 'tox.ini'
destination = 'http://localhost:5000'
AsyncHTTPClient.configure("tornado.simple_httpclient.SimpleAsyncHTTPClient",
                          max_clients=10)

http_client = AsyncHTTPClient()
q = JoinableQueue(maxsize=10)


@gen.coroutine
def read():
    with open(filename) as log_file:
        for line in log_file:
            yield q.put(line)


@gen.coroutine
def post():
    while True:
        line = yield q.get()
        request = HTTPRequest(destination,
                              body=line,
                              method="POST")

        # Don't yield, just keep going as long as there's work in the queue.
        future = http_client.fetch(request)

        def done_callback(future):
            q.task_done()
            try:
                print(future.result())
            except Exception as exc:
                print(exc)

        future.add_done_callback(done_callback)



# Start coroutines.
read()
post()

# Arrange to stop loop when queue is finished.
loop = ioloop.IOLoop.current()
join_future = q.join()


def done(future):
    loop.stop()

join_future.add_done_callback(done)

loop.start()
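
As an aside, Toro was later folded into Tornado itself: starting with Tornado 4.2, tornado.queues.Queue provides the same put / get / task_done / join interface, so on newer versions the bounded-parallel approach above does not need the extra dependency. A rough sketch under that assumption (worker and main are illustrative names, and parallelism here comes from running several consumer coroutines instead of un-yielded fetches):

from tornado import gen, ioloop
from tornado.httpclient import AsyncHTTPClient, HTTPRequest
from tornado.queues import Queue  # available since Tornado 4.2

filename = 'tox.ini'
destination = 'http://localhost:5000'
http_client = AsyncHTTPClient()
q = Queue(maxsize=10)


@gen.coroutine
def read():
    # Producer: push log lines onto the bounded queue.
    with open(filename) as log_file:
        for line in log_file:
            yield q.put(line)


@gen.coroutine
def worker():
    # Consumer: post lines one at a time; several workers run concurrently.
    while True:
        line = yield q.get()
        try:
            response = yield http_client.fetch(HTTPRequest(destination,
                                                           body=line,
                                                           method="POST"))
            print(response)
        except Exception as exc:
            print(exc)
        finally:
            q.task_done()


@gen.coroutine
def main():
    read()
    for _ in range(10):
        worker()
    # Resolves once every queued line has been posted and marked done.
    yield q.join()


ioloop.IOLoop.current().run_sync(main)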
