Python: How can I send multiple HTTP requests at the same time? (like fork)


Question


Let's say that I have a way to send an HTTP request to a server. How is it possible to send two (or more) of these requests to the server at the same time? For example, maybe by forking a process? How can I do it? (I'm also using Django.)

# This example is not tested...
import json  # the original used simplejson; the stdlib json module is a drop-in here

import requests

def tester(request):
    server_url = 'http://localhost:9000/receive'

    payload = {
        'd_test1': '1234',  # the original repeated 'd_test2'; dict keys must be unique
        'd_test2': 'demo',
        }
    json_payload = json.dumps(payload)

    headers = {'Content-Type': 'application/json'}  # requests computes Content-Length itself
    response = requests.post(server_url, data=json_payload, headers=headers, allow_redirects=True)

    if response.status_code == requests.codes.ok:
        print('Headers: {}\nResponse: {}'.format(response.headers, response.text))

Thanks!

Solution

I think you want to use threads here rather than forking off new processes. While threads are bad in some cases, that isn't true here. Also, I think you want to use concurrent.futures instead of using threads (or processes) directly.

For example, let's say you have 10 URLs, and you're currently fetching them one at a time, like this:

results = map(tester, urls)

But now, you want to send them 2 at a time. Just change it to this:

with concurrent.futures.ThreadPoolExecutor(max_workers=2) as pool:
    results = pool.map(tester, urls)

If you want to try 4 at a time instead of 2, just change the max_workers. In fact, you should probably experiment with different values to see what works best for your program.
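To make that concrete, here is a self-contained sketch of the same pattern. The fetch function and URLs are made up for illustration; in practice you would pass your own tester and real URLs:

```python
import concurrent.futures

def fetch(url):
    # Stands in for the real HTTP call (e.g. requests.get(url).text).
    return 'response for {}'.format(url)

urls = ['http://localhost:9000/receive?n={}'.format(i) for i in range(10)]

# Two worker threads run fetch over the URLs; map preserves input order.
with concurrent.futures.ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(fetch, urls))

print(results[0])  # → response for http://localhost:9000/receive?n=0
```

Note that `pool.map` returns results in the same order as the inputs, even though the calls overlap in time.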

If you want to do something a little fancier, see the documentation; the main ThreadPoolExecutor Example is almost exactly what you're looking for.

Unfortunately, in 2.7, this module doesn't come with the standard library, so you will have to install the backport from PyPI.

If you have pip installed, this should be as simple as:

pip install futures

… or maybe sudo pip install futures, on Unix.

And if you don't have pip, go get it first (follow the link above).


The main reason you sometimes want to use processes instead of threads is that you've got heavy CPU-bound computation, and you want to take advantage of multiple CPU cores. In Python, threading can't effectively use up all your cores. So, if the Task Manager/Activity Monitor/whatever shows that your program is using up 100% CPU on one core, while the others are all at 0%, processes are the answer. With futures, all you have to do is change ThreadPoolExecutor to ProcessPoolExecutor.
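For example, here is a hedged sketch of that one-line swap, using a made-up CPU-bound task (counting primes by trial division) in place of the HTTP work:

```python
import concurrent.futures

def count_primes(n):
    # Naive trial division; deliberately CPU-bound.
    count = 0
    for candidate in range(2, n):
        if all(candidate % d for d in range(2, int(candidate ** 0.5) + 1)):
            count += 1
    return count

def run():
    # The only change from the threaded version is the executor class.
    with concurrent.futures.ProcessPoolExecutor(max_workers=2) as pool:
        return list(pool.map(count_primes, [10, 100, 1000]))

if __name__ == '__main__':  # guard required so child processes don't re-run this
    print(run())  # → [4, 25, 168]
```

The `if __name__ == '__main__':` guard matters here: on platforms that spawn rather than fork, each worker process re-imports the module, and unguarded pool creation would recurse.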


Meanwhile, sometimes you need more than just "give me a magic pool of workers to run my tasks". Sometimes you want to run a handful of very long jobs instead of a bunch of little ones, or load-balance the jobs yourself, or pass data between jobs, or whatever. For that, you want to use multiprocessing or threading instead of futures.
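As an illustration of that lower-level control, here is a hedged sketch using threading plus a queue to hand jobs to two long-lived workers (the URLs and the 'fetched' result string are placeholders for real HTTP work):

```python
import queue
import threading

def worker(jobs, results):
    # Each worker pulls jobs until it sees the None sentinel, then exits.
    while True:
        url = jobs.get()
        if url is None:
            break
        results.put('fetched {}'.format(url))  # stands in for the HTTP call

jobs = queue.Queue()
results = queue.Queue()

threads = [threading.Thread(target=worker, args=(jobs, results)) for _ in range(2)]
for t in threads:
    t.start()

for i in range(5):
    jobs.put('http://localhost:9000/receive?n={}'.format(i))
for _ in threads:
    jobs.put(None)  # one sentinel per worker so every thread shuts down
for t in threads:
    t.join()

print(results.qsize())  # → 5
```

Unlike the executor version, this gives you full control over worker lifetime and job handoff, at the cost of managing the sentinels and joins yourself.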

Very rarely, even that is too high-level, and you want to tell Python directly to create a new child process or thread. For that, you go all the way down to os.fork (on Unix only) or thread.
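For completeness, a minimal Unix-only sketch of os.fork (the child here just exits immediately; a real child would send its own HTTP request first):

```python
import os

pid = os.fork()  # Unix only: returns 0 in the child, the child's pid in the parent
if pid == 0:
    # Child process: do the work here, then exit without parent-side cleanup.
    os._exit(0)
else:
    # Parent process: reap the child so it doesn't linger as a zombie.
    _, status = os.waitpid(pid, 0)
    print('child exit code: {}'.format(os.WEXITSTATUS(status)))
```

At this level you are responsible for everything the higher-level APIs do for you: reaping children, propagating errors, and collecting results (typically via a pipe or socket).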
