Send Simultaneous Requests in Python (all at once)


Problem Description

I'm trying to create a script that sends over 1000 requests to one page at the same time, using the requests library with 1000 threads. It seems to complete the first 50 or so requests within 1 second, whereas the other 9950 take considerably longer. I measured it like this:

import time
import threading
import requests

queueLock = threading.Lock()

def print_to_cmd(string):
    queueLock.acquire()
    print(string)
    queueLock.release()

def timed_get():
    start = time.time()
    resp = requests.get('http://test.net/', headers=header)  # header is defined elsewhere
    end = time.time()
    print_to_cmd(str(end - start))

I'm thinking the requests library is limiting how fast they are getting sent.

Does anybody know a way in Python to send the requests all at the same time? I have a VPS with 200mb upload, so that is not the issue; it's something to do with Python or the requests library limiting it. They all need to hit the website within 1 second of each other.

Thanks for reading and I hope somebody can help.
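The "within 1 second of each other" requirement from the question can be sketched with the standard library alone: have every thread block on a `threading.Barrier` and fire only once all of them have arrived. This is an illustrative sketch, not part of the original question; the `worker` function and the 20-thread count are assumptions, and the real `requests.get` call is only indicated by a comment.

```python
import threading
import time

N = 20                          # number of simultaneous "requests" (illustrative)
barrier = threading.Barrier(N)  # releases all N threads at the same moment
start_times = []
lock = threading.Lock()

def worker():
    # Block until all N threads have arrived, then everyone proceeds together.
    barrier.wait()
    now = time.time()
    with lock:
        start_times.append(now)
    # Here you would call requests.get(...) for the real request.

threads = [threading.Thread(target=worker) for _ in range(N)]
for t in threads:
    t.start()
for t in threads:
    t.join()

spread = max(start_times) - min(start_times)
print("start-time spread: %.3f seconds" % spread)
```

Because the barrier releases all threads at once, the spread between the earliest and latest start time is typically a few milliseconds, well inside the 1-second window.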

Recommended Answer

I have generally found that the best solution is to use an asynchronous library like tornado. However, the easiest solution I have found is to use ThreadPoolExecutor.

import requests
from concurrent.futures import ThreadPoolExecutor

def get_url(url):
    return requests.get(url)

# list_of_urls is your list of target URLs
with ThreadPoolExecutor(max_workers=50) as pool:
    print(list(pool.map(get_url, list_of_urls)))
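To see why the pool helps, here is a self-contained sketch that runs 50 simulated requests through the same pattern. The `fake_request` function and its 0.1-second sleep are stand-ins for `requests.get` and network latency, not part of the original answer; with 50 workers, all 50 calls overlap instead of running one after another.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_request(i):
    # Stand-in for requests.get: the sleep simulates network latency.
    time.sleep(0.1)
    return i

start = time.time()
with ThreadPoolExecutor(max_workers=50) as pool:
    # map submits all 50 calls; 50 workers run them concurrently.
    results = list(pool.map(fake_request, range(50)))
elapsed = time.time() - start
print("50 calls finished in %.2f seconds" % elapsed)
```

Run sequentially, 50 calls at 0.1 seconds each would take about 5 seconds; through the pool they finish in roughly the time of a single call.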

