Ideal Chunk Size for python requests


Problem Description

Is there any guideline on selecting chunk size?

I tried different chunk sizes, but none of them gave a download speed comparable to a browser or to wget.

Here is a snapshot of my code:

 r = requests.get(url, headers=headers, stream=True)
 total_length = r.headers.get('content-length')
 if total_length is not None:  # there may be no Content-Length header
     total_length = int(total_length)
 with open(filename, 'wb') as f:
     for chunk in r.iter_content(chunk_size=1024):
         f.write(chunk)

Any help would be appreciated.

I tried networks with different speeds, and I was able to achieve higher speed than on my home network. But when I tested wget and the browser, the speed was still not comparable.

Thanks

Recommended Answer

You will lose time switching between reads and writes, and the limit on chunk size is, AFAIK, only what you can store in memory. So as long as you aren't very concerned about keeping memory usage down, go ahead and specify a large chunk size, such as 1 MB (i.e. 1024 * 1024) or even 10 MB. Chunk sizes in the 1024-byte range (or even smaller, since it sounds like you've tested much smaller sizes) will slow the process down substantially.
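As a sketch of what that looks like in practice (the URL, filename, and the 1 MB figure are illustrative; the write loop is pulled into a helper so it works with any iterable of byte chunks):

```python
import io

CHUNK_SIZE = 1024 * 1024  # 1 MB, per the suggestion above

def save_chunks(chunks, fileobj):
    """Write an iterable of byte chunks (e.g. r.iter_content(CHUNK_SIZE))
    to an open binary file object; returns the total bytes written."""
    total = 0
    for chunk in chunks:
        if chunk:  # skip keep-alive empty chunks
            fileobj.write(chunk)
            total += len(chunk)
    return total

def download(url, path):
    import requests  # deferred so the helper above is usable standalone
    with requests.get(url, stream=True) as r:
        r.raise_for_status()
        with open(path, "wb") as f:
            return save_chunks(r.iter_content(chunk_size=CHUNK_SIZE), f)
```

With a 1 MB chunk size, the loop body runs roughly a thousand times less often than with 1024-byte chunks, which is where the speedup comes from.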

For a very heavy-duty situation where you want as much performance as possible out of your code, you could look at the io module for buffering, etc. But I think increasing the chunk size by a factor of 1000 or 10000 or so will probably get you most of the way there.
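To illustrate the io-module angle: wrapping a raw file in a large `io.BufferedWriter` batches many small writes into far fewer OS-level `write()` calls. A minimal sketch (the buffer size and write counts are illustrative figures, not tuned values):

```python
import io
import os
import tempfile

BUFFER_SIZE = 1024 * 1024  # 1 MB userspace buffer

# Create a scratch file to write into.
fd, path = tempfile.mkstemp()
os.close(fd)

raw = open(path, "wb", buffering=0)                      # unbuffered raw file
buffered = io.BufferedWriter(raw, buffer_size=BUFFER_SIZE)
for _ in range(1000):
    buffered.write(b"x" * 1024)   # 1000 small 1 KB writes, coalesced in memory
buffered.close()                  # flushes the buffer and closes the raw file

size = os.path.getsize(path)
os.remove(path)
```

The 1000 small writes land on disk in only a handful of system calls instead of 1000; the same idea is why a large `chunk_size` helps the download loop.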

