How to Speed Up Python's urllib2 when doing multiple requests
Question
I am making several HTTP requests to a particular host using Python's urllib2 library. Each time a request is made, a new TCP and HTTP connection is created, which takes a noticeable amount of time. Is there any way to keep the TCP/HTTP connection alive using urllib2?
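To see the cost being described: with plain urllib2 (renamed urllib.request in Python 3), each urlopen() call opens its own TCP connection to the host. A minimal self-contained sketch against a local test server (the server, host, and paths here are illustrative, not part of the original question) that counts how many distinct connections the server sees:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

seen_ports = []  # each distinct client port = a separate TCP connection

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        seen_ports.append(self.client_address[1])
        body = b'ok'
        self.send_response(200)
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging

server = HTTPServer(('127.0.0.1', 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = 'http://127.0.0.1:%d' % server.server_port

# Each urlopen() call sets up (and tears down) its own TCP connection.
for path in ('/foo', '/bar'):
    with urllib.request.urlopen(base + path) as resp:
        resp.read()

server.shutdown()
print(len(set(seen_ports)))  # → 2: one fresh connection per request
```

The server answers with HTTP/1.0 (the handler default), so the connection is closed after every response, which is exactly the repeated setup cost the question is about.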
Recommended answer
If you switch to httplib, you will have finer control over the underlying connection.
For example:
import httplib

# HTTPConnection takes a host (and optional port), not a full URL;
# 'example.com' is a placeholder for your target host.
conn = httplib.HTTPConnection('example.com')

conn.request('GET', '/foo')
r1 = conn.getresponse()
r1.read()  # the body must be fully read before sending the next request

conn.request('GET', '/bar')
r2 = conn.getresponse()
r2.read()

conn.close()
This sends two HTTP GETs over the same underlying TCP connection.
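In Python 3, httplib was renamed http.client, but the pattern is the same. A self-contained sketch (the local test server, host, and paths are placeholders added for illustration) that serves HTTP/1.1 so the connection stays open, and verifies that both requests really did share one TCP connection:

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

seen_ports = []  # one distinct client port = one underlying TCP connection

class Handler(BaseHTTPRequestHandler):
    protocol_version = 'HTTP/1.1'  # HTTP/1.1 keeps the connection alive by default

    def do_GET(self):
        seen_ports.append(self.client_address[1])
        body = ('hello from ' + self.path).encode()
        self.send_response(200)
        self.send_header('Content-Length', str(len(body)))  # required for keep-alive
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging

server = HTTPServer(('127.0.0.1', 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# One HTTPConnection object: both GETs reuse a single TCP connection.
conn = http.client.HTTPConnection('127.0.0.1', server.server_port)
conn.request('GET', '/foo')
body1 = conn.getresponse().read()
conn.request('GET', '/bar')
body2 = conn.getresponse().read()
conn.close()
server.shutdown()

print(len(set(seen_ports)))  # → 1: both requests shared one connection
```

Note that each response must be read in full before the next request is issued on the same connection, which is why read() is called between the two requests.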