How do I get the IP address from an HTTP request using the requests library?
I am making HTTP requests using the requests library in Python, but I need the IP address of the server that responded to the HTTP request, and I'm trying to avoid making two calls (and possibly getting a different IP address from the one that responded to the original request).
Is that possible? Does any python HTTP library allow me to do that?
PS: I also need to make HTTPS requests and use an authenticated proxy.
Update 1:
Example:
import requests
proxies = {
"http": "http://user:password@10.10.1.10:3128",
"https": "http://user:password@10.10.1.10:1080",
}
response = requests.get("http://example.org", proxies=proxies)
response.ip # This doesn't exist; it's just what I would like to do
Then, I would like to know which IP address the request connected to, via a method or property on the response. In other libraries, I was able to do that by finding the sock object and calling its getpeername() function.
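As a minimal illustration of what getpeername() reports (using only the standard library and a throwaway loopback listener, so no network access is needed):

```python
import socket

# Stand up a throwaway listener on localhost so the example is self-contained.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)
host, port = server.getsockname()

# Connect to it and ask the client socket which peer it actually reached.
client = socket.create_connection((host, port))
peer_ip, peer_port = client.getpeername()[:2]
print(peer_ip, peer_port)       # 127.0.0.1 and whatever port the OS chose

client.close()
server.close()
```

This is exactly the information the question wants, except that requests hides the socket several private attributes deep.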
It turns out that it's rather involved.

Here's a monkey-patch while using requests version 1.2.3: it wraps the _make_request method on HTTPConnectionPool to store the result of socket.getpeername() on the HTTPResponse instance.

For me on Python 2.7.3, this instance was available on response.raw._original_response.
from requests.packages.urllib3.connectionpool import HTTPConnectionPool

def _make_request(self, conn, method, url, **kwargs):
    # Call the original method, then record the peer address while the
    # connection's socket is still available.
    response = self._old_make_request(conn, method, url, **kwargs)
    sock = getattr(conn, 'sock', False)
    response.peer = sock.getpeername() if sock else None
    return response

HTTPConnectionPool._old_make_request = HTTPConnectionPool._make_request
HTTPConnectionPool._make_request = _make_request

import requests
r = requests.get('http://www.google.com')
print r.raw._original_response.peer
Yields:
('2a00:1450:4009:809::1017', 80, 0, 0)
Ah, but if there's a proxy involved or the response is chunked, HTTPConnectionPool._make_request isn't called. So here's a new version patching httplib.getresponse instead:
import httplib

def getresponse(self, *args, **kwargs):
    response = self._old_getresponse(*args, **kwargs)
    # Record the peer address while the socket is still open.
    response.peer = self.sock.getpeername() if self.sock else None
    return response

httplib.HTTPConnection._old_getresponse = httplib.HTTPConnection.getresponse
httplib.HTTPConnection.getresponse = getresponse

import requests

def check_peer(resp):
    # The patched httplib response is exposed as resp.raw._original_response.
    orig_resp = resp.raw._original_response
    return getattr(orig_resp, 'peer', None)
Running:
>>> r1 = requests.get('http://www.google.com')
>>> check_peer(r1)
('2a00:1450:4009:808::101f', 80, 0, 0)
>>> r2 = requests.get('https://www.google.com')
>>> check_peer(r2)
('2a00:1450:4009:808::101f', 443, 0, 0)
>>> r3 = requests.get('http://wheezyweb.readthedocs.org/en/latest/tutorial.html#what-you-ll-build')
>>> check_peer(r3)
('162.209.99.68', 80)
Also checked running with proxies set; proxy address is returned.
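Since httplib exists only on Python 2 (it became http.client on Python 3), a sketch of the same patch for Python 3 might look like the following. Note the peer address is captured before calling the original getresponse, because the base implementation may close the socket for non-keep-alive responses; the demo runs against a throwaway local server so no network is needed:

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

def getresponse(self, *args, **kwargs):
    # Grab the peer address first: the base implementation may tear down
    # self.sock before returning (e.g. for HTTP/1.0 responses).
    peer = self.sock.getpeername() if self.sock else None
    response = self._old_getresponse(*args, **kwargs)
    response.peer = peer
    return response

http.client.HTTPConnection._old_getresponse = http.client.HTTPConnection.getresponse
http.client.HTTPConnection.getresponse = getresponse

# Throwaway local server so the demo is self-contained.
class _Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        payload = b"ok"
        self.send_response(200)
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)
    def log_message(self, *args):
        pass  # keep the demo output quiet

server = HTTPServer(("127.0.0.1", 0), _Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/")
resp = conn.getresponse()
body = resp.read()
print(resp.peer)        # ('127.0.0.1', <server port>)

conn.close()
server.shutdown()
```

Because requests ultimately drives http.client on Python 3, the same peer attribute then shows up on resp.raw._original_response, as in the Python 2 version.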
Update 2016/01/19
est offers an alternative that doesn't need the monkey-patch:
rsp = requests.get('http://google.com', stream=True)
# Grab the IP while you can, before you consume the body!
print rsp.raw._fp.fp._sock.getpeername()
# Consuming the body calls read(); after that the socket is no longer available.
print rsp.content
Update 2016/05/19
From the comments (copying here for visibility), Richard Kenneth Niescior offers the following, confirmed working with requests 2.10.0 and Python 3.
rsp = requests.get(..., stream=True)
rsp.raw._connection.sock.getpeername()
Update 2019/02/22
Python 3 with requests version 2.19.1.
resp = requests.get(..., stream=True)
resp.raw._connection.sock.socket.getsockname()
Note that getsockname() returns the local end of the connection; to get the server's (peer) address, call getpeername() on the same object.
Update 2020/01/31
Python 3.8 with requests 2.22.0.
resp = requests.get('https://www.google.com', stream=True)
resp.raw._connection.sock.getsockname()
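Every variant above reaches into private attributes whose exact path shifts between requests/urllib3 versions, so a defensive helper that tries the known layouts in order may be more practical. This is a sketch rather than an official API: the attribute chains are the ones quoted in the updates above, and the demo uses stand-in objects instead of a live request so it runs without network access:

```python
def peer_ip(resp):
    # Best-effort peer address for a Response fetched with stream=True.
    # All of these are private attributes; try each known layout in turn.
    raw = getattr(resp, "raw", None)
    for path in (
        lambda: raw._connection.sock.getpeername(),         # requests >= 2.10
        lambda: raw._connection.sock.socket.getpeername(),  # wrapped-socket variants
        lambda: raw._fp.fp._sock.getpeername(),             # older urllib3
    ):
        try:
            return path()
        except AttributeError:
            continue  # this layout doesn't match the installed version
    return None

# Demo with stand-in objects mimicking a streamed Response's private layout.
class _Sock:
    def getpeername(self):
        return ("203.0.113.7", 443)  # TEST-NET-3 address, purely illustrative

class _Obj:
    pass

resp = _Obj()
resp.raw = _Obj()
resp.raw._connection = _Obj()
resp.raw._connection.sock = _Sock()
print(peer_ip(resp))  # ('203.0.113.7', 443)
```

With a real response the same caveat applies as in every update above: call it before consuming the body, while the connection is still attached.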