Reading a stream made by urllib2 never recovers when the connection is interrupted


Problem description

While trying to make one of my Python applications a bit more robust against connection interruptions, I discovered that calling the read function of an HTTP stream made by urllib2 may block the script forever.

I thought that the read function would time out and eventually raise an exception, but this does not seem to be the case when the connection is interrupted during a read call.

Here is the code that causes the problem:

import urllib2

while True:
    try:
        stream = urllib2.urlopen('http://www.google.de/images/nav_logo4.png')
        while stream.read(): pass
        print "Done"
    except:
        print "Error"

(If you try out the script, you will probably need to interrupt the connection several times before you reach the state from which the script never recovers.)

I watched the script via Winpdb and made a screenshot of the state from which the script never recovers (even after the network became available again).

Winpdb http://img10.imageshack.us/img10/6716/urllib2.jpg

Is there a way to create a Python script that will continue to work reliably even if the network connection is interrupted? (I would prefer to avoid doing this in an extra thread.)
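The hang can be reproduced locally, without actually unplugging the network. The sketch below (Python 3; the stub server is hypothetical and exists only to simulate a peer that stalls mid-download) accepts a connection and then sends nothing, so a read without a timeout would block indefinitely, while one with a timeout raises socket.timeout:

```python
import socket
import threading
import time

# Stub server: accept one connection, then stay silent,
# mimicking a download that stalls mid-stream.
srv = socket.socket()
srv.bind(('127.0.0.1', 0))      # let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]

def hold_silent():
    conn, _ = srv.accept()
    time.sleep(10)              # keep the connection open, send nothing
    conn.close()

threading.Thread(target=hold_silent, daemon=True).start()

client = socket.socket()
client.settimeout(1.0)          # without this, recv() would block forever
client.connect(('127.0.0.1', port))
timed_out = False
try:
    client.recv(1024)
except socket.timeout:
    timed_out = True
print("timed out:", timed_out)
```

The thread here only plays the role of the broken network; the client side, which is the part that matters for the question, stays single-threaded.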

Recommended answer

Try something like:

import socket
import urllib2

socket.setdefaulttimeout(5.0)  # seconds; applies to every socket opened afterwards

try:
    stream = urllib2.urlopen('http://www.google.de/images/nav_logo4.png')
    while stream.read(4096):
        pass
    print "Done"
except socket.timeout:
    print "Timed out, retrying"
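For reference, a Python 3 sketch of the same idea (urllib2 became urllib.request in Python 3; the fetch helper and its retry count are assumptions, not part of the original answer):

```python
import socket
import urllib.request  # Python 3 successor of urllib2

# One global default: every socket created afterwards -- including the
# ones urllib.request opens -- raises socket.timeout instead of
# blocking forever when the peer goes silent.
socket.setdefaulttimeout(5.0)

def fetch(url, retries=3):
    # Hypothetical retry wrapper: re-issue the request after a timeout.
    for attempt in range(1, retries + 1):
        try:
            with urllib.request.urlopen(url) as stream:
                data = b""
                while True:
                    chunk = stream.read(8192)
                    if not chunk:
                        return data
                    data += chunk
        except socket.timeout:
            print("attempt %d timed out, retrying" % attempt)
    raise IOError("gave up after %d timed-out attempts" % retries)
```

Alternatively, urlopen accepts a per-request timeout argument (urlopen(url, timeout=5.0)), which avoids changing the process-wide default for sockets opened by other libraries.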

