How to get python to successfully download large images from the internet


Question

So I have been using

urllib.request.urlretrieve(URL, FILENAME)


to download images off the internet. It works great, but fails on some images. The ones it fails on seem to be the larger images, e.g. http://i.imgur.com/DEKdmba.jpg. It downloads them, but when I try to open these files, photo viewer gives me the error "Windows Photo Viewer can't open this picture because the file appears to be damaged, corrupted or too large".


What might be the reason it can't download these, and how can I fix this?


EDIT: After looking further, I don't think the problem is large images - it manages to download larger ones. It just seems to be some random ones that it can never download, no matter how many times I run the script again. Now I'm even more confused.
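One way to narrow this down (a diagnostic sketch, not part of the original question; the helper name `download_and_verify` is made up here) is to compare the server's `Content-Length` header against the size of the file that actually lands on disk. A mismatch means the transfer was truncated, which would produce exactly the "damaged or corrupted" error Photo Viewer reports:

```python
import os
import urllib.request

def download_and_verify(url, filename):
    """Download url to filename, then check the on-disk size against
    the Content-Length header to detect a truncated transfer."""
    with urllib.request.urlopen(url) as response, open(filename, "wb") as f:
        # -1 means the server did not send a Content-Length header
        expected = int(response.headers.get("Content-Length", -1))
        f.write(response.read())
    actual = os.path.getsize(filename)
    # Treat an unknown expected size as "cannot verify" rather than failure
    return expected == -1 or actual == expected
```

If this returns False for exactly the files that won't open, the problem is an incomplete transfer rather than anything wrong with the saved bytes themselves.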

Answer


In the past, I have used this code for copying from the internet. I have had no trouble with large files.

def download(url):
    # Ported to Python 3 (urllib2 -> urllib.request, raw_input -> input,
    # print statement -> function) and completed: the original snippet
    # broke out of the loop without ever writing the buffer to disk.
    import urllib.request
    file_name = input("Name: ")
    u = urllib.request.urlopen(url)
    file_size = int(u.info().get("Content-Length"))
    print("Downloading: %s Bytes: %s" % (file_name, file_size))
    file_size_dl = 0
    block_size = 8192
    with open(file_name, 'wb') as f:
        while True:
            buffer = u.read(block_size)
            if not buffer:
                break
            file_size_dl += len(buffer)
            f.write(buffer)
    u.close()
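On Python 3 the same chunked copy can be written more compactly with `shutil.copyfileobj`, which streams between file-like objects in fixed-size blocks (a sketch; the `fetch` name is just illustrative):

```python
import shutil
import urllib.request

def fetch(url, filename, block_size=8192):
    # Stream the response to disk in block_size chunks so large
    # images never have to fit in memory all at once.
    with urllib.request.urlopen(url) as response, open(filename, "wb") as f:
        shutil.copyfileobj(response, f, block_size)
```

Either way, the key point is the same: read and write in blocks until `read()` returns an empty buffer, rather than assuming one call fetches the whole file.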

