Getting a timeout exception when using the WebClient DownloadData method to download images from a URL, even though the same URL downloads easily from a browser


Problem description

WebClient.DownloadData() takes too much time and throws a timeout exception when used in a foreach loop.










What I have tried:

    WebClient web = new WebClient();
    byte[] byt = web.DownloadData(new Uri(photopath));

Recommended answer

This isn't a question we can answer: it's down to you and the site you are accessing.
We don't know how busy the site is, how much data is being transferred, how long it takes. We don't know what the site does with repeated requests: some may deliberately delay responses to mitigate possible DDOS attacks.

So start by making notes on the URLs you are requesting, and try accessing them manually via a browser. How fast are they being served? How much data is being transferred? What happens if you start several of them in quick succession? Does anything change?

It may be that you need to extend timeouts, or that you need to slow the frequency at which you access the site - we can't tell. And possibly, contact the site admin and see what they say.

So start by gathering information - we can't do that for you!
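
If extending the timeout turns out to be the fix, note that WebClient does not expose a timeout property directly, but you can raise it by overriding GetWebRequest. The sketch below is one possible approach, not a guaranteed solution - the 5-minute timeout, the Thread.Sleep throttle, and the photoUrls list are all illustrative assumptions:

```csharp
using System;
using System.Net;
using System.Threading;

// WebClient's default request timeout is 100 seconds and cannot be set
// directly; overriding GetWebRequest lets us change it per request.
class TimeoutWebClient : WebClient
{
    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        request.Timeout = 300000; // 5 minutes, in milliseconds (assumed value)
        return request;
    }
}

class Program
{
    static void Main()
    {
        // Hypothetical list of image URLs standing in for the original loop.
        string[] photoUrls = { "http://example.com/a.jpg", "http://example.com/b.jpg" };

        using (var web = new TimeoutWebClient())
        {
            foreach (string photopath in photoUrls)
            {
                byte[] byt = web.DownloadData(new Uri(photopath));
                // Throttle successive requests in case the server is
                // deliberately delaying rapid repeated hits.
                Thread.Sleep(500);
            }
        }
    }
}
```

Reusing a single WebClient instance across the loop (rather than creating one per iteration) also avoids exhausting connections, which can itself show up as timeouts.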


