"Timeout while fetching" URLFetch GAE/J

Problem description
I'm using XmlReader to simply read a feed, like below.
URLConnection urlConnection = url.openConnection();
XmlReader reader = new XmlReader(urlConnection);
When this is called, I receive an IOException "Timeout while fetching" within 5 seconds. So I tried setting the timeouts to the max (10 sec), but still no luck: still an IOException after 5 seconds.
urlConnection.setConnectTimeout(10000);
(the max is stated in the documentation: http://code.google.com/intl/nl-NL/appengine/docs/java/urlfetch/overview.html)
It seems the feed is too large. When I call a smaller feed, it works properly. Is there any workaround or solution for this? I need to be able to call larger feeds.
You should use the setReadTimeout method, which sets the read deadline:
urlConnection.setReadTimeout(10000); //10 Sec
You should be able to download larger feeds within 10 seconds. If you still have problems, try fiddling with this different approach.
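Putting the pieces together, a minimal sketch of opening the feed connection with both the connect and read deadlines raised to URLFetch's 10-second maximum might look like this (the helper name and the feed URL are illustrative, not from the original post):

```java
import java.net.URL;
import java.net.URLConnection;

public class FeedFetch {

    // Opens a connection with both deadlines set to URLFetch's 10 s maximum.
    // The actual network request does not start until the connection is read.
    public static URLConnection openWithTimeouts(String feedUrl) throws Exception {
        URLConnection conn = new URL(feedUrl).openConnection();
        conn.setConnectTimeout(10000); // connect deadline: 10 s
        conn.setReadTimeout(10000);    // read deadline: 10 s (the missing piece)
        return conn;
    }

    public static void main(String[] args) throws Exception {
        URLConnection conn = openWithTimeouts("http://example.com/feed.xml");
        System.out.println(conn.getConnectTimeout()); // 10000
        System.out.println(conn.getReadTimeout());    // 10000
        // The configured connection can then be handed to XmlReader as in the question:
        // XmlReader reader = new XmlReader(conn);
    }
}
```

Setting only setConnectTimeout, as in the question, bounds the time to establish the connection; the "Timeout while fetching" error comes from the read phase, which is governed by setReadTimeout.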