How do you Programmatically Download a Webpage in Java
Problem Description
I would like to be able to fetch a web page's HTML and save it to a String, so I can do some processing on it. Also, how could I handle various types of compression?
How would I go about doing that using Java?
Recommended Answer
Here's some tested code using Java's URL class. I'd recommend doing a better job than I do here of handling the exceptions or passing them up the call stack, though.
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.MalformedURLException;
import java.net.URL;

public static void main(String[] args) {
    URL url;
    InputStream is = null;
    BufferedReader br;
    String line;
    try {
        url = new URL("http://stackoverflow.com/");
        is = url.openStream();  // throws an IOException
        br = new BufferedReader(new InputStreamReader(is));
        while ((line = br.readLine()) != null) {
            System.out.println(line);
        }
    } catch (MalformedURLException mue) {
        mue.printStackTrace();
    } catch (IOException ioe) {
        ioe.printStackTrace();
    } finally {
        try {
            if (is != null) is.close();
        } catch (IOException ioe) {
            // nothing to see here
        }
    }
}
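The code above prints each line rather than saving the page to a String, and it does not touch the compression part of the question. A minimal sketch of both, assuming the server honors an `Accept-Encoding: gzip` request header (the class name `PageFetcher` and method names here are illustrative, not from the original answer): accumulate lines into a `StringBuilder`, and if the response's `Content-Encoding` is gzip, wrap the raw stream in a `GZIPInputStream` before reading.

```java
import java.io.*;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.zip.*;

public class PageFetcher {

    // Wrap the raw stream according to the Content-Encoding header,
    // then read it fully into a String (one '\n' appended per line).
    static String readStream(InputStream raw, String encoding) throws IOException {
        InputStream in = "gzip".equalsIgnoreCase(encoding)
                ? new GZIPInputStream(raw)
                : raw;
        StringBuilder sb = new StringBuilder();
        try (BufferedReader br = new BufferedReader(
                new InputStreamReader(in, StandardCharsets.UTF_8))) {
            String line;
            while ((line = br.readLine()) != null) {
                sb.append(line).append('\n');
            }
        }
        return sb.toString();
    }

    // Fetch a page into a String, advertising gzip support and
    // decompressing the body if the server actually used it.
    public static String fetch(String pageUrl) throws IOException {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(pageUrl).openConnection();
        conn.setRequestProperty("Accept-Encoding", "gzip");
        try {
            return readStream(conn.getInputStream(), conn.getContentEncoding());
        } finally {
            conn.disconnect();
        }
    }
}
```

Keeping the decompression logic in its own `readStream` method means it can be exercised with an in-memory stream, without a network round trip. Note this sketch assumes a UTF-8 page; a fuller version would parse the charset from the `Content-Type` header.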