Performance issue with HttpURLConnection

Problem description

I'm establishing an HttpURLConnection to a web server with basically the following two methods:

private HttpURLConnection establishConnection(URL url) {
    HttpURLConnection conn = null;
    try {
        conn = (HttpURLConnection) url.openConnection();
        conn = authenticate(conn);
        conn.setRequestMethod(httpMethod);
        conn.setConnectTimeout(50000);
        conn.connect();
        input = conn.getInputStream();
        return conn;
    } catch (IOException e1) {
        e1.printStackTrace();
    }
    return null;
}

private HttpURLConnection authenticate(HttpURLConnection conn) {
    String userpass = webServiceUserName + ":" + webServicePassword;
    byte[] authEncBytes = Base64.encodeBase64(userpass.getBytes());
    String authStringEnc = new String(authEncBytes);
    conn.setRequestProperty("Authorization", "Basic " + authStringEnc);
    return conn;
}
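(The authenticate method above relies on the commons-codec Base64 class. On Java 8 and later the same header value can be built with the JDK's own java.util.Base64, dropping the extra dependency. A minimal sketch, with placeholder credentials:)

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class BasicAuth {
    // Builds the value for the "Authorization" request header.
    static String basicAuthHeader(String user, String password) {
        String userpass = user + ":" + password;
        String encoded = Base64.getEncoder()
                .encodeToString(userpass.getBytes(StandardCharsets.UTF_8));
        return "Basic " + encoded;
    }

    public static void main(String[] args) {
        System.out.println(basicAuthHeader("user", "pass")); // Basic dXNlcjpwYXNz
    }
}
```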

This works quite well: the server sends some XML file and I can continue with it. The problem I'm encountering is that I have to do about 220 of these requests, and they add up to about 25 s of processing time. The data is used in a web page, so a 25 s response time is not really acceptable. The code above takes about 86,000,036 ns (~86 ms) per request, so I'm searching for a way to improve the speed somehow. I tried using the org.apache.http.* package, but that was a bit slower than my current implementation.

Thanks,

Markus

Edit: input = conn.getInputStream(); is responsible for ~82-85 ms of that delay. Is there any way "around" it?
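(This is expected: url.openConnection() does no network I/O at all; the TCP connect, the request, and the wait for the first response bytes all happen lazily on getInputStream(), which is why the whole round-trip shows up there. A small self-contained sketch against the JDK's built-in com.sun.net.httpserver stub server makes the split visible:)

```java
import com.sun.net.httpserver.HttpServer;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class LazyConnectDemo {
    // Runs one request against a local stub server and returns the HTTP status.
    public static int runDemo() throws Exception {
        // JDK built-in server standing in for the real web service.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/", exchange -> {
            byte[] body = "<ok/>".getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, body.length);
            exchange.getResponseBody().write(body);
            exchange.close();
        });
        server.start();
        try {
            URL url = new URL("http://localhost:"
                    + server.getAddress().getPort() + "/");

            long t0 = System.nanoTime();
            HttpURLConnection conn = (HttpURLConnection) url.openConnection(); // no I/O yet
            long t1 = System.nanoTime();
            InputStream in = conn.getInputStream(); // request is actually sent here
            long t2 = System.nanoTime();
            while (in.read() != -1) { /* drain the body */ }
            in.close();

            System.out.println("openConnection: " + (t1 - t0) / 1000 + "us, "
                    + "getInputStream: " + (t2 - t1) / 1000 + "us");
            return conn.getResponseCode();
        } finally {
            server.stop(0);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("status = " + runDemo());
    }
}
```

So the ~85 ms is not getInputStream() itself being slow; it is the full request/response round-trip that can only be hidden by reusing connections or running requests in parallel.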

Edit 2: I used the connection manager as well:

PoolingHttpClientConnectionManager cm = new PoolingHttpClientConnectionManager();
cm.setMaxTotal(200);
cm.setDefaultMaxPerRoute(20);
HttpHost localhost = new HttpHost(webServiceHostName, 443);
cm.setMaxPerRoute(new HttpRoute(localhost), 50);
CredentialsProvider credsProvider = new BasicCredentialsProvider();
credsProvider.setCredentials(
        new AuthScope(webServiceHostName, 443),
        new UsernamePasswordCredentials(webServiceUserName, webServicePassword));
httpclient = HttpClients.custom()
        .setConnectionManager(cm)
        .setDefaultCredentialsProvider(credsProvider)
        .build();

But the runtime increases to ~40 s, and I get a warning from my Tomcat after every request that the cookie was rejected because of an "Illegal path attribute".

Answer

You may be able to get a substantial boost by downloading a number of files in parallel.

I had a project where I had to download 20 resources from a server over a satellite backhaul (around 700ms round-trip delay). Downloading them sequentially took around 30 seconds; 5 at a time took 6.5 seconds, 10 at a time took 3.5 seconds, and all 20 at once was a bit over 2.5 seconds.

Here is an example which performs multiple downloads concurrently and, if supported by the server, uses connection keep-alive.

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.http.HttpEntity;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.impl.conn.PoolingHttpClientConnectionManager;
import org.apache.http.protocol.BasicHttpContext;
import org.apache.http.protocol.HttpContext;
import org.apache.http.util.EntityUtils;

public class Downloader {
    private static final int MAX_REQUESTS_PER_ROUTE = 10;
    private static final int MAX_REQUESTS_TOTAL = 50;
    private static final int MAX_THREAD_DONE_WAIT = 60000;

    public static void main(String[] args) throws IOException,
            InterruptedException {

        long startTime = System.currentTimeMillis();

        // create connection manager and http client
        PoolingHttpClientConnectionManager cm = new PoolingHttpClientConnectionManager();
        cm.setDefaultMaxPerRoute(MAX_REQUESTS_PER_ROUTE);
        cm.setMaxTotal(MAX_REQUESTS_TOTAL);
        CloseableHttpClient httpclient = HttpClients.custom()
                .setConnectionManager(cm).build();

        // list of download items
        List<DownloadItem> items = new ArrayList<DownloadItem>();
        items.add(new DownloadItem("http://www.example.com/file1.xml"));
        items.add(new DownloadItem("http://www.example.com/file2.xml"));
        items.add(new DownloadItem("http://www.example.com/file3.xml"));
        items.add(new DownloadItem("http://www.example.com/file4.xml"));

        // create and start download threads
        DownloadThread[] threads = new DownloadThread[items.size()];
        for (int i = 0; i < items.size(); i++) {
            threads[i] = new DownloadThread(httpclient, items.get(i));
            threads[i].start();
        }

        // wait for all threads to complete
        for (int i = 0; i < items.size(); i++) {
            threads[i].join(MAX_THREAD_DONE_WAIT);
        }

        // use content (content stays null if a download failed)
        for (DownloadItem item : items) {
            System.out.println("uri: " + item.uri + ", status-code: "
                    + item.statusCode + ", content-length: "
                    + (item.content != null ? item.content.length : -1));
        }

        // done with http client
        httpclient.close();

        System.out.println("Time to download: "
                + (System.currentTimeMillis() - startTime) + "ms");
    }

    static class DownloadItem {
        String uri;
        byte[] content;
        int statusCode;

        DownloadItem(String uri) {
            this.uri = uri;
            content = null;
            statusCode = -1;
        }
    }

    static class DownloadThread extends Thread {
        private final CloseableHttpClient httpClient;
        private final DownloadItem item;

        public DownloadThread(CloseableHttpClient httpClient, DownloadItem item) {
            this.httpClient = httpClient;
            this.item = item;
        }

        @Override
        public void run() {
            try {
                HttpGet httpget = new HttpGet(item.uri);
                HttpContext context = new BasicHttpContext();
                CloseableHttpResponse response = httpClient.execute(httpget,
                        context);
                try {
                    item.statusCode = response.getStatusLine().getStatusCode();
                    HttpEntity entity = response.getEntity();
                    if (entity != null) {
                        item.content = EntityUtils.toByteArray(entity);
                    }
                } finally {
                    response.close();
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }

}
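(The thread-per-download pattern above can also be expressed with an ExecutorService, which caps concurrency without managing Thread objects and join timeouts by hand. A sketch of that pattern; the fetch method here is a stand-in for the real httpClient.execute(...) call so the example stays self-contained:)

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;

public class PooledDownloader {
    // Stand-in for the real HTTP fetch; replace with httpClient.execute(...).
    static byte[] fetch(String uri) {
        return ("content of " + uri).getBytes();
    }

    // Runs all downloads on a fixed-size pool and returns the bodies in order.
    public static List<byte[]> downloadAll(List<String> uris, int maxConcurrent)
            throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(maxConcurrent);
        try {
            List<Future<byte[]>> futures = new ArrayList<>();
            for (String uri : uris) {
                futures.add(pool.submit(() -> fetch(uri)));
            }
            List<byte[]> results = new ArrayList<>();
            for (Future<byte[]> f : futures) {
                results.add(f.get()); // blocks until that download finishes
            }
            return results;
        } finally {
            pool.shutdown();
            pool.awaitTermination(60, TimeUnit.SECONDS);
        }
    }

    public static void main(String[] args) throws Exception {
        List<String> uris = Arrays.asList(
                "http://www.example.com/file1.xml",
                "http://www.example.com/file2.xml");
        System.out.println(downloadAll(uris, 10).size()); // 2
    }
}
```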
