Big size of ServicePoint object after several hours sending HTTP requests in parallel

Problem Description

We are using HttpClient to send requests to a remote Web API in parallel:

public async Task<HttpResponseMessage> PostAsync(HttpRequestInfo httpRequestInfo)
{
    using (var httpClient = new HttpClient())
    {
        httpClient.BaseAddress = new Uri(httpRequestInfo.BaseUrl);
        if (httpRequestInfo.RequestHeaders.Any())
        {
            foreach (var requestHeader in httpRequestInfo.RequestHeaders)
            {
                httpClient.DefaultRequestHeaders.Add(requestHeader.Key, requestHeader.Value);
            }
        }

        return await httpClient.PostAsync(httpRequestInfo.RequestUrl, httpRequestInfo.RequestBody);
    }
}

This API can be called by several threads concurrently. After running for about four hours, we found what looked like a memory leak: the profiling tool showed two ServicePoint objects, one of which was quite big, about 160 MB.

From what I can tell, the code above has a few problems:

  • We should share HttpClient instances as much as we can. In our case, the request address and headers may vary a lot, so is this something worth acting on, or does it not hurt performance that much? One thought is to keep a dictionary to store and look up HttpClient instances.
  • We didn't modify the DefaultConnectionLimit of ServicePoint, so by default only two concurrent requests can be sent to the same server. If we change this value to a larger one, would that solve the memory leak problem? (A sketch of raising the limit follows this list.)
  • We also suppressed HTTPS certificate validation: ServicePointManager.ServerCertificateValidationCallback = delegate { return true; }. Does this have something to do with the problem?
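For reference, a minimal sketch of how the connection limit mentioned in the second bullet could be raised, either globally or for a single endpoint; the URL and the value 50 below are placeholders for illustration, not values from this post:

using System;
using System.Net;

class ConnectionLimitSetup
{
    static void Configure()
    {
        // The global default applies to ServicePoint objects created after this line runs,
        // so it should be set at application startup, before the first request is sent.
        ServicePointManager.DefaultConnectionLimit = 50;

        // Per-endpoint override through the ServicePoint for a specific base address.
        var servicePoint = ServicePointManager.FindServicePoint(new Uri("https://example.com/"));
        servicePoint.ConnectionLimit = 50;
    }
}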

Since this issue is not easy to reproduce (it takes a lot of time), I just need some thoughts so that I can optimize our code for long-running operation.

Solution

Let me explain the situation myself, in case others run into this issue later.

First of all, this is not a memory leak; it is a performance problem.

We tested our application in a virtual machine with a proxy enabled, which made the internet connection quite slow; in our case, each HTTP request could take 3-4 seconds. As time went on, a lot of connections accumulated in the ServicePoint queue. So it is not a memory leak; the earlier connections simply were not finishing quickly enough.

Once we made sure each HTTP request was not that slow, everything went back to normal.
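As a quick way to confirm this kind of buildup, the ServicePoint for the target host can be inspected directly. The sketch below is illustrative only, with https://example.com/ standing in for the real Web API base address:

using System;
using System.Net;

class ServicePointDiagnostics
{
    static void Dump()
    {
        // FindServicePoint returns the ServicePoint that manages connections for this host.
        var servicePoint = ServicePointManager.FindServicePoint(new Uri("https://example.com/"));

        // If CurrentConnections keeps sitting at ConnectionLimit while requests queue up,
        // earlier requests are simply not completing fast enough.
        Console.WriteLine("Current connections: " + servicePoint.CurrentConnections);
        Console.WriteLine("Connection limit:    " + servicePoint.ConnectionLimit);
        Console.WriteLine("Max idle time (ms):  " + servicePoint.MaxIdleTime);
    }
}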

We also tried to reduce the number of HttpClient instances, to improve HTTP request performance:

private readonly ConcurrentDictionary<HttpRequestInfo, HttpClient> _httpClients =
    new ConcurrentDictionary<HttpRequestInfo, HttpClient>();

private HttpClient GetHttpClient(HttpRequestInfo httpRequestInfo)
{
    // Reuse an existing client for this request profile if one was already created.
    HttpClient existing;
    if (_httpClients.TryGetValue(httpRequestInfo, out existing))
    {
        return existing;
    }

    var httpClient = new HttpClient { BaseAddress = new Uri(httpRequestInfo.BaseUrl) };
    if (httpRequestInfo.RequestHeaders.Any())
    {
        foreach (var requestHeader in httpRequestInfo.RequestHeaders)
        {
            httpClient.DefaultRequestHeaders.Add(requestHeader.Key, requestHeader.Value);
        }
    }

    httpClient.DefaultRequestHeaders.ExpectContinue = false;
    httpClient.DefaultRequestHeaders.ConnectionClose = true;
    httpClient.Timeout = TimeSpan.FromMinutes(2);

    // GetOrAdd is race-free under concurrent callers: if another thread has already
    // registered a client for the same key, that instance is returned and this new
    // one is simply dropped instead of throwing.
    return _httpClients.GetOrAdd(httpRequestInfo, httpClient);
}
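For completeness, a sketch of how the PostAsync method from the question could consume the cached clients; it assumes the same HttpRequestInfo shape (RequestUrl and RequestBody) as in the snippets above:

public async Task<HttpResponseMessage> PostAsync(HttpRequestInfo httpRequestInfo)
{
    // Look up (or lazily create) the shared client for this request profile.
    // The client is intentionally not disposed here, because it is reused across calls.
    var httpClient = GetHttpClient(httpRequestInfo);
    return await httpClient.PostAsync(httpRequestInfo.RequestUrl, httpRequestInfo.RequestBody);
}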

In our case, only the request body differs for the same server address. I also overrode the Equals and GetHashCode methods of HttpRequestInfo.
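The post does not show that override, so the following is a hypothetical sketch. It assumes HttpRequestInfo exposes BaseUrl, RequestUrl and RequestHeaders as the snippets above suggest, and that equality should ignore the body so requests differing only in their body map to the same cached client:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;

public class HttpRequestInfo : IEquatable<HttpRequestInfo>
{
    public string BaseUrl { get; set; }
    public string RequestUrl { get; set; }
    public HttpContent RequestBody { get; set; }
    public Dictionary<string, string> RequestHeaders { get; set; }

    public bool Equals(HttpRequestInfo other)
    {
        if (ReferenceEquals(other, null)) return false;

        // RequestBody is deliberately ignored: two requests that differ only in
        // their body should share the same cached HttpClient.
        return BaseUrl == other.BaseUrl
               && RequestUrl == other.RequestUrl
               && RequestHeaders.Count == other.RequestHeaders.Count
               && !RequestHeaders.Except(other.RequestHeaders).Any();
    }

    public override bool Equals(object obj)
    {
        return Equals(obj as HttpRequestInfo);
    }

    public override int GetHashCode()
    {
        // Hash only the stable parts that also participate in Equals.
        return (BaseUrl ?? string.Empty).GetHashCode() ^ (RequestUrl ?? string.Empty).GetHashCode();
    }
}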

Meanwhile, we set ServicePointManager.DefaultConnectionLimit = int.MaxValue;

Hope this helps.
