How to increase the outgoing HTTP requests quota in .NET Core?


Question

I'm trying to send a high volume of HTTP requests from a machine. But it seems that .NET Core, or perhaps Windows, is restricting the number of concurrent HTTP requests that can go out, or the quota of HTTP requests in a given time window.

How can I increase this? I remember we had a configuration for this in .NET Framework, but I'm unable to find that either.

Answer

The HTTP 1.1 protocol advised that only 2 concurrent requests should be made per domain. Both the .NET Framework and .NET Core use this limit for desktop applications. ASP.NET applications have a limit of 10 concurrent requests. Both runtimes allow you to change the limit.

This limit made sense for browsers a while ago, but it's too restrictive for service-oriented applications. Browsers allow around 8 concurrent connections nowadays, and service/REST applications can handle more.

ServicePointManager.DefaultConnectionLimit can be used to change the limit for the entire application, e.g.:

ServicePointManager.DefaultConnectionLimit = 100;

You can also specify a limit per HttpClient instance, by using an HttpClientHandler with the HttpClientHandler.MaxConnectionsPerServer property set to the desired limit:

// Requires: using System.Net; using System.Net.Http;
var handler = new HttpClientHandler
{
    // Maximum number of concurrent connections allowed per target server
    MaxConnectionsPerServer = 100,
    AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
};

HttpClient client = new HttpClient(handler);

This way you can set different limits per target service.
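For illustration only, here is a minimal sketch of that idea with two hypothetical target services (the URLs and limit values below are invented for the example, not part of the original answer):

// Requires: using System; using System.Net.Http;
// Hypothetical internal API that can tolerate more concurrency
var internalApi = new HttpClient(new HttpClientHandler { MaxConnectionsPerServer = 50 })
{
    BaseAddress = new Uri("https://internal.example.com/")
};

// Hypothetical third-party API that should receive far fewer concurrent requests
var partnerApi = new HttpClient(new HttpClientHandler { MaxConnectionsPerServer = 4 })
{
    BaseAddress = new Uri("https://partner.example.com/")
};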

Don't rush to set the limit to a huge number. The target services may not be able to handle 20 or 40 concurrent requests from the same client. Badly written services may crash or flood the server. Concurrent requests may block each other, reducing the actual throughput. Well-written services may impose a rate limit per client, or queue requests.
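If you want to keep the outgoing volume in check on the caller side, one common approach is to throttle concurrency with a SemaphoreSlim. The sketch below is not from the original answer; the limit of 10 and the URL pattern are placeholders:

// Requires: using System; using System.Collections.Generic; using System.Linq;
//           using System.Net.Http; using System.Threading; using System.Threading.Tasks;
var client = new HttpClient();

// Placeholder: allow at most 10 requests in flight at any moment
var throttle = new SemaphoreSlim(10);

async Task<string> GetThrottledAsync(string url)
{
    await throttle.WaitAsync();
    try
    {
        return await client.GetStringAsync(url);
    }
    finally
    {
        throttle.Release();
    }
}

// Hypothetical list of target URLs for the example
var urls = Enumerable.Range(0, 100).Select(i => $"https://api.example.com/items/{i}");

// Start all 100 requests, but no more than 10 run concurrently
string[] responses = await Task.WhenAll(urls.Select(GetThrottledAsync));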

You'd be surprised how badly some supposedly high-traffic services behave. I've encountered airline services that could crash if more than just 10 concurrent requests were made over a minute. Badly configured load balancers would keep directing traffic to those crashed services for at least 1-2 minutes until the service restarted, making retries meaningless.
