How to configure concurrency in .NET Core Web API?
Question
In the old WCF days, you had control over service concurrency via the MaxConcurrentCalls setting. MaxConcurrentCalls defaulted to 16 concurrent calls, but you could raise or lower that value based on your needs.
How do you control server-side concurrency in a .NET Core Web API? We probably need to limit it in our case, as too many concurrent requests can impede overall server performance.
Answer
ASP.NET Core application concurrency is handled by its web server. For example, with Kestrel:
var host = new WebHostBuilder()
    .UseKestrel(options => options.ThreadCount = 8)
    .UseStartup<Startup>()
    .Build();
It is not recommended to set the Kestrel thread count to a large value like 1K, due to Kestrel's async-based implementation.
More information: Is Kestrel using a single thread for processing requests like Node.js?
A new Limits property was introduced in ASP.NET Core 2.0 Preview 2. You can now add limits for the following:
- Maximum client connections
- Maximum request body size
- Maximum request body data rate
For example:
.UseKestrel(options =>
{
    options.Limits.MaxConcurrentConnections = 100;
})
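A fuller sketch covering all three of the limits listed above might look like this. The property names come from KestrelServerOptions.Limits; the specific values are illustrative, not recommendations:

```csharp
.UseKestrel(options =>
{
    // Upper bound on simultaneously open client connections
    options.Limits.MaxConcurrentConnections = 100;

    // Reject request bodies larger than 10 MB
    options.Limits.MaxRequestBodySize = 10 * 1024 * 1024;

    // Require clients to send at least 100 bytes/second after a
    // 5-second grace period, so slow requests can't tie up the server
    options.Limits.MinRequestBodyDataRate =
        new MinDataRate(bytesPerSecond: 100, gracePeriod: TimeSpan.FromSeconds(5));
})
```

Note that the data-rate limit is expressed as a minimum acceptable rate (MinRequestBodyDataRate) rather than a maximum: connections that fall below it are aborted.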
IIS
When Kestrel runs behind a reverse proxy, you can tune the proxy itself. For example, you could configure the IIS application pool in web.config or in aspnet.config:
<configuration>
  <system.web>
    <applicationPool
        maxConcurrentRequestsPerCPU="5000"
        maxConcurrentThreadsPerCPU="0"
        requestQueueLimit="5000" />
  </system.web>
</configuration>
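If you want a cap at the application level, closer in spirit to WCF's MaxConcurrentCalls, one possible approach (not part of the original answer; the middleware name is hypothetical) is to gate requests with a SemaphoreSlim in custom middleware:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

// Hypothetical middleware that limits in-flight requests, for illustration.
public class MaxConcurrencyMiddleware
{
    private readonly RequestDelegate _next;

    // Cap at 16 concurrent requests, matching WCF's old default
    private static readonly SemaphoreSlim _gate = new SemaphoreSlim(16);

    public MaxConcurrencyMiddleware(RequestDelegate next) => _next = next;

    public async Task Invoke(HttpContext context)
    {
        // Wait briefly for a slot; shed load with 503 if none frees up
        if (!await _gate.WaitAsync(TimeSpan.FromSeconds(5)))
        {
            context.Response.StatusCode = 503; // Service Unavailable
            return;
        }
        try
        {
            await _next(context);
        }
        finally
        {
            _gate.Release();
        }
    }
}

// Registered in Startup.Configure:
// app.UseMiddleware<MaxConcurrencyMiddleware>();
```

Unlike the Kestrel connection limit, this counts requests rather than connections, and lets you return an explicit 503 instead of refusing the connection.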
Of course, Nginx and Apache have their own concurrency settings.
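As a sketch of what that looks like in Nginx, the limit_conn module can cap concurrent connections per client in front of Kestrel (the values and the backend port are assumptions for illustration):

```nginx
http {
    # Track connections per client IP in a 10 MB shared zone named "addr"
    limit_conn_zone $binary_remote_addr zone=addr:10m;

    server {
        location / {
            # Allow at most 10 concurrent connections per client IP
            limit_conn addr 10;
            proxy_pass http://localhost:5000;  # assumed Kestrel backend
        }
    }
}
```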