Is there a way to limit the number of parallel Tasks globally in an ASP.NET Web API application?


Problem Description



I have an ASP.NET 5 Web API application which contains a method that takes objects from a List<T> and makes HTTP requests to a server, 5 at a time, until all requests have completed. This is accomplished using a SemaphoreSlim, a List<Task<ResponseObj>>, and awaiting Task.WhenAll(), similar to the example snippet below:

public async Task<ResponseObj[]> DoStuff(List<Input> inputData)
{
  const int maxDegreeOfParallelism = 5;
  var tasks = new List<Task<ResponseObj>>();

  using var throttler = new SemaphoreSlim(maxDegreeOfParallelism);
  foreach (var input in inputData)
  {
    tasks.Add(ExecHttpRequestAsync(input, throttler));
  }

  ResponseObj[] responses = await Task.WhenAll(tasks).ConfigureAwait(false);

  return responses;
}

private async Task<ResponseObj> ExecHttpRequestAsync(Input input, SemaphoreSlim throttler)
{
  await throttler.WaitAsync().ConfigureAwait(false);
  
  try
  {
    using var request = new HttpRequestMessage(HttpMethod.Post, "https://foo.bar/api");
    request.Content = new StringContent(JsonConvert.SerializeObject(input), Encoding.UTF8, "application/json");

    var response = await HttpClientWrapper.SendAsync(request).ConfigureAwait(false);
    var responseBody = await response.Content.ReadAsStringAsync().ConfigureAwait(false);
    var responseObject = JsonConvert.DeserializeObject<ResponseObj>(responseBody);

    return responseObject;
  }
  finally
  {
    throttler.Release();
  }
}

This works well; however, I am looking to limit the total number of Tasks that are being executed in parallel globally throughout the application, so as to allow this application to scale up. For example, if 50 requests to my API came in at the same time, this would start at most 250 tasks running in parallel. If I wanted to limit the total number of Tasks that are being executed at any given time to, say, 100, is it possible to accomplish this? Perhaps via a Queue<T>? Would the framework automatically prevent too many tasks from being executed? Or am I approaching this problem in the wrong way, and would I instead need to queue the incoming requests to my application?

Solution

I'm going to assume the code is fixed, i.e., Task.Run is removed and the WaitAsync / Release are adjusted to throttle the HTTP calls instead of List<T>.Add.

> I am looking to limit the total number of Tasks that are being executed in parallel globally throughout the application, so as to allow scaling up of this application.

This does not make sense to me. Limiting your tasks limits your scaling up.

> For example, if 50 requests to my API came in at the same time, this would start at most 250 tasks running parallel.

Concurrently, sure, but not in parallel. It's important to note that these aren't 250 threads, and that they're not 250 CPU-bound operations waiting for free thread pool threads to run on, either. These are Promise Tasks, not Delegate Tasks, so they don't "run" on a thread at all. It's just 250 objects in memory.
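To illustrate the distinction (a minimal standalone sketch, not the question's actual code): a promise-style task such as `Task.Delay` represents a pending completion and holds no thread while it waits, whereas `Task.Run` hands a delegate to a thread pool thread that is occupied for the whole duration.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// A "promise task": a future completion (here, a timer firing).
// No thread is blocked or consumed while the delay is pending.
Task promiseTask = Task.Delay(200);

// A "delegate task": a thread pool thread actively executes the lambda
// for its entire duration.
Task delegateTask = Task.Run(() => Thread.Sleep(200));

Task.WaitAll(promiseTask, delegateTask);
Console.WriteLine("both completed");
```

The 250 tasks in the question are all of the first kind while their HTTP calls are in flight, which is why they cost only memory, not threads.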

> If I wanted to limit the total number of Tasks that are being executed at any given time to say 100, is it possible to accomplish this?

Since (these kinds of) tasks are just in-memory objects, there should be no need to limit them, any more than you would need to limit the number of strings or List<T>s. Apply throttling where you do need it; e.g., number of HTTP calls done simultaneously per request. Or per host.
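If a truly global cap on concurrent HTTP calls is still wanted, one way (a sketch under the question's assumptions; the names and the cap of 100 are illustrative) is to share a single SemaphoreSlim across all requests, e.g. as a static field or DI singleton, instead of constructing one per DoStuff call:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical sketch: one SemaphoreSlim shared process-wide, so at most
// 100 throttled operations are in flight app-wide (the cap from the question).
var globalThrottle = new SemaphoreSlim(100, 100);

async Task<T> RunThrottledAsync<T>(Func<Task<T>> operation)
{
    await globalThrottle.WaitAsync().ConfigureAwait(false);
    try
    {
        return await operation().ConfigureAwait(false);
    }
    finally
    {
        globalThrottle.Release();
    }
}

// Example: 250 simulated "HTTP calls" all share the one gate.
var calls = new Task<int>[250];
for (int i = 0; i < 250; i++)
{
    int n = i;
    calls[n] = RunThrottledAsync(async () => { await Task.Delay(10); return n; });
}
int[] results = await Task.WhenAll(calls);
Console.WriteLine(results.Length);
```

The per-request semaphore from the question could then be replaced by (or layered on top of) this shared gate, wrapping each `ExecHttpRequestAsync` call.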

> Would the framework automatically prevent too many tasks from being executed?

The framework has nothing like this built-in.
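The closest built-in limit sits at the HTTP connection level rather than the task level: `HttpClientHandler.MaxConnectionsPerServer` (unlimited by default on .NET Core) caps concurrent connections to a single host, and requests beyond the cap wait inside the handler. A brief sketch, with 100 chosen to match the question's figure:

```csharp
using System;
using System.Net.Http;

// Not a task limit, but a per-host connection cap: at most 100 concurrent
// connections to any one server; additional requests queue inside the handler.
var handler = new HttpClientHandler { MaxConnectionsPerServer = 100 };
var client = new HttpClient(handler);

Console.WriteLine(handler.MaxConnectionsPerServer);
```

Note this throttles connections per host, not tasks per process, so it only indirectly bounds concurrency when all calls target the same server.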

> Perhaps via a Queue? Or am I approaching this problem in the wrong way, and would I instead need to Queue the incoming requests to my application?

There's already a queue of requests. It's handled by IIS (or whatever your host is). If your server gets too busy (or gets busy very suddenly), the requests will queue up without you having to do anything.
