How to load-balance the workload of a service in .NET


Question

I am thinking of building an application using a Service Oriented Architecture (SOA).

This architecture is not as complex and messy as a microservices solution (I think), but I am facing similar design problems. Imagine I have services of type ServiceA that send work to services of type ServiceB. I guess that if I use a queue, then load balancing will not be a problem (since consumers will take what they can handle from the queue). But queues tend to introduce awkward asynchrony into the code that requires extra effort to fix. So I was more inclined to use HTTP calls between services, taking advantage of C#'s efficient async/await feature. But this creates problems with sharing the workload and with detecting services that are saturated or dead.

So my questions are:

  1. Is there a queue that supports some sort of async/await feature and that functions like an HTTP call, returning the result where you need it rather than in some callback where you cannot continue your original execution flow?
  2. How do I load-balance the traffic between services and detect nodes that are unsuitable for new assignments when using HTTP? I mean, I could probably design something myself from scratch, but there ought to be a standard way, library, or framework for this by now. The best I found online was this, but it is built for microservices, so I am not sure I can use it without problems or overkill.

Update: I have now discovered this question, which also asks for awaitable queues: awaitable Task based queue ...and also discovered Kubernetes, Marathon, and the like.
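To make the first question concrete, the behavior being asked about can be sketched in plain C# with a TaskCompletionSource per request: the producer awaits a Task that completes when a consumer posts the matching reply. This is a hypothetical in-memory illustration (the class and member names are invented), not a real queueing library:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Hypothetical sketch of an "awaitable queue": each request is paired with a
// TaskCompletionSource keyed by a correlation id, so the caller can await the
// reply directly instead of handling it in a detached callback.
public class AwaitableRequestQueue
{
    // Replies that have not arrived yet, keyed by correlation id.
    private readonly ConcurrentDictionary<Guid, TaskCompletionSource<string>> pending =
        new ConcurrentDictionary<Guid, TaskCompletionSource<string>>();

    // Work items for consumers (the ServiceB side) to pick up.
    public BlockingCollection<(Guid Id, string Payload)> Work { get; } =
        new BlockingCollection<(Guid Id, string Payload)>();

    // Producer side (ServiceA): enqueue the work and await the reply.
    public Task<string> RequestAsync(string payload)
    {
        var id = Guid.NewGuid();
        var tcs = new TaskCompletionSource<string>(TaskCreationOptions.RunContinuationsAsynchronously);
        pending[id] = tcs;
        Work.Add((id, payload));
        return tcs.Task;
    }

    // Consumer side (ServiceB): complete the task of the matching request.
    public void Reply(Guid id, string result)
    {
        if (pending.TryRemove(id, out var tcs))
            tcs.TrySetResult(result);
    }
}
```

A consumer would loop over Work.GetConsumingEnumerable(), process each item, and call Reply(id, result); the awaiting producer then resumes exactly where it called await RequestAsync(...). Brokers with request/reply support implement essentially this correlation for you across processes.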

Answer

Regarding your first question: NServiceBus, a commercial framework for .NET that abstracts message transports and adds many features on top of them, has exactly the feature you are looking for. They call it "callbacks", and the usage is as follows.

Assuming you have a Message to send to a backend service and a ResponseMessage that you expect back, you would do the following in ServiceA:

// ServiceA: send the message and await the correlated reply.
var message = new Message();
var response = await endpoint.Request<ResponseMessage>(message);
log.Info($"Callback received with response: {response.Result}");

Here, endpoint is an NServiceBus artifact that allows you to send and receive messages.

What this simple syntax does is put the Message in a queue and wait (asynchronously) until a backend service has handled the message and replied to it. The response is a message of type ResponseMessage in a queue.

In ServiceB, you would do:

public class Handler : IHandleMessages<Message>
{
  // Handle the incoming message and reply; NServiceBus routes the reply
  // back to the ServiceA instance that is awaiting it.
  public Task Handle(Message message, IMessageHandlerContext context)
  {
    var responseMessage = new ResponseMessage
    {
        Result = "TheResult"
    };
    return context.Reply(responseMessage);
  }
}

This allows you to have multiple ServiceA nodes sending messages to multiple ServiceB nodes (competing consumers on a single queue). NServiceBus takes care of routing the response for every given message back to the right ServiceA node.

Note that this has the disadvantage that if a ServiceA node goes down while waiting for a response, it will never receive the response. For this reason, this pattern is not recommended for most scenarios.
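One generic mitigation for an indefinite wait (ordinary C#, not an NServiceBus API) is to cap the await with a timeout so the caller can fail, retry, or fall back instead of waiting forever. WithTimeout below is a made-up helper name:

```csharp
using System;
using System.Threading.Tasks;

public static class TaskTimeoutExtensions
{
    // Await either the task or a timer, whichever completes first.
    // WithTimeout is a hypothetical helper, not part of any framework.
    public static async Task<T> WithTimeout<T>(this Task<T> task, TimeSpan timeout)
    {
        var first = await Task.WhenAny(task, Task.Delay(timeout));
        if (first != task)
            throw new TimeoutException($"No reply within {timeout}.");
        return await task; // propagates the result or the original exception
    }
}
```

Usage would then look like `var response = await endpoint.Request<ResponseMessage>(message).WithTimeout(TimeSpan.FromSeconds(30));`, leaving it to the caller to decide what a timeout means for its workflow.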

Regarding your question number 2, I would say a load balancer would do the job. For more complex scenarios, you could look at Service Fabric.
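As a minimal illustration of the load-balancing idea (client-side, rather than a dedicated appliance), a round-robin picker over a fixed list of ServiceB addresses could look like the sketch below. The URLs are placeholders, and a real deployment would normally use a proper load balancer or a service registry with health checks instead:

```csharp
using System;
using System.Threading;

// Client-side round-robin over known ServiceB endpoints. This only spreads
// load evenly; detecting saturated or dead nodes needs health checks on top.
public class RoundRobinEndpoints
{
    private readonly string[] endpoints;
    private int counter = -1;

    public RoundRobinEndpoints(params string[] endpoints)
    {
        this.endpoints = endpoints;
    }

    public string Next()
    {
        // Interlocked keeps the counter correct under concurrent callers;
        // masking with int.MaxValue keeps the index non-negative on overflow.
        var i = Interlocked.Increment(ref counter) & int.MaxValue;
        return endpoints[i % endpoints.Length];
    }
}
```

Each ServiceA node would call Next() before issuing an HTTP request; a dedicated load balancer does the same selection centrally and also removes unhealthy backends from rotation, which is the part this sketch deliberately leaves out.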
