Client Server Command Design pattern with variable delays

Problem Description

I am writing a client program to control a server, which in turn controls some large hardware. The server needs to receive commands to initialize, start, stop and control the hardware.

The connection from the client to the server is via a TCP or UDP socket. Each command is encapsulated in an appropriate message using a SCADA protocol (e.g. Modbus or DNP3).

Part of the initialization phase involves sending a sequence of commands from the client to the server. In some cases there must be a delay of a few seconds between commands to prevent multiple sub-systems from being initialized at the same time. The value of the delay depends on the type of command.

I'm thinking that the Command Design Pattern is a good approach to follow here. The client instantiates ConcreteCommands and the Invoker places them in a queue. I'm not sure how to incorporate the variable delay, or whether there's a better pattern involving a timer and a queue to handle sending messages with variable delays.
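
For concreteness, here is a minimal sketch of the arrangement described above. ICommand, the concrete command classes and CommandInvoker are hypothetical names used for illustration, not anything from the original question:

    // Minimal Command pattern sketch (hypothetical names throughout).
    using System;
    using System.Collections.Generic;

    public interface ICommand
    {
        void Execute();
    }

    // Each concrete command wraps one message to the server
    // (e.g. a Modbus or DNP3 frame).
    public class InitSubsystemCommand : ICommand
    {
        public void Execute() { Console.WriteLine("INIT sent"); }
    }

    public class StartPumpCommand : ICommand
    {
        public void Execute() { Console.WriteLine("START sent"); }
    }

    // The Invoker queues commands and executes them in order.
    public class CommandInvoker
    {
        private readonly Queue<ICommand> _queue = new Queue<ICommand>();

        public void Enqueue(ICommand command) => _queue.Enqueue(command);

        public void RunAll()
        {
            while (_queue.Count > 0)
                _queue.Dequeue().Execute();
        }
    }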

I'm using C#, but this is probably irrelevant since it's more of a design-pattern question.

Solution

It sounds like you need to store a mapping from command types to delays. When your server starts, could you cache those delay times, then call a method that processes each command after the specified delay?

When the server starts:

Dictionary<Type, int> typeToDelayMapping = GetTypeToDelayMapping();
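
GetTypeToDelayMapping is left undefined in the answer. A minimal sketch, assuming the delays are simply hard-coded against the hypothetical command classes from the sketch above (in practice they might be read from a configuration file):

    // Hypothetical: delays in milliseconds, keyed by concrete command type.
    static Dictionary<Type, int> GetTypeToDelayMapping()
    {
        return new Dictionary<Type, int>
        {
            { typeof(InitSubsystemCommand), 5000 }, // pause 5 s after an init
            { typeof(StartPumpCommand),     2000 }, // pause 2 s after a start
        };
    }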

When a command reaches the server, the server can call this:

InvokeCommand(ICommand command, int delayTimeInMilliseconds)

Like so:

InvokeCommand(command, typeToDelayMapping[command.GetType()]);
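
Tying the fragments together, one possible shape for InvokeCommand, using Task.Delay (.NET 4.5+) so the wait does not block the thread servicing the socket. The CommandProcessor class and the queue-draining loop are assumptions layered on top of the answer, reusing ICommand and GetTypeToDelayMapping from the sketches above:

    // Sketch: execute each command, then pause for its type-specific delay.
    using System;
    using System.Collections.Generic;
    using System.Threading.Tasks;

    public class CommandProcessor
    {
        private readonly Dictionary<Type, int> _typeToDelayMapping =
            GetTypeToDelayMapping();

        public async Task InvokeCommand(ICommand command, int delayTimeInMilliseconds)
        {
            command.Execute();
            if (delayTimeInMilliseconds > 0)
                await Task.Delay(delayTimeInMilliseconds);
        }

        // Drain the queue, pausing between commands as each type requires.
        public async Task ProcessQueueAsync(Queue<ICommand> queue)
        {
            while (queue.Count > 0)
            {
                ICommand command = queue.Dequeue();
                int delayMs;
                _typeToDelayMapping.TryGetValue(command.GetType(), out delayMs);
                await InvokeCommand(command, delayMs);
            }
        }
    }

With this shape, the initialization sequence is just a queue of commands handed to ProcessQueueAsync, and the per-type delays are applied automatically between sends.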
