Modelica total time calculation of simulation and equation initialization


Problem description

I would like to measure the total initialization and simulation time of a system of DAEs. I am interested in the wall-clock time (like the one given in MATLAB by the tic/toc functions).

I noticed that Modelica tools offer different flags for timing the simulation, but the time I actually get is very small compared to the time that elapses from the moment I press the simulate button until the end of the simulation (measured approximately with the clock of my phone).

I guess this short time is just the time required for the simulation itself, and it does not include the initialization of the system of equations.

Is there a way to measure this total time?

Thanks in advance,

Gabriele

Dear Marco, thank you so much for your extremely detailed and useful reply!

I am actually using OpenModelica rather than Dymola, so unfortunately I have to build the function that does this myself, and I am very new to the OpenModelica language.

So far, I have a model that simulates the physical behavior based on a system of DAEs. Now I am trying to build what you suggested here:

With getTime() you can build a function that: reads the system time as t_start, translates the model and simulates for 0 seconds, reads the system time again as t_stop, and computes the difference between t_start and t_stop.

Could you please give me more details? Which command can I use to read the system time at t_start and to simulate for 0 seconds? Do I need two different functions to do this for both t_start and t_stop?

Once I have done this, do I have to call the function (or functions) inside the OpenModelica model whose timing I want to know?

Thank you so much again for your precious help!

Very best regards, Gabriele

Answer

Depending on the tool you have, this could mean a lot of work.

The first problem is that the MSL allows you to retrieve the system time, but it includes nothing to easily compute time deltas. Therefore the Testing library in Dymola features the operator records DateTime and Duration. Note that it is planned to integrate them into future MSL versions, but at the moment they are only available via the Testing library for Dymola users.

The second problem is that there is no standardized way to translate and simulate models. Every tool has its own way to do that from scripts. So without knowing what tool you are using, it is not possible to give an exact answer.

In the current Modelica Standard Library version 3.2.3 you can read the actual system time via Modelica.Utilities.System.getTime().

This small example shows how to use it:

function printSystemTime
protected
  Integer ms, s, min, h, d, mon, a; // millisecond, second, minute, hour, day, month, year
algorithm
  (ms, s, min, h, d, mon, a) := Modelica.Utilities.System.getTime();
  Modelica.Utilities.Streams.print("Current time is: " + String(h) + ":" + String(min) + ":" + String(s));
end printSystemTime;

As you can see, it returns the current system date and time via 7 return values. These variables are not very convenient to deal with if you want to compute a time delta, since you end up with 14 variables, each with its own value range.
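As a sketch of how the seven values can be tamed, the following hypothetical helper (not part of the MSL) collapses them into a single millisecond count, assuming both readings happen on the same day so that day/month/year rollover can be ignored:

```modelica
function getTimeMs "Milliseconds since midnight; sketch, valid only within a single day"
  output Real t_ms "Milliseconds elapsed since midnight";
protected
  Integer ms, sec, min, hour, day, mon, year;
algorithm
  (ms, sec, min, hour, day, mon, year) := Modelica.Utilities.System.getTime();
  // Collapse hour/min/sec/ms into one number; day, mon and year are ignored
  t_ms := ms + 1000*(sec + 60*(min + 60*hour));
end getTimeMs;
```

With this, a time delta is simply the difference of two calls, as long as midnight does not pass in between.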

With getTime() you can build a function that:

  1. reads the system time as t_start,
  2. translates the model and simulates for 0 seconds,
  3. reads the system time again as t_stop,
  4. computes the difference between t_start and t_stop.

Step 2 depends on the tool. In Dymola you would call

DymolaCommands.SimulatorAPI.simulateModel("path-to-model", 0, 0);

which translates your model and simulates it for 0 seconds, so it only runs the initialization section.
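Putting the four steps together, a minimal Dymola timing function could look like the following sketch (the function name is made up for illustration, and the millisecond conversion assumes t_start and t_stop fall on the same day):

```modelica
function timeInitialization "Wall-clock time of translation + initialization; Dymola sketch"
  input String modelPath "Full Modelica path of the model to time";
  output Real elapsed_ms "Elapsed wall-clock time in milliseconds";
protected
  Integer ms, sec, min, hour, day, mon, year;
  Real t_start, t_stop;
  Boolean ok;
algorithm
  // Step 1: read the system time as t_start
  (ms, sec, min, hour, day, mon, year) := Modelica.Utilities.System.getTime();
  t_start := ms + 1000*(sec + 60*(min + 60*hour));
  // Step 2: translate the model and simulate for 0 seconds (initialization only)
  ok := DymolaCommands.SimulatorAPI.simulateModel(modelPath, 0, 0);
  // Step 3: read the system time again as t_stop
  (ms, sec, min, hour, day, mon, year) := Modelica.Utilities.System.getTime();
  t_stop := ms + 1000*(sec + 60*(min + 60*hour));
  // Step 4: compute the difference
  elapsed_ms := t_stop - t_start;
end timeInitialization;
```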

The Testing library contains the function Testing.Utilities.Simulation.timing, which does almost exactly what you want.

To translate and simulate your model, call it as follows:

Testing.Utilities.Simulation.timing(
  "Modelica.Blocks.Examples.PID_Controller", 
  task=Testing.Utilities.Simulation.timing.Task.fullTranslate_simulate, 
  loops=3);

This will translate your model, simulate it for 1 second three times, and compute the average.

To simulate for 0 s instead, duplicate the function and change this

if simulate then
  _ := simulateModel(c);
end if;

to

if simulate then
  _ := simulateModel(c, 0, 0);
end if;
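Since the question mentions OpenModelica: there the translate-and-simulate step goes through the OMC scripting API rather than DymolaCommands. The following .mos script is a rough sketch; the model name is a placeholder, and the timing fields of the returned record vary between OpenModelica versions, so verify them against your installation:

```modelica
// Sketch of an OpenModelica .mos script; run e.g. with: omc timeInit.mos
loadModel(Modelica); getErrorString();
loadFile("MyModel.mo"); getErrorString();  // "MyModel" is a placeholder
// stopTime=0.0 translates the model and runs only the initialization
res := simulate(MyModel, startTime=0.0, stopTime=0.0); getErrorString();
// The returned SimulationResult record reports wall-clock timings,
// e.g. res.timeTotal, res.timeCompile, res.timeSimulation (version-dependent)
print(String(res.timeTotal) + "\n");
```

This avoids hand-rolling the getTime() bookkeeping entirely, since the compiler measures translation, compilation and simulation time itself.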
