Python multiprocessing from Abaqus/CAE


Problem description


I am using a commercial application called Abaqus/CAE[1] with a built-in Python 2.6 interpreter and API. I've developed a long-running script that I'm attempting to split into simultaneous, independent tasks using Python's multiprocessing module. However, once spawned, the processes just hang.

The script itself uses various objects/methods available only through Abaqus's proprietary cae module, which can only be loaded by first starting up the Python bundled with Abaqus/CAE; that interpreter then executes my script with Python's execfile.

To try to get multiprocessing working, I've attempted to run a script that avoids accessing any Abaqus objects, and instead just performs a calculation and prints the result to file[2]. This way, I can run the same script from the regular system Python installation as well as from the Python bundled with Abaqus.

The example code below works as expected when run from the command line using either of the following:

C:\some\path>python multi.py         # <-- Using system Python
C:\some\path>abaqus python multi.py  # <-- Using Python bundled with Abaqus

This spawns the new processes, and each runs the function and writes the result to file as expected. However, when called from the Abaqus/CAE Python environment using:

abaqus cae noGUI=multi.py

Abaqus will then start up, automatically import its own proprietary modules, and then execute my file using:

execfile("multi.py", __main__.__dict__)

where the global namespace arg __main__.__dict__ is set up by Abaqus. Abaqus then checks out licenses for each process successfully, spawns the new processes, and ... and that's it. The processes are created, but they all hang and do nothing. There are no error messages.
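One detail worth probing (my assumption, not something confirmed in the question): on Windows, multiprocessing has no fork, so it starts each child by relaunching sys.executable and re-importing the parent's __main__ module. If, inside the Abaqus/CAE session, sys.executable points at the Abaqus launcher rather than a plain python.exe, or __main__ is a synthetic module with no file behind it, spawned children may never reach the worker function. A small diagnostic sketch could record what the interpreter believes about itself:

```python
import sys

def interpreter_info():
    """Collect the facts multiprocessing relies on when spawning children."""
    main = sys.modules['__main__']
    return {
        'executable': sys.executable,                  # relaunched for each child on Windows
        'main_file': getattr(main, '__file__', None),  # None when __main__ is synthetic
        'version': sys.version_info[:2],
    }

if __name__ == '__main__':
    info = interpreter_info()
    for key in sorted(info):
        print('%s: %s' % (key, info[key]))
```

If 'executable' turns out not to be a regular Python binary, the standard library's multiprocessing.set_executable() (documented for exactly this embedded-interpreter situation on Windows) can point the spawner at one; whether Abaqus tolerates that is untested here.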

What might be causing the hang-up, and how can I fix it? Is there an environment variable that must be set? Are there other commercial systems that use a similar procedure that I can learn from/emulate?

Note that any solution must be available in the Python 2.6 standard library.

System details: Windows 10 64-bit, Python 2.6, Abaqus/CAE 6.12 or 6.14

Example Test Script:

# multi.py
import multiprocessing
import time

def fib(n):
    a, b = 0, 1
    for i in range(n):
        a, b = a+b, a
    return a

def workerfunc(num):
    fname = ''.join(('worker_', str(num), '.txt'))
    with open(fname, 'w') as f:
        f.write('Starting Worker {0}\n'.format(num))
        count = 0
        while count < 1000:  # <-- Repeat a bunch of times.
            count += 1
            a = fib(20)
        line = ''.join((str(a), '\n'))
        f.write(line)
        f.write('End Worker {0}\n'.format(num))

if __name__ == '__main__':
    jobs = []
    for i in range(2):       # <-- Setting the number of processes manually
        p = multiprocessing.Process(target=workerfunc, args=(i,))
        jobs.append(p)
        print 'starting', p
        p.start()
        print 'done starting', p
    for j in jobs:
        print 'joining', j
        j.join()
        print 'done joining', j

[1] A widely known finite element analysis package

[2] The script is a blend of a fairly standard Python function for fib(), and examples from PyMOTW

Solution

I have to write an answer as I cannot comment yet.

What I can imagine as a reason is that Python multiprocessing spawns a whole new process with its own, non-shared memory. So if you create an object in your script and then start a new process, that new process contains a copy of the memory, and you have two objects that can go in different directions. When something of Abaqus is present in the original Python process (which I suspect), that gets copied too, and this copy could produce such behaviour.
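The copy-on-spawn behaviour described above can be demonstrated with a minimal sketch (my illustration, not part of the original answer): a module-level object mutated inside a child process is invisible to the parent, because each process works on its own copy of memory.

```python
import multiprocessing

state = {'counter': 0}  # lives in the parent's memory

def bump(queue):
    # Runs in the child: this mutates the CHILD's copy of `state` only.
    state['counter'] += 1
    queue.put(state['counter'])

def demo():
    queue = multiprocessing.Queue()
    p = multiprocessing.Process(target=bump, args=(queue,))
    p.start()
    child_value = queue.get()  # what the child saw after its increment
    p.join()
    # The parent's copy is untouched:
    return state['counter'], child_value

if __name__ == '__main__':
    print(demo())
```

The parent's counter stays at 0 while the child reports 1; the Queue is needed precisely because ordinary objects are not shared between the two processes.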

As a solution, I think you could extend Python with C (which is capable of using multiple cores in a single process) and use threads there.
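A lighter, standard-library-only variant of that idea (my suggestion, not the answerer's) is Python's own threading module: the same worker functions run inside the one Abaqus process, with no new interpreter launched, so the spawn problem never arises. The GIL serializes pure-Python bytecode, so this trades parallel speed for compatibility. A sketch using the question's fib() worker:

```python
import threading

def fib(n):
    # Iterative Fibonacci, as in the question's example script.
    a, b = 0, 1
    for _ in range(n):
        a, b = a + b, a
    return a

def worker(num, results):
    # Threads share memory, so results can be collected in a plain dict.
    results[num] = fib(20)

def run_workers(count):
    threads = []
    results = {}
    for i in range(count):
        t = threading.Thread(target=worker, args=(i, results))
        threads.append(t)
        t.start()
    for t in threads:
        t.join()
    return results

if __name__ == '__main__':
    print(run_workers(2))
```

Each thread writes its result into the shared dict, which the parent reads directly after join() — no queues, pickling, or child interpreters involved.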
