Python Multiprocessing - how to make processes wait while active?

Question

Well, I'm quite new to python and multiprocessing, and what I need to know is whether there is any way to make active processes wait for something like "all processes have finished using a given resource", and then continue their work. And yes, I really need them to wait; the main purpose is related to synchronization. It's not about finishing the processes and joining them, it's about waiting while they're running. Should I use something like a Condition or an Event? I couldn't find anything really helpful anywhere.

It should be something like this:

import multiprocessing

def worker(args):
    # 1. working
    # 2. takes the resource from the manager
    # 3. waits for all other processes to finish the same step above
    # 4. returns to 1.
    pass  # placeholder body so the sketch is valid Python

if __name__ == '__main__':
    manager = multiprocessing.Manager()
    resource = manager.something()
    pool = multiprocessing.Pool(n)
    result = pool.map(worker, args)
    pool.close()
    pool.join()

工作"部分比其他部分花费更多的时间,所以我仍然利用多处理,即使对单个资源的访问是串行的.假设问题是这样工作的:我有多个进程在运行解决方案查找器(一种进化算法),并且每生成n"个解决方案,我都会使用该资源在这些进程之间交换一些数据,并使用这些信息改进解决方案.所以,我需要他们所有人在交换信息之前等待.这有点难以解释,我并不是真的在这里讨论理论,我只是想知道是否有任何方法可以做到我在主要问题中尝试描述的内容.

The "working" part takes a lot more time than the other parts, so I still take advantage of multiprocessing, even if the access to that single resource is serial. Let's say the problem works this way: I have multiple processes running a solution finder (an evolutionary algorithm), and every "n" solutions made, I use that resource to exchange some data between those processes and improve solutions using the information. So, I need all of them to wait before exchanging that info. It's a little hard to explain, and I'm not really here to discuss the theory, I just want to know if there is any way I could do what I tried to describe in the main question.

Answer

I actually found a way to do what I wanted. As you can see in the question, the code was using a manager shared among the processes. So, in simple terms, I made a shared resource which works basically like a "log". Every time a process finishes its work, it writes a permission into the log. Once all the desired permissions are there, the processes continue their work (using this, I could also set a specific order of access to a resource, for example). Please note that this is not a Lock or a Semaphore. I suppose it isn't a great method at all, but it suits the problem's needs and doesn't delay the execution.
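For context, here is a minimal sketch of what such a "log" could look like, built on a Manager list. This is only an illustration of the idea described above, not the poster's actual code: NUM_WORKERS, ROUNDS, and the dummy "working" step are made-up placeholders, and plain Process objects are used instead of a Pool so the shared objects can be passed straight to the workers.

import multiprocessing
import time

NUM_WORKERS = 4   # assumed number of worker processes (placeholder)
ROUNDS = 3        # assumed number of work/exchange cycles (placeholder)

def worker(worker_id, log, lock):
    for round_no in range(ROUNDS):
        # 1. the "working" part (dummy computation as a stand-in)
        result = worker_id * round_no

        # 2. write a "permission" entry for this round into the shared log
        with lock:
            log.append((round_no, worker_id))

        # 3. wait until every worker has written its permission for this round
        while True:
            entries = log[:]  # snapshot of the shared log
            if sum(1 for r, _ in entries if r == round_no) >= NUM_WORKERS:
                break
            time.sleep(0.01)  # polling; keeps the wait from spinning too hard

        # 4. every process is past step 2: exchange data here, then loop back to 1.

if __name__ == '__main__':
    manager = multiprocessing.Manager()
    log = manager.list()    # the shared "log" of permissions
    lock = manager.Lock()   # protects appends to the log
    procs = [multiprocessing.Process(target=worker, args=(i, log, lock))
             for i in range(NUM_WORKERS)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

Worth noting: the standard library's multiprocessing.Barrier (also available as manager.Barrier(parties)) implements this wait-until-everyone-arrives pattern directly and avoids the polling loop, if a built-in primitive is acceptable.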
