Sharing synchronization objects through global namespace vs as a function argument
Problem description
If I need to share a multiprocessing.Queue
or a multiprocessing.Manager
(or any of the other synchronization primitives), is there any difference in doing it by defining them at the global (module) level, versus passing them as an argument to the function executed in a different process?
For example, here are three possible ways I can imagine a queue could be shared:
# works fine on both Windows and Linux
from multiprocessing import Process, Queue

def f(q):
    q.put([42, None, 'hello'])

def main():
    q = Queue()
    p = Process(target=f, args=(q,))
    p.start()
    print(q.get())  # prints "[42, None, 'hello']"
    p.join()

if __name__ == '__main__':
    main()
vs.
# works fine on Linux, hangs on Windows
from multiprocessing import Process, Queue

q = Queue()

def f():
    q.put([42, None, 'hello'])

def main():
    p = Process(target=f)
    p.start()
    print(q.get())  # prints "[42, None, 'hello']"
    p.join()

if __name__ == '__main__':
    main()
vs.
# works fine on Linux, NameError on Windows
from multiprocessing import Process, Queue

def f():
    q.put([42, None, 'hello'])

def main():
    p = Process(target=f)
    p.start()
    print(q.get())  # prints "[42, None, 'hello']"
    p.join()

if __name__ == '__main__':
    q = Queue()
    main()
Which is the correct approach? From my experimentation I'm guessing it's only the first one, but I wanted to confirm that this is officially the case (and not only for Queue, but also for Manager and other similar objects).
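For comparison, here is a minimal sketch of the same "pass it as an argument" pattern from the first snippet, applied to a Manager-backed dict instead of a Queue. The function name f and the key 'answer' are illustrative choices, not from the snippets above:

```python
# Sketch: sharing a Manager proxy by passing it as a function argument,
# mirroring the structure of the first (portable) Queue example.
from multiprocessing import Process, Manager

def f(d):
    # Mutate the shared dict in the child process
    d['answer'] = 42

def main():
    with Manager() as manager:
        d = manager.dict()  # proxy object; picklable, so safe to pass to children
        p = Process(target=f, args=(d,))
        p.start()
        p.join()
        print(d['answer'])  # prints "42"

if __name__ == '__main__':
    main()
```

Because the proxy is passed explicitly, this version does not rely on the child inheriting module-level state, so it should behave the same under both fork (Linux) and spawn (Windows) start methods.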