Can I create a shared multiarray or lists of lists object in python for multiprocessing?

Problem description

I need to make a shared object of a multidimensional array or list of lists so that it is available to the other processes. Is there a way to create it? From what I have seen, it is not possible. I have tried:

>>> from multiprocessing import Process, Value, Array
>>> arr = Array('i', range(10))
>>> arr[:]
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
>>> arr[2] = [12, 43]
TypeError: an integer is required

I heard a numpy array can be a multiarray and a shared object. If the above is not possible, can someone tell me how to make a numpy array a shared object?

Recommended answer

To make a numpy array a shared object (full example):

import ctypes as c
import numpy as np
import multiprocessing as mp

n, m = 2, 3
mp_arr = mp.Array(c.c_double, n*m) # shared, can be used from multiple processes
# then in each new process create a new numpy array using:
arr = np.frombuffer(mp_arr.get_obj()) # mp_arr and arr share the same memory
# make it two-dimensional
b = arr.reshape((n,m)) # b and arr share the same memory
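
A minimal usage sketch (the worker function fill_row and the fill value are illustrative, not part of the original answer): each process rebuilds its own numpy view on top of the same shared buffer, so writes made in the child are visible in the parent:

import ctypes as c
import numpy as np
import multiprocessing as mp

def fill_row(mp_arr, n, m, row, value):
    # rebuild a numpy view over the shared buffer inside the child process
    arr = np.frombuffer(mp_arr.get_obj()).reshape((n, m))
    arr[row, :] = value  # writes go straight into the shared memory

if __name__ == '__main__':
    n, m = 2, 3
    mp_arr = mp.Array(c.c_double, n * m)  # zero-initialized shared buffer
    p = mp.Process(target=fill_row, args=(mp_arr, n, m, 0, 7.0))
    p.start()
    p.join()
    print(np.frombuffer(mp_arr.get_obj()).reshape((n, m)))
    # [[7. 7. 7.]
    #  [0. 0. 0.]]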

If you don't need a shared object (as in "shares the same memory") and a mere object that can be used from multiple processes is enough, then you could use multiprocessing.Manager:

from multiprocessing import Process, Manager

def f(L):
    row = L[0] # take the 1st row
    row.append(10) # change it
    L[0] = row #NOTE: important: copy the row back (otherwise parent
               #process won't see the changes)

if __name__ == '__main__':
    manager = Manager()

    lst = manager.list()
    lst.append([1])
    lst.append([2, 3])
    print(lst) # before: [[1], [2, 3]]

    p = Process(target=f, args=(lst,))
    p.start()
    p.join()

    print(lst) # after: [[1, 10], [2, 3]]
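
As a variation (my own sketch, not part of the original answer): since Python 3.6 manager proxies can be nested, so if each row is itself a manager.list, the in-place append is propagated and the copy-back step above is no longer needed:

from multiprocessing import Process, Manager

def f(L):
    L[0].append(10)  # the inner proxy syncs the change, no copy-back needed

if __name__ == '__main__':
    manager = Manager()

    lst = manager.list()
    lst.append(manager.list([1]))      # nested proxies (Python 3.6+)
    lst.append(manager.list([2, 3]))

    p = Process(target=f, args=(lst,))
    p.start()
    p.join()

    print([list(row) for row in lst])  # [[1, 10], [2, 3]]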

From the docs:

Server process managers are more flexible than using shared memory objects because they can be made to support arbitrary object types. Also, a single manager can be shared by processes on different computers over a network. They are, however, slower than using shared memory.
