parallel write to different groups with h5py

Problem Description

I'm trying to use parallel h5py to create an independent group for each process and fill each group with some data. What happens is that only one group gets created and filled with data. This is the program:

from mpi4py import MPI
import h5py

rank = MPI.COMM_WORLD.Get_rank()
f = h5py.File('parallel_test.hdf5', 'w', driver='mpio', comm=MPI.COMM_WORLD)

data = range(1000)

dset = f.create_dataset(str(rank), data=data)

f.close()

Any thoughts on what is going wrong here?

Many thanks

Recommended Answer

OK, so as mentioned in the comments, I had to create the datasets on every process and then fill them up. In parallel HDF5, operations that modify file metadata (such as creating a group or dataset) are collective, meaning all processes must perform them in the same order; that is why only one dataset showed up before. The following code writes data in parallel, with one dataset per process in the communicator:

from mpi4py import MPI
import h5py
import random

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# Each process generates its own small chunk of data.
data = [random.randint(1, 100) for x in range(4)]

f = h5py.File('parallel_test.hdf5', 'w', driver='mpio', comm=comm)

# Dataset creation is a collective metadata operation:
# every process must create every dataset, in the same order.
dset = []
for i in range(size):
    dset.append(f.create_dataset('test{0}'.format(i), (len(data),), dtype='i'))

# Each process then writes independently, only into its own dataset.
dset[rank][:] = data
f.close()
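
As a quick check, here is a minimal sketch of reading the result back after the MPI run. It assumes the script above is saved as, say, parallel_test.py and launched with something like mpiexec -n 4 python parallel_test.py (the exact launcher depends on your MPI installation). The finished file can then be reopened serially, without the mpio driver:

import h5py

# Reopen the file serially; no MPI driver is needed just to inspect it.
with h5py.File('parallel_test.hdf5', 'r') as f:
    for name in sorted(f):           # one dataset per rank: test0, test1, ...
        print(name, f[name][:])      # the four integers written by that rank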
