Opening already opened hdf5 file in write mode, using h5py


Problem description

I run the same Python program concurrently as multiple processes, all of which want to write to the same hdf5 file using the h5py Python package. However, only a single process may open a given hdf5 file in write mode; otherwise you will get the error

OSError: Unable to open file (unable to lock file, errno = 11, error message = 'Resource temporarily unavailable')

During handling of the above exception, another exception occurred:

OSError: Unable to create file (unable to open file: name = 'test.hdf5', errno = 17, error message = 'File exists', flags = 15, o_flags = c2)

I want to resolve this by checking whether the file is already opened in write mode, and if so, waiting a bit and checking again until it is no longer open in write mode. I have not found any such checking capability in h5py or hdf5. My solution so far is based on this:

from time import sleep
import h5py

# Function handling the intelligent hdf5 file opening
def open_hdf5(filename, *args, **kwargs):
    while True:
        try:
            hdf5_file = h5py.File(filename, *args, **kwargs)
            break  # Success!
        except OSError:
            sleep(5)  # File locked by another process; wait and retry
    return hdf5_file

# How to use the function
with open_hdf5(filename, mode='a') as hdf5_file:
    # Do stuff
    ...

I'm unsure whether I like this, as it doesn't seem very gentle. Is there a better way of doing this? Is there any chance that my erroneous attempts to open the file inside the try can somehow corrupt the write process that is going on in the other process?

Answer

Judging by a quick search, there is no platform-independent way of checking whether a file is already open in write mode. See "How to check whether a file is_open and the open_status in python": https://bytes.com/topic/python/answers/612924-how-check-whether-file-open-not

However, since you have already defined a wrapper around opening your hdf5 file for reading and writing, you can always create a "file_name".lock file when one process succeeds in opening the hdf5 file.

Then all you have to do is use os.path.exists('"file_name".lock') to know whether you can open the file in write mode.

Essentially it is not very different from what you do. However, first, you can look in your filesystem to see whether one of your processes is accessing the file in write mode; second, the test is not the product of an exception, since os.path.exists returns a boolean.

Many applications use this kind of trick. When roaming through a CVS repo you often see .lock files lying around...
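A minimal sketch of the lock-file idea, assuming the convention of a companion file named filename + '.lock'. Instead of a separate os.path.exists() check followed by a create (which leaves a small race window between two processes), it uses os.open with O_CREAT | O_EXCL, which creates the lock file atomically and fails if it already exists:

```python
import os
from contextlib import contextmanager
from time import sleep

@contextmanager
def exclusive_lock(filename, poll_interval=5):
    """Hold filename + '.lock' for the duration of the with-block."""
    lockfile = filename + '.lock'
    while True:
        try:
            # Atomic create-if-absent: fails with FileExistsError
            # if another process already holds the lock.
            fd = os.open(lockfile, os.O_CREAT | os.O_EXCL)
            os.close(fd)
            break
        except FileExistsError:
            sleep(poll_interval)  # Lock held elsewhere; wait and retry
    try:
        yield
    finally:
        os.remove(lockfile)  # Release the lock even if the body raises

# Usage with h5py, as in the question:
# with exclusive_lock('test.hdf5'):
#     with h5py.File('test.hdf5', mode='a') as hdf5_file:
#         ...  # Do stuff
```

Note that this only coordinates processes that all agree to use the same wrapper; a process that opens the hdf5 file directly bypasses the lock. If a locking process crashes before the finally block runs, the stale .lock file must be removed by hand.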

