Close an open h5py data file
Question
In our lab we store our data in hdf5 files through the python package h5py.
At the beginning of an experiment we create an hdf5 file and store array after array of data in the file (among other things). When an experiment fails or is interrupted, the file is not closed correctly. Because our experiments run from iPython, the reference to the data object remains (somewhere) in memory.
Is there a way to scan for all open h5py data objects and close them?
Answer
This is how it could be done (I could not figure out how to check whether a file is closed without relying on exceptions; maybe you will find a way):
import gc
import h5py

for obj in gc.get_objects():       # Browse through ALL objects tracked by the gc
    if isinstance(obj, h5py.File): # Just HDF5 files
        try:
            obj.close()
        except Exception:
            pass                   # Was already closed
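On the parenthetical above: h5py's high-level objects are truthy while open and falsy once closed (their underlying HDF5 identifier becomes invalid), which gives a way to test for closed-ness without relying on exceptions. A minimal sketch, assuming this truth-value behaviour of `h5py.File`; the function name and the returned list are my own additions:

```python
import gc

import h5py


def close_open_h5_files():
    """Close every h5py.File object the garbage collector still tracks."""
    closed = []
    for obj in gc.get_objects():
        if isinstance(obj, h5py.File):
            try:
                if obj:  # an open File is truthy, a closed one is falsy
                    obj.close()
                    closed.append(obj)
            except Exception:
                pass     # object was in a broken state; skip it
    return closed
```

Returning the list of files that were actually closed makes it easy to log which handles an interrupted run leaked.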
Another idea: depending on how you use the files, what about using the context manager and the with keyword, like this?
with h5py.File("some_path.h5") as f:
    f["data1"] = some_data
When the program flow exits the with-block, the file is closed regardless of what happens, including exceptions.
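The with-block is equivalent to a try/finally that closes the file in the finally clause; spelling it out makes the guarantee explicit. A small sketch, where the path and data are placeholders of my own:

```python
import os
import tempfile

import h5py

path = os.path.join(tempfile.mkdtemp(), "run.h5")  # placeholder path
some_data = [1, 2, 3]                              # placeholder data

# What the with-block above does, written out by hand:
f = h5py.File(path, "w")
try:
    f["data1"] = some_data
finally:
    f.close()  # runs even if the assignment above raises
```

The with form is preferable in practice, since it is shorter and cannot forget the close.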
That concludes this article on closing an open h5py data file. We hope the recommended answer helps, and thanks for supporting IT屋!