IOError: Can't read data (Can't open directory) - Missing gzip compression filter


Question


I have never worked with HDF5 files before, and to get started I received some example files. I've been checking out all the basics with h5py, looking at the different groups in these files, their names, keys, values and so on. Everything works fine, until I want to look at the datasets that are saved in the groups. I get their .shape and .dtype, but when I try accessing a random value by indexing (e.g. grp["dset"][0]), I get the following error:

IOError                                   Traceback (most recent call last)
<ipython-input-45-509cebb66565> in <module>()
      1 print geno["matrix"].shape
      2 print geno["matrix"].dtype
----> 3 geno["matrix"][0]

/home/sarah/anaconda/lib/python2.7/site-packages/h5py/_hl/dataset.pyc in __getitem__(self, args)
    443         mspace = h5s.create_simple(mshape)
    444         fspace = selection._id
--> 445         self.id.read(mspace, fspace, arr, mtype)
    446
    447         # Patch up the output for NumPy

/home/sarah/anaconda/lib/python2.7/site-packages/h5py/h5d.so in h5py.h5d.DatasetID.read (h5py/h5d.c:2782)()

/home/sarah/anaconda/lib/python2.7/site-packages/h5py/_proxy.so in h5py._proxy.dset_rw (h5py/_proxy.c:1709)()

/home/sarah/anaconda/lib/python2.7/site-packages/h5py/_proxy.so in h5py._proxy.H5PY_H5Dread (h5py/_proxy.c:1379)()

IOError: Can't read data (Can't open directory)


I've posted this problem in the h5py Google group, where it was suggested that there might be a filter on the dataset I don't have installed. But the HDF5 file was created using only gzip compression, which should be a portable standard, as far as I understood.
Does someone know what I might be missing here? I can't even find a description of this error or similar problems anywhere, and the file, including the problematic dataset, can be easily opened with the HDFView software.


Edit
Apparently, this error occurs because, for some reason, the gzip compression filter is not available on my system. If I try to create an example file with gzip compression, this happens:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-33-dd7b9e3b6314> in <module>()
      1 grp = f.create_group("subgroup")
----> 2 grp_dset = grp.create_dataset("dataset", (50,), dtype="uint8", chunks=True, compression="gzip")

/home/sarah/anaconda/lib/python2.7/site-packages/h5py/_hl/group.pyc in create_dataset(self, name, shape, dtype, data, **kwds)
     92         """
     93 
---> 94         dsid = dataset.make_new_dset(self, shape, dtype, data, **kwds)
     95         dset = dataset.Dataset(dsid)
     96         if name is not None:

/home/sarah/anaconda/lib/python2.7/site-packages/h5py/_hl/dataset.pyc in make_new_dset(parent, shape, dtype, data, chunks, compression, shuffle, fletcher32, maxshape, compression_opts, fillvalue, scaleoffset, track_times)
     97 
     98     dcpl = filters.generate_dcpl(shape, dtype, chunks, compression, compression_opts,
---> 99                   shuffle, fletcher32, maxshape, scaleoffset)
    100 
    101     if fillvalue is not None:

/home/sarah/anaconda/lib/python2.7/site-packages/h5py/_hl/filters.pyc in generate_dcpl(shape, dtype, chunks, compression, compression_opts, shuffle, fletcher32, maxshape, scaleoffset)
    101 
    102         if compression not in encode:
--> 103             raise ValueError('Compression filter "%s" is unavailable' % compression)
    104 
    105         if compression == 'gzip':

ValueError: Compression filter "gzip" is unavailable
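
A quick way to diagnose this is to ask the HDF5 library (the one h5py is actually linked against) whether the deflate filter is registered at all. The following is a minimal diagnostic sketch, assuming only that h5py imports successfully; `h5py.h5z.filter_avail` is the low-level check behind the `ValueError` above.

```python
# Check whether the gzip (deflate) filter is available in the HDF5
# library that this h5py installation was built against.
import h5py
from h5py import h5z

print("h5py version:", h5py.version.version)
print("HDF5 version:", h5py.version.hdf5_version)

# FILTER_DEFLATE is the registered ID of the gzip/deflate filter;
# filter_avail() returns False exactly when the error above occurs.
print("gzip filter available:", h5z.filter_avail(h5z.FILTER_DEFLATE))
```

If this prints `False`, the problem is the HDF5 build itself (compiled without zlib support) rather than anything in the file or in the h5py code.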


Does anyone have experience with this? The installation of the HDF5 library, as well as of the h5py package, did not appear to report any errors...

Answer


Can't just comment - reputation too low.


I had the same issue; simply running `conda update anaconda` fixed it.
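
After updating, the failing `create_dataset` call from the question should succeed. A minimal round-trip sketch to verify the fix (the file name `check.h5` is hypothetical; it assumes h5py and NumPy are installed and the working directory is writable):

```python
# Create a gzip-compressed dataset and read it back - this is the
# operation that raised ValueError before the update.
import numpy as np
import h5py

data = np.arange(50, dtype="uint8")

with h5py.File("check.h5", "w") as f:
    grp = f.create_group("subgroup")
    grp.create_dataset("dataset", data=data, chunks=True, compression="gzip")

with h5py.File("check.h5", "r") as f:
    dset = f["subgroup/dataset"]
    print("compression:", dset.compression)       # should report "gzip"
    print("round-trip ok:", (dset[:] == data).all())
```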
