Compression of existing file using h5py


Question


I'm currently working on a project regarding compression of HDF5 datasets and recently began using h5py. I followed the basic tutorials and was able to open, create, and compress a file while it was being created. However, I've been unsuccessful when it comes to compressing an existing file (which is the aim of my work).

I've tried opening files using 'r+' and then compressing chunked datasets, but the file sizes have remained the same.

Any suggestions on what commands to use or am I going about things the wrong way?

Solution

The HDF Group provides a set of tools to convert, display, analyze, edit, and repack your HDF5 files.

You can compress an existing HDF5 file using the h5repack utility. You can also change the chunk size with the same utility.

h5repack can be used from the command line:

h5repack file1 file2                              // repacks file1, reclaiming unused space, and saves the result as file2

h5repack -v -l CHUNK=1024 file1 file2             // applies a chunk size of 1024 to the datasets in file1

h5repack -v -l CHUNK=1024 -f GZIP=5 file1 file2   // chunks the datasets at 1024 and compresses them with GZIP level 5

h5repack --help                                   // prints the available help documentation

Detailed documentation is also available.
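If you want the same effect from Python itself rather than shelling out to h5repack, a minimal h5py sketch is below. Note that h5py cannot add compression to a dataset in place; like h5repack, it has to write the datasets into a new file with the filter enabled at creation time. The file names, dataset name, and chunk/compression settings here are illustrative assumptions.

```python
import h5py
import numpy as np

# Build a sample uncompressed file standing in for the existing one
# ("original.h5" and the dataset name "data" are illustrative).
with h5py.File("original.h5", "w") as f:
    f.create_dataset("data", data=np.arange(100_000, dtype="f8"))

# Copy every dataset into a fresh file, this time created with
# chunking and GZIP compression enabled.
with h5py.File("original.h5", "r") as src, \
     h5py.File("packed.h5", "w") as dst:
    for name, dset in src.items():   # root-level datasets only;
        dst.create_dataset(          # nested groups would need visititems()
            name,
            data=dset[...],
            chunks=(1024,),
            compression="gzip",
            compression_opts=5,      # GZIP level 5, matching the h5repack example
        )
```

Because the compressed file is written from scratch, any free space carried along in the original is also left behind, so for compressible data `packed.h5` ends up noticeably smaller than `original.h5`.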

