Compress large file in ruby with Zlib for gzip

Question

I have a very large file, approx. 200 million rows of data.

I would like to compress it with the Zlib library, specifically using the Writer.

Reading through each line one at a time seems like it would take quite a bit of time. Is there a better way to accomplish this?

Here is what I have right now:

require 'zlib'

Zlib::GzipWriter.open('compressed_file.gz') do |gz|
  File.open(large_data_file).each do |line|
    gz.write line
  end
  gz.close
end

Answer

You can use IO#read to read a chunk of arbitrary length from the file.

require 'zlib'

Zlib::GzipWriter.open('compressed_file.gz') do |gz|
  File.open(large_data_file, 'rb') do |fp|
    # Read fixed-size chunks until EOF; fp.read returns nil at end of file.
    while chunk = fp.read(16 * 1024)
      gz.write chunk
    end
  end
  # No explicit gz.close needed: the block form of GzipWriter.open
  # closes the stream automatically.
end

This will read the source file in 16 KB chunks and add each compressed chunk to the output stream. Adjust the chunk size to your preference based on your environment.
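As an alternative sketch, Ruby's `IO.copy_stream` can perform the buffered copy loop for you, since `Zlib::GzipWriter` responds to `write`. The file names below are placeholders, and the sample source file is created only so the example runs end to end:

```ruby
require 'zlib'

# Placeholder paths for illustration; substitute your real file names.
src_path = 'large_data_file'
dst_path = 'compressed_file.gz'

# Create a tiny sample source so the sketch runs end to end.
File.write(src_path, "some data\n" * 100)

# IO.copy_stream performs the buffered read/write loop internally,
# so no Ruby-level per-line or per-chunk iteration is needed.
Zlib::GzipWriter.open(dst_path) do |gz|
  File.open(src_path, 'rb') do |fp|
    IO.copy_stream(fp, gz)
  end
end

# Round-trip check: decompress and compare with the original.
same = Zlib::GzipReader.open(dst_path) { |gz| gz.read } == File.binread(src_path)
puts same
```

This keeps the loop in C rather than Ruby, which tends to be at least as fast as a hand-written chunked loop; the compression work in zlib itself dominates either way.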
