How do I get Zlib to uncompress from S3 stream in Ruby?


Question

Ruby's Zlib::GzipReader should be created by passing it an IO-like object (it must have a read method that behaves the same as IO#read).

My problem is that I can't get this IO-like object from the AWS::S3 lib. As far as I know, the only way of getting a stream from it is by passing a block to S3Object#stream.
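For reference, a minimal sketch of that contract (gzipped_bytes is a placeholder String of gzip data, not something from the original question): anything with an IO#read-compatible read method, such as a StringIO, can be handed to Zlib::GzipReader.

require 'zlib'
require 'stringio'

# StringIO implements an IO#read-compatible read, so GzipReader accepts it.
# gzipped_bytes is assumed to already hold the gzip-compressed data.
gz = Zlib::GzipReader.new(StringIO.new(gzipped_bytes))
puts gz.read
gz.close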

I've tried:

Zlib::GzipReader.new(AWS::S3::S3Object.stream('file', 'bucket'))
# Which gives me the error: undefined method `read' for #<AWS::S3::S3Object::Value:0x000000017cbe78>

Does anybody know how I can achieve this?

Answer

A simple solution would be to write the downloaded data to a StringIO, then read it back out:

require 'stringio'
require 'zlib'

io = StringIO.new
io.write AWS::S3::S3Object.value('file', 'bucket')
io.rewind  # move back to the start so GzipReader reads from the beginning

gz = Zlib::GzipReader.new(io)
data = gz.read
gz.close
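If the whole object is going to sit in memory anyway, a shorter variant is possible on Ruby 2.4 or newer, which added Zlib.gunzip (a sketch, not part of the original answer):

require 'zlib'

# Zlib.gunzip (Ruby >= 2.4) inflates a complete gzip string in one call
data = Zlib.gunzip(AWS::S3::S3Object.value('file', 'bucket'))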

# do something with data ...

A more elaborate way would be to start inflating the gzipped data while the stream is still downloading, which can be achieved with an IO.pipe. Something along these lines:

reader, writer = IO.pipe

fork do
  # child process: download the object and push chunks into the pipe
  reader.close
  AWS::S3::S3Object.stream('file', 'bucket') do |chunk|
    writer.write chunk
  end
end

# parent process: close its copy of the write end so the reader sees EOF
writer.close

gz = Zlib::GzipReader.new(reader)
while line = gz.gets
  # do something with line ...
end

gz.close
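If zombie children are a concern in a long-running process, the parent can also reap the forked child once it is done reading (a small optional addition, assuming no other child processes are in play):

Process.wait  # reap the forked child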

You can also use a Thread instead of fork:

reader, writer = IO.pipe

thread = Thread.new do
  AWS::S3::S3Object.stream('file', 'bucket') do |chunk|
    writer.write chunk
  end
  writer.close  # close the write end inside the thread so GzipReader sees EOF
end

gz = Zlib::GzipReader.new(reader)
while line = gz.gets
  # do something with line
end

gz.close
thread.join

