Compress file on S3


Problem description

I have a 17.7GB file on S3. It was generated as the output of a Hive query, and it isn't compressed.

I know that by compressing it, it'll be about 2.2GB (gzip). How can I download this file locally as quickly as possible when transfer is the bottleneck (250kB/s)?

I've not found any straightforward way to compress the file on S3, or enable compression on transfer in s3cmd, boto, or related tools.

Recommended answer

S3 does not support stream compression, nor is it possible to compress the uploaded file remotely.

If this is a one-time process, I suggest downloading it to an EC2 machine in the same region, compressing it there, then uploading it to your destination.

<一个href="http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/EC2_GetStarted.html">http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/EC2_GetStarted.html

If you need this more frequently:

<一个href="http://stackoverflow.com/questions/5442011/serving-gzipped-css-and-javascript-from-amazon-cloudfront-via-s3">Serving通过S3
