gzip table to csv on AWS instance, prior to downloading / saving locally with psql


Question


I'm new to PostgreSQL and the psql CLI. My bandwidth is extremely limited, which makes downloading each table from an AWS instance take hours; each table is 1-5 GB. The current command I use, after logging into the DB with psql, is:

\copy (SELECT * FROM table) TO table.csv CSV DELIMITER ','


Is it possible to run a query like the one above that actually zips the CSV file ON the Amazon PostgreSQL instance, prior to downloading and saving it locally, thus reducing the 1-5 GB downloads to under 1 GB and significantly reducing the download times?

Something like this:

\copy (SELECT * FROM table) TO csv.zip CSV DELIMITER ',' TO table.csv.zip


I came across this gist, but the commands listed appear to be a complete dump of all tables / the entire db. I would like the ability to do the same for tables and subset queries.

EDIT: Solution, after logging into psql:

\copy (SELECT * FROM table) TO PROGRAM 'gzip > Users/username/folder/folder/my_table.gz' DELIMITER ','
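
Worth noting: psql's \copy is a frontend (client-side) copy, so the gzip in this solution runs on the local machine after the rows have already crossed the network; only a server-side COPY, as in the answer below, compresses the data on the AWS host before transfer. A minimal variant of the same client-side command, assuming a hypothetical local path and trading extra CPU for a smaller file via gzip's maximum compression level:

\copy (SELECT * FROM table) TO PROGRAM 'gzip -9 > /tmp/my_table.gz' DELIMITER ','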

Answer


Use psql and STDOUT. This command returns the output to the client and compresses it:

psql yourdb -c "\COPY (SELECT * FROM table) TO STDOUT;" | gzip > output.gz
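
For reference, the compressed stream can be inspected or unpacked locally with the usual gzip tools (assuming they are installed on the client); note that without CSV options, COPY's default text format is tab-delimited:

gunzip -c output.gz | head   # peek at the first rows without decompressing
gunzip output.gz             # unpack to a file named output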


Or directly at the database server (also into a compressed file), using a client of your choice:

COPY (SELECT * FROM table) TO PROGRAM 'gzip > /var/lib/postgresql/my_table.gz' DELIMITER ',';
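
A caveat on the server-side form: COPY ... TO PROGRAM runs the command as the operating-system user of the PostgreSQL server process and requires superuser rights (or, on PostgreSQL 11+, the pg_execute_server_program role), so it is available on a self-managed EC2 instance but generally not on managed services such as RDS. The compressed file then still needs to be fetched once, for example with scp (hypothetical host name):

scp postgres@your-aws-host:/var/lib/postgresql/my_table.gz .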
