python subprocess and mysqldump


Problem description

I know parts of this question have been asked before, but I have some related questions.

I'm trying to execute:

mysqldump -u uname -ppassword --add-drop-database --databases databaseName | gzip > fileName

I'm potentially dumping a very large (200GB?) db. Is that in itself a dumb thing to do? I then want to send the zipped file over the network for storage, delete the local dump, and purge a couple of tables.

Anyway, I was using subprocess like this, because there doesn't seem to be a way to execute the entire original call without subprocess treating | as a table name:

from subprocess import Popen, PIPE

f = open(FILENAME, 'wb')
args = ['mysqldump', '-u', 'UNAME', '-pPASSWORD', '--add-drop-database', '--databases', 'DB']

p1 = Popen(args, stdout=PIPE)                  # mysqldump writes into a pipe
p2 = Popen('gzip', stdin=p1.stdout, stdout=f)  # gzip reads the pipe and writes to the file
p2.communicate()

But then I read that communicate() buffers the data in memory, which wouldn't work for me. Is this true?

What I've ended up doing for now is:

import gzip
import subprocess

# dump to an uncompressed file first...
f = open(filename, 'wb')
subprocess.call(args, stdout=f)
f.close()

# ...then re-read the dump and compress it in Python
f = open(filename, 'rb')
zipFilename = filename + '.gz'
f2 = gzip.open(zipFilename, 'wb')
f2.writelines(f)
f2.close()
f.close()

Of course this takes a million years, and I hate it.
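As an aside, part of the slowdown here is writing the full dump to disk uncompressed and then re-reading it; within the pure-Python step, a chunked copy via shutil.copyfileobj is usually somewhat faster than copying line by line with writelines. A minimal sketch, assuming filename holds the uncompressed dump as above:

import gzip
import shutil

# sketch only: compress the existing dump file in fixed-size chunks
# instead of line by line; `filename` is assumed to hold the dump
src = open(filename, 'rb')
dst = gzip.open(filename + '.gz', 'wb')
shutil.copyfileobj(src, dst, 1024 * 1024)  # copy in 1 MiB chunks
dst.close()
src.close()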

My Questions:

1. Can I use my first approach on a very large db?
2. Could I possibly pipe the output of mysqldump to a socket and fire it across the network and save it when it arrives, rather than sending a zipped file?

Thanks!

Recommended answer

You don't need communicate(). It's only there as a convenience method if you want to read stdout/stderr to completion. But since you are chaining the commands, they are doing that for you. Just wait for them to complete.

from subprocess import Popen, PIPE

args = ['mysqldump', '-u', 'UNAME', '-pPASSWORD', '--add-drop-database', '--databases', 'DB']

with open(FILENAME, 'wb', 0) as f:
    # mysqldump writes into a pipe; gzip reads that pipe and writes to the file
    p1 = Popen(args, stdout=PIPE)
    p2 = Popen('gzip', stdin=p1.stdout, stdout=f)
p1.stdout.close()  # force write error (/SIGPIPE) if p2 dies
p2.wait()
p1.wait()
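A hedged follow-up sketch: once both wait() calls return, the exit codes tell you whether the dump and the compression actually succeeded, which is worth checking before deleting anything locally.

# sketch only: p1 and p2 are the Popen objects from the snippet above;
# wait() has already populated their returncode attributes
if p1.returncode != 0 or p2.returncode != 0:
    raise RuntimeError('pipeline failed: mysqldump=%r, gzip=%r'
                       % (p1.returncode, p2.returncode))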
