Automated or regular backup of mysql data


Problem description

I want to take regular backups of some tables in my MySQL database using <insert favorite PHP framework here> / plain PHP / my second favorite language. I want it to be automated so that the backup can be restored later in case something goes wrong.

I tried executing a query and saving the results to a file. I ended up with code that looks somewhat like this:

$sql = 'SELECT * FROM my_table ORDER BY id DESC';
$result = mysqli_query( $connect, $sql );  
if( mysqli_num_rows( $result ) > 0){

    $output=fopen('/tmp/dumpfile.csv','w+');

    /* loop through the result set and write each row to the file;
       fetch_assoc is used so each column appears only once in the CSV */
    while( $row = mysqli_fetch_assoc( $result ) ) {
        fputcsv( $output, $row, ',', '"');
    }

    fclose( $output );
}

I set up a cron job on my local machine to hit the web page with this code. I also tried writing a cron job on the server to run the script from the CLI. But it's causing all sorts of problems. These include:


  1. Sometimes the data is inconsistent

  2. The file sometimes seems to be incomplete

  3. Sometimes the script times out

I have also heard about mysqldump. I tried to run it with exec but it produced an error.

How can I solve this problem?

Recommended answer

CSV and SELECT INTO OUTFILE



http://dev.mysql.com/doc/refman/5.7/en/select-into.html


SELECT ... INTO OUTFILE writes the selected rows to a file. Column and line terminators can be specified to produce a specific output format.

Here is a complete example:

SELECT * FROM my_table INTO OUTFILE '/tmp/my_table.csv'
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  LINES TERMINATED BY '\n';

The file is saved on the server, so the chosen path needs to be writable by the MySQL server. Though this query can be executed through PHP and a web request, it is best executed through the mysql console.

Data exported in this manner can be imported into another database using LOAD DATA INFILE.
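For instance, a file produced by the SELECT ... INTO OUTFILE statement above could be reloaded with a statement along these lines (a sketch only: it assumes the target table my_table already exists with matching columns and that the file is readable by the MySQL server, subject to settings such as secure_file_priv):

-- Reload the CSV produced above into an existing table with the same columns
LOAD DATA INFILE '/tmp/my_table.csv'
  INTO TABLE my_table
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  LINES TERMINATED BY '\n';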

While this method is superior to iterating through a result set and saving it to a file row by row, it's not as good as using mysqldump.

mysqldump is superior to SELECT INTO OUTFILE in many ways; producing CSV is just one of the many things this command can do.


The mysqldump client utility performs logical backups, producing a set of SQL statements that can be executed to reproduce the original database object definitions and table data. It dumps one or more MySQL databases for backup or transfer to another SQL server. The mysqldump command can also generate output in CSV, other delimited text, or XML format.

Ideally mysqldump should be invoked from your shell. It is possible to use exec in PHP to run it, but since producing the dump might take a long time depending on the amount of data, and PHP scripts usually run for only 30 seconds, you would need to run it as a background process.
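As a minimal sketch of the shell approach, a cron entry such as the following could produce a nightly dump (the user, password, database name and paths are illustrative; --single-transaction gives a consistent snapshot of InnoDB tables without locking them):

# m h dom mon dow  command -- run every night at 02:30
30 2 * * * mysqldump --single-transaction --user=backup_user --password='secret' my_database | gzip > /var/backups/my_database-$(date +\%F).sql.gz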

mysqldump is not without its fair share of problems.


It is not intended as a fast or scalable solution for backing up substantial amounts of data. With large data sizes, even if the backup step takes a reasonable time, restoring the data can be very slow because replaying the SQL statements involves disk I/O for insertion, index creation, and so on.

For a classic example, see this question: Server crash on MySQL backup using python, where one mysqldump seems to start before the earlier one has finished, rendering the website completely unresponsive.


Replication enables data from one MySQL database server (the master) to be copied to one or more MySQL database servers (the slaves). Replication is asynchronous by default; slaves do not need to be connected permanently to receive updates from the master. Depending on the configuration, you can replicate all databases, selected databases, or even selected tables within a database.

Thus replication operates differently from SELECT INTO OUTFILE or mysqldump. It's ideal for keeping the data in the local copy almost up to date (it would be perfectly in sync were it not for something called slave lag). On the other hand, if you use a scheduled task to run mysqldump once every 24 hours, imagine what can happen if the server crashes after 23 hours.

Each time you run mysqldump you produce a large amount of data; keep doing it regularly and you will find your hard disk filling up or your file storage bills hitting the roof. With replication, only the changes are passed on to the server (by using the so-called binlog).

An alternative to replication is to use Percona XtraBackup.


Percona XtraBackup is an open-source hot backup utility for MySQL-based servers that doesn't lock your database during the backup.

Though made by Percona, it is compatible with MySQL and MariaDB. It can do incremental backups, the lack of which is the biggest limitation of mysqldump.
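As a sketch of what an incremental workflow might look like (the directories are illustrative, and connection options such as --user and --password may be needed):

# Full (base) backup, then an incremental backup containing only the changes since it
xtrabackup --backup --target-dir=/data/backups/base
xtrabackup --backup --target-dir=/data/backups/inc1 \
  --incremental-basedir=/data/backups/base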
