Copying million rows of data from one database to another on laravel


Question


Using Laravel Eloquent, I'm copying 7 million rows of data from one table on my old MySQL database and inserting those rows into different tables on my new MySQL database. The problem is that this took almost a day to perform, and I need to repeat it for almost 80M rows. I'm processing chunks of 1000 rows at a time. Is there any way to do it more efficiently? Here is my code:

    DB::connection('oldDataBase')->table('tableToCopy')->chunk(1000, function ($AllData) {
        foreach ($AllData as $Data) {
            DB::connection('newDataBase')->table('table1')->insert([
                'column1' => $Data->columnToCopy,
                // etc..
            ]);
            DB::connection('newDataBase')->table('table2')->insert([
                'column1' => $Data->columnToCopy,
                // etc..
            ]);
            DB::connection('newDataBase')->table('table3')->insert([
                'column1' => $Data->columnToCopy,
                // etc..
            ]);
        }
    });
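If the copy must stay inside Laravel, one common speed-up is to build each chunk into a single multi-row `insert()` per target table instead of issuing three single-row INSERTs per source row. The sketch below is a hedged illustration, not the asker's exact schema: the `id` ordering column and the `column1`/`columnToCopy` names are placeholders.

```php
// Sketch: one multi-row INSERT per table per chunk (≈3 queries per 1000 rows
// instead of ≈3000). Assumes the source table has an `id` column to order by.
DB::connection('oldDataBase')->table('tableToCopy')->orderBy('id')
    ->chunk(1000, function ($allData) {
        $rows1 = [];
        $rows2 = [];
        $rows3 = [];
        foreach ($allData as $data) {
            $rows1[] = ['column1' => $data->columnToCopy /* , ... */];
            $rows2[] = ['column1' => $data->columnToCopy /* , ... */];
            $rows3[] = ['column1' => $data->columnToCopy /* , ... */];
        }
        // Laravel's insert() accepts an array of rows and emits one
        // INSERT ... VALUES (...), (...), ... statement per call.
        DB::connection('newDataBase')->table('table1')->insert($rows1);
        DB::connection('newDataBase')->table('table2')->insert($rows2);
        DB::connection('newDataBase')->table('table3')->insert($rows3);
    });
```

This cuts network round trips by roughly three orders of magnitude per chunk, though for 80M rows the bulk-load approach in the answer below it will still be far faster.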

Answer


Doing this data migration from an SQL client like Laravel is not a good idea.


If I had to move 80M rows, I'd take the following steps:

  1. Dump them in CSV form.
  2. Split the CSV files into chunks of something like 500K rows each.
  3. Create the tables on the target system.
  4. Disable all constraints and indexes on the target system.
  5. Use LOAD DATA INFILE to slurp up the CSV files one after the other. For fastest results, run this from the mysql or mysqlimport command-line client program on the same machine as the MySQL server.
  6. Re-enable the constraints and build the indexes.
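The dump/split/load steps above can be sketched in shell. This is a hedged illustration: the database and table names (`old_db`, `new_db`, `tableToCopy`, `table1`) are placeholders, the server-dependent commands are shown commented out, and the dump is simulated with generated data so the split step is runnable.

```shell
# Step 1: dump the source table as CSV (needs a live server, so shown commented):
# mysql --batch -e "SELECT * FROM tableToCopy" old_db | tr '\t' ',' > all_rows.csv
# Simulate a 1M-row dump so the split step below actually runs:
seq 1 1000000 | awk '{print $1 ",value" $1}' > all_rows.csv

# Step 2: split into chunks of 500K rows each (chunk_aa, chunk_ab, ...):
split -l 500000 all_rows.csv chunk_
ls chunk_* | wc -l   # 2 chunks for 1M rows

# Step 5: load each chunk into the target (needs a live server, shown commented):
# for f in chunk_*; do
#   mysql new_db -e "SET foreign_key_checks=0;
#     LOAD DATA LOCAL INFILE '$f' INTO TABLE table1 FIELDS TERMINATED BY ',';"
# done
```

Splitting the file keeps each LOAD DATA transaction a manageable size, so a failure partway through costs at most one chunk of work.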


I'd test this extensively before migration day. I'd do things like load the first and last chunks of CSV and then re-enable the indexes and constraints.


Another possibility was suggested in a comment: use mysqldump, then load the resulting file via the mysql client program.
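That alternative can be sketched as a two-command recipe. This is a hedged example, not a tested pipeline: the `old_db`/`new_db` names are placeholders and both commands need live servers to run.

```shell
# Dump the table; --single-transaction avoids locking InnoDB tables for the
# whole run, and --quick streams rows instead of buffering them in memory:
mysqldump --single-transaction --quick old_db tableToCopy > tableToCopy.sql

# Load the dump on the target via the command-line client:
mysql new_db < tableToCopy.sql
```

The dump file will be large for 80M rows, which is exactly why the answer below warns against feeding it through a GUI client.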


Avoid the use of a gui-style mysql client program for this. Stick with the command-line programs. As good as those GUI clients are, they aren't engineered for streaming in multi-tens-of-megabyte .SQL files.

