How to insert big data in Laravel?


Question

I am using Laravel 5.6.

My script to insert big data is like this:

// ...
$insert_data = [];
foreach ($json['value'] as $value) {
    $posting_date = Carbon::parse($value['Posting_Date']);
    $posting_date = $posting_date->format('Y-m-d');
    $data = [
        'item_no'                   => $value['Item_No'],
        'entry_no'                  => $value['Entry_No'],
        'document_no'               => $value['Document_No'],
        'posting_date'              => $posting_date,
        // ...
    ];
    $insert_data[] = $data;
}
DB::table('items_details')->insert($insert_data);

I have tried to insert 100 records with the script, and it works: the data is inserted successfully.

But if I try to insert 50,000 records with the script, it becomes very slow. I've waited about 10 minutes and it did not work. I get an error like this:

504 Gateway Time-out

How can I solve this problem?

Answer

As was stated, chunks won't really help you in this case if it is an execution-time problem. I think the bulk insert you are trying to use cannot handle that amount of data, so I see two options:

1 - Reorganise your code to properly use chunks; it will look something like this:

$insert_data = [];

foreach ($json['value'] as $value) {
    $posting_date = Carbon::parse($value['Posting_Date']);

    $posting_date = $posting_date->format('Y-m-d');

    $data = [
        'item_no'                   => $value['Item_No'],
        'entry_no'                  => $value['Entry_No'],
        'document_no'               => $value['Document_No'],
        'posting_date'              => $posting_date,
        // ...
    ];

    $insert_data[] = $data;
}

$insert_data = collect($insert_data); // Make a collection to use the chunk method

// It will chunk the dataset into smaller collections containing 500 values each.
// Play with the value to get the best result.
$chunks = $insert_data->chunk(500);

foreach ($chunks as $chunk) {
    DB::table('items_details')->insert($chunk->toArray());
}

This way each bulk insert will contain less data, and the database will be able to process it rather quickly.
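As a side note beyond the original answer: on MySQL-like databases, wrapping the chunked inserts in a single transaction can cut commit overhead, since the server commits once at the end instead of once per chunk. A minimal sketch, reusing the $insert_data collection built above:

DB::transaction(function () use ($insert_data) {
    // One commit for all chunks instead of one per insert statement.
    foreach ($insert_data->chunk(500) as $chunk) {
        DB::table('items_details')->insert($chunk->toArray());
    }
});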

2 - In case your host supports runtime overloads, you can add a directive right before the code starts to execute:

ini_set('max_execution_time', 120); // time in seconds

$insert_data = [];

foreach ($json['value'] as $value) {
    // ...
}

To read more, go to the official docs.
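One more note, as an assumption beyond the original answer: a 504 is the gateway (e.g. nginx) timing out the HTTP request, so raising PHP's max_execution_time alone may not be enough. Moving the import into a queued job runs it outside the request entirely. A minimal sketch for Laravel 5.6; the ImportItemsDetails class name is hypothetical:

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\DB;

class ImportItemsDetails implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    private $rows;

    // $rows is the plain $insert_data array built in the question's script.
    public function __construct(array $rows)
    {
        $this->rows = $rows;
    }

    public function handle()
    {
        // Same chunking idea as option 1, but with no HTTP timeout in play.
        foreach (array_chunk($this->rows, 500) as $chunk) {
            DB::table('items_details')->insert($chunk);
        }
    }
}

Dispatch it from the controller with ImportItemsDetails::dispatch($insert_data); a queue worker (php artisan queue:work) then processes the insert in the background.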

