How to insert big data in Laravel?


Question

I am using Laravel 5.6.

My script to insert big data is like this:

...
$insert_data = [];
foreach ($json['value'] as $value) {
    $posting_date = Carbon::parse($value['Posting_Date']);
    $posting_date = $posting_date->format('Y-m-d');
    $data = [
        'item_no'                   => $value['Item_No'],
        'entry_no'                  => $value['Entry_No'], 
        'document_no'               => $value['Document_No'],
        'posting_date'              => $posting_date,
        ....
    ];
    $insert_data[] = $data;
}
\DB::table('items_details')->insert($insert_data);

I have tried inserting 100 records with this script, and it works; the data is inserted successfully.

But if I try to insert 50,000 records, it becomes very slow. I waited about 10 minutes and it did not finish; instead there is an error like this:

504 Gateway Time-out

How can I solve this problem?

Answer

As stated, chunks won't really help you in this case if it is an execution-time problem. I think the bulk insert you are trying to use cannot handle that amount of data in a single statement, so I see two options:

1 - Reorganise your code to properly use chunks; this will look something like this:

$insert_data = [];

foreach ($json['value'] as $value) {
    $posting_date = Carbon::parse($value['Posting_Date']);

    $posting_date = $posting_date->format('Y-m-d');

    $data = [
        'item_no'                   => $value['Item_No'],
        'entry_no'                  => $value['Entry_No'], 
        'document_no'               => $value['Document_No'],
        'posting_date'              => $posting_date,
        ....
    ];

    $insert_data[] = $data;
}

$insert_data = collect($insert_data); // Make a collection to use the chunk method

// it will chunk the dataset in smaller collections containing 500 values each. 
// Play with the value to get best result
$chunks = $insert_data->chunk(500);

foreach ($chunks as $chunk) {
    \DB::table('items_details')->insert($chunk->toArray());
}

This way each bulk insert will contain less data and can be processed rather quickly.
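
If holding all 50,000 rows in memory before chunking is itself a concern, a minimal variation (my sketch, not part of the original answer) is to flush each batch as soon as it fills up inside the loop:

$batch = [];

foreach ($json['value'] as $value) {
    $batch[] = [
        'item_no'      => $value['Item_No'],
        'entry_no'     => $value['Entry_No'],
        'document_no'  => $value['Document_No'],
        'posting_date' => Carbon::parse($value['Posting_Date'])->format('Y-m-d'),
    ];

    // Flush every 500 rows so only one batch is held in memory at a time.
    if (count($batch) === 500) {
        \DB::table('items_details')->insert($batch);
        $batch = [];
    }
}

// Insert whatever is left over after the loop.
if (!empty($batch)) {
    \DB::table('items_details')->insert($batch);
}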

2 - In case your host supports runtime overloads, you can add a directive right before the code starts to execute:

ini_set('max_execution_time', 120); // time in seconds

$insert_data = [];

foreach ($json['value'] as $value)
{
   ...
}

To read more, go to the official PHP docs for max_execution_time. Note that a 504 Gateway Time-out is raised by the web server or proxy in front of PHP, so you may need to raise its timeout as well.
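
As a further option (my addition, not part of the original answer; the job class name ImportItemDetails is hypothetical), work this heavy is usually moved out of the request cycle entirely. Dispatching a queued job lets the HTTP response return immediately, so no gateway timeout is hit:

// app/Jobs/ImportItemDetails.php -- a hypothetical queued job
// (generate the skeleton with `php artisan make:job ImportItemDetails`).
namespace App\Jobs;

use Carbon\Carbon;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class ImportItemDetails implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public $values;

    public function __construct(array $values)
    {
        $this->values = $values;
    }

    public function handle()
    {
        // Same chunked insert as above, now running on a queue worker
        // with no web-server timeout in front of it.
        collect($this->values)
            ->map(function ($value) {
                return [
                    'item_no'      => $value['Item_No'],
                    'entry_no'     => $value['Entry_No'],
                    'document_no'  => $value['Document_No'],
                    'posting_date' => Carbon::parse($value['Posting_Date'])->format('Y-m-d'),
                ];
            })
            ->chunk(500)
            ->each(function ($chunk) {
                \DB::table('items_details')->insert($chunk->toArray());
            });
    }
}

// In the controller, dispatch the job instead of inserting inline:
ImportItemDetails::dispatch($json['value']);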
