PHP PDO insert multiple (10000+) same rows using bindParam. Good practice?


I need to be able to insert 10'000+ similar rows at once from a form request. Currently I do it with a one-row prepared statement looped 10'000 times, re-binding every parameter on each iteration:

for ($i=0; $i < intval($cloneCount); $i++) 
{
    ... 9 other bindParam
    $insertG->bindParam(':v1', $v1, PDO::PARAM_STR);
    $insertG->bindParam(':v2', $v2, PDO::PARAM_INT);
    $insertG->execute();
}

It takes nearly 30 seconds to complete, which is certainly not good practice. It's 10'000 rows today, but it could be 100'000 tomorrow.

If I insert multiple rows in one query with (v1,v2),(v1,v2)... I need to bind each value to a new parameter, so I believe I would need nearly 100'000 bound parameters in one query. Since it's UTF-8 and I count around 2 bytes per character (I know it can be up to 4), my query would be around 10 to 20 MB, and the MySQL server is on another machine. Given all this, I'm surprised my poorly designed request succeeded in only 30 seconds.

Is there a way to send only one line and tell the mysql server to replicate the last row 10'000 times?

EDIT PARTIAL SOLUTION

Following Bill Karwin's and Zsolt Szilagy's advice, I managed to get down to 5-6 seconds for 10'000 inserts to a remote MySQL server with the following tweaks:

$dataBase->beginTransaction();

$insertG = $dataBase->prepare('INSERT INTO G...)
...
10 * bindParam of all kinds

for ($i=0; $i < 10000; ++$i) 
{ 
    $hashKey = sha1(uniqid().$i); //$hashKey is a binded param
    $insertG->execute();
}
$dataBase->commit();

Solution

You don't need to bindParam() during every iteration of the loop. The bindParam() causes the variables $v1, $v2, etc. to be bound by reference, so all you need to do is change the values of these variables and then re-execute the query. That could cut down on the overhead.

Also you can avoid calling intval() every time through the loop. Just make sure $cloneCount is coerced to integer once, before the loop. That's a very minor improvement, but it's good practice.

$cloneCount = (int) $cloneCount;

... 9 other bindParam
$insertG->bindParam(':v1', $v1, PDO::PARAM_STR);
$insertG->bindParam(':v2', $v2, PDO::PARAM_INT);

for ($i=0; $i < $cloneCount; $i++) 
{
  $v1 = /* something */
  $v2 = /* something */
  $insertG->execute();
}

You should also avoid autocommit. Reduce the transaction overhead of MySQL per statement execution by starting an explicit transaction, inserting several thousand rows, and then committing the transaction.
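Putting the two pieces of advice together, a minimal sketch of bind-once-by-reference plus an explicit transaction looks like this. It uses an in-memory SQLite database and a hypothetical table `G(v1, v2)` so it runs without a MySQL server; with MySQL only the DSN would change:

```php
<?php
// Sketch: bind each parameter once by reference, then wrap all the
// execute() calls in a single explicit transaction instead of 10'000
// autocommits. SQLite in-memory stands in for the remote MySQL server.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE G (v1 TEXT, v2 INTEGER)');

$insertG = $db->prepare('INSERT INTO G (v1, v2) VALUES (:v1, :v2)');
// Bound by reference: later changes to $v1/$v2 are picked up by execute().
$insertG->bindParam(':v1', $v1, PDO::PARAM_STR);
$insertG->bindParam(':v2', $v2, PDO::PARAM_INT);

$db->beginTransaction();            // one commit for the whole batch
for ($i = 0; $i < 10000; ++$i) {
    $v1 = sha1(uniqid() . $i);      // new values, no re-binding needed
    $v2 = $i;
    $insertG->execute();
}
$db->commit();

echo $db->query('SELECT COUNT(*) FROM G')->fetchColumn(); // 10000
```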

But the best way to speed up bulk INSERT of thousands of similar rows to a single table is to use LOAD DATA LOCAL INFILE instead of INSERT. This runs 10-20x faster than INSERT row by row, even if you use parameters, transactions, multi-row insert, and any other trick you can think of.
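For reference, the multi-row insert trick mentioned above can be sketched as follows, chunking the rows so no single statement grows too large. The chunk size and table are assumptions (SQLite's default bound-parameter limit of 999 also constrains the chunk here; with MySQL you would size chunks against max_allowed_packet):

```php
<?php
// Sketch of the multi-row INSERT trick: send rows in chunks rather than
// one statement per row. SQLite in-memory stands in for MySQL, and the
// table G(v1, v2) is hypothetical.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE G (v1 TEXT, v2 INTEGER)');

$rows = [];
for ($i = 0; $i < 10000; ++$i) {
    $rows[] = [sha1(uniqid() . $i), $i];
}

$chunkSize = 250; // 250 rows * 2 params = 500 placeholders per statement
foreach (array_chunk($rows, $chunkSize) as $chunk) {
    // One "(?, ?)" group per row in this chunk.
    $placeholders = implode(',', array_fill(0, count($chunk), '(?, ?)'));
    $stmt = $db->prepare("INSERT INTO G (v1, v2) VALUES $placeholders");
    // Flatten [[v1, v2], ...] into a flat positional parameter list.
    $stmt->execute(array_merge(...$chunk));
}

echo $db->query('SELECT COUNT(*) FROM G')->fetchColumn(); // 10000
```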

Even if you have to use PHP to write your data into a .CSV file to disk and then use LOAD DATA LOCAL INFILE on that file, it's still much faster.
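A sketch of that CSV route, assuming a hypothetical table `G(v1, v2)`: the rows are written with fputcsv(), and the LOAD DATA LOCAL INFILE statement is shown but left commented out, since it needs a MySQL connection with local_infile enabled:

```php
<?php
// Sketch: write the rows to a temporary CSV file, then hand that file to
// MySQL with LOAD DATA LOCAL INFILE instead of issuing 10'000 INSERTs.
$csvPath = tempnam(sys_get_temp_dir(), 'bulk_') . '.csv';

$fh = fopen($csvPath, 'w');
for ($i = 0; $i < 10000; ++$i) {
    fputcsv($fh, [sha1(uniqid() . $i), $i]); // one "v1,v2" row per line
}
fclose($fh);

// With a MySQL connection (hypothetical DSN; not runnable here):
// $db = new PDO('mysql:host=dbhost;dbname=test', $user, $pass,
//               [PDO::MYSQL_ATTR_LOCAL_INFILE => true]);
$sql = sprintf(
    "LOAD DATA LOCAL INFILE %s INTO TABLE G
     FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
     LINES TERMINATED BY '\\n' (v1, v2)",
    var_export($csvPath, true) // naive quoting; fine for a path we control
);
// $db->exec($sql);

echo count(file($csvPath)); // 10000 lines written
```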

See also Speed of INSERT Statements in the MySQL manual for more tips.
