How to optimize the number of `execute` calls when inserting data with PDO?

Question
There is a huge two-dimensional array containing 500k one-dimensional sub-arrays, each with 5 elements. My job is to insert all of this data into an SQLite database.
function insert_data($array) {
    global $db;
    $dbh = new PDO("sqlite:{$db}");

    // Prepare once, execute once per row.
    $sql = "INSERT INTO quote (f1,f2,f3,f4,f5) VALUES (?,?,?,?,?)";
    $query = $dbh->prepare($sql);
    foreach ($array as $item) {
        $query->execute(array_values($item));
    }
    $dbh = null;
}
I want to optimize the insert process: `execute` is currently called 500k times. How can I reduce this to a single (or far fewer) operation(s)?
Answer

The idea is to avoid running a separate transaction for each insert, because that is very slow indeed. So just start and commit a transaction, say, for every 10k records.
$dbh->beginTransaction();
$counter = 0;
foreach ($array as $item) {
    $query->execute(array_values($item));
    // Pre-increment so we don't commit an empty transaction on the first row.
    if (++$counter % 10000 == 0) {
        $dbh->commit();
        $dbh->beginTransaction();
    }
}
$dbh->commit();
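Putting the question's prepared statement and the batching loop together, here is a minimal self-contained sketch. The `quote` table and columns `f1`..`f5` come from the question; the in-memory database, the helper name `insert_data_batched`, and the 10k default batch size are assumptions for illustration.

```php
<?php
// Batched-transaction insert: commit every $batchSize rows instead of
// letting SQLite wrap every single INSERT in its own transaction.
function insert_data_batched(PDO $dbh, array $rows, int $batchSize = 10000): void
{
    $stmt = $dbh->prepare(
        "INSERT INTO quote (f1, f2, f3, f4, f5) VALUES (?, ?, ?, ?, ?)"
    );

    $dbh->beginTransaction();
    $counter = 0;
    foreach ($rows as $row) {
        $stmt->execute(array_values($row));
        // Close the current batch and immediately open the next one.
        if (++$counter % $batchSize === 0) {
            $dbh->commit();
            $dbh->beginTransaction();
        }
    }
    // Commit the final, possibly partial batch.
    $dbh->commit();
}

// Usage: an in-memory SQLite database keeps the example self-contained.
$dbh = new PDO('sqlite::memory:');
$dbh->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$dbh->exec("CREATE TABLE quote (f1 TEXT, f2 TEXT, f3 TEXT, f4 TEXT, f5 TEXT)");

$rows = [];
for ($i = 0; $i < 25000; $i++) {
    $rows[] = ["a$i", "b$i", "c$i", "d$i", "e$i"];
}
insert_data_batched($dbh, $rows);

echo $dbh->query("SELECT COUNT(*) FROM quote")->fetchColumn(), "\n"; // prints 25000
```

With 25k rows and a 10k batch size, the loop commits twice mid-way and the final `commit()` flushes the remaining 5k rows.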
Another solution: you can dump the array to a CSV file and then just import it.
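A minimal sketch of that CSV route, assuming `fputcsv()` for the export and the `sqlite3` command-line tool for the bulk load. The file name `quote.csv` and the sample rows are made up for illustration; whether the `sqlite3` CLI is installed is an assumption, so the import step is shown only as a comment.

```php
<?php
// Step 1: write the rows out as CSV, one record per line.
$rows = [
    ['a1', 'b1', 'c1', 'd1', 'e1'],
    ['a2', 'b2', 'c2', 'd2', 'e2'],
];

$csvFile = 'quote.csv';
$fh = fopen($csvFile, 'w');
foreach ($rows as $row) {
    fputcsv($fh, $row); // quotes fields only when needed
}
fclose($fh);

// Step 2 (from a shell, requires the sqlite3 CLI):
//   sqlite3 mydb.sqlite ".mode csv" ".import quote.csv quote"

echo file_get_contents($csvFile);
```

The CLI `.import` loads the whole file in one pass, which sidesteps per-row `execute` calls entirely.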