Recommendation for mysqli batch queries


Problem description

My use case:

I have multiple scripts inserting into a table, on the order of several inserts per second. I am seeing performance degradation, so I think there would be a performance benefit in "batching queries" and inserting several hundred rows every minute or so.

Question:

How would I go about doing this using mysqli? My current code uses a wrapper (pastebin), and looks like:

$array = array(); // BIG ARRAY OF VALUES (more than 100k rows' worth)
foreach ($array as $key => $value) {
    $db->q('INSERT INTO `player_items_attributes` (`column1`, `column2`, `column3`) VALUES (?, ?, ?)',
        'iii', $value['test1'], $value['test2'], $value['test3']);
}

Notes:

I looked at using transactions, but it sounds like those would still hit the server, instead of queueing. I would prefer to use a wrapper (feel free to suggest one with functionality similar to what my current one offers), but if that is not possible I will try to build the suggestions into the wrapper I use.
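For context on that note: with mysqli transactions, each INSERT still makes its own round trip to the server; only the commit is deferred. A minimal sketch, assuming an open mysqli connection in $mysqli and the same $array of rows:

$mysqli->begin_transaction();
$stmt = $mysqli->prepare('INSERT INTO `player_items_attributes` (`column1`, `column2`, `column3`) VALUES (?, ?, ?)');
foreach ($array as $value) {
    $stmt->bind_param('iii', $value['test1'], $value['test2'], $value['test3']);
    $stmt->execute();   // still one server round trip per row
}
$mysqli->commit();      // a single commit at the end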

Source:

The wrapper is from here.

I am trying to optimize table speed, rather than script speed. This table has more than 35 million rows, and has a few indexes.

Recommended answer

The MySQL INSERT syntax allows one INSERT query to insert multiple rows, like this:

INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);

where each parenthesised set of values represents another row in your table. So, by working down the array, you can build multiple rows into one query.

There is one major limitation: the total size of the query must not exceed the configured limit (MySQL's max_allowed_packet). For 100k rows you would probably have to break this down into blocks of, say, 250 rows, reducing your 100k queries to 400. You might be able to go further.
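You can check that limit on your server with:

SHOW VARIABLES LIKE 'max_allowed_packet';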

I'm not going to attempt to code this - you'd have to code something and try it in your environment.

Here's a pseudocode version:

escape entire array                    // array_walk(), real_escape_string()

block_size = 250                       // number of rows to insert per query
rows_array = []

for each element in array {

    create parenthesised value set and push onto rows_array   // implode()

    if (count of rows_array == block_size) {
        implode rows_array and append to INSERT query
        execute query
        reset rows_array
        reset query
    }
}

if (rows_array is not empty) {         // left-over rows
    implode rows_array and append to INSERT query
    execute the query for the last block
}
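One possible PHP reading of that pseudocode, as a minimal sketch: it assumes an open mysqli connection in $mysqli, the $array from the question, and integer values (matching the 'iii' binding in the original code). Treat it as a starting point to test in your environment, not a drop-in solution.

$blockSize = 250;  // rows per INSERT; tune against max_allowed_packet
$rows = [];

$flush = function (array $rows) use ($mysqli) {
    $sql = 'INSERT INTO `player_items_attributes` (`column1`, `column2`, `column3`) VALUES '
         . implode(',', $rows);
    if (!$mysqli->query($sql)) {
        throw new RuntimeException($mysqli->error);
    }
};

foreach ($array as $value) {
    // Integers can be cast directly; string columns would need real_escape_string() and quoting.
    $rows[] = sprintf('(%d,%d,%d)', (int) $value['test1'], (int) $value['test2'], (int) $value['test3']);

    if (count($rows) === $blockSize) {
        $flush($rows);
        $rows = [];
    }
}

if ($rows) {  // execute the last, partial block
    $flush($rows);
}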

I can already think of a potentially faster implementation using array_map() - try it.
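One way to read that hint, again as a sketch under the same assumptions: format every row up front with array_map(), then insert block by block with array_chunk().

$formatted = array_map(function ($v) {
    return sprintf('(%d,%d,%d)', (int) $v['test1'], (int) $v['test2'], (int) $v['test3']);
}, $array);

foreach (array_chunk($formatted, 250) as $chunk) {
    $mysqli->query(
        'INSERT INTO `player_items_attributes` (`column1`, `column2`, `column3`) VALUES '
        . implode(',', $chunk)
    );
}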
