CodeIgniter - How to SELECT all rows in a big table without memory leaks
Problem description
It's kind of hard to convey my need in the title.
CodeIgniter is performing a SELECT query on a table of 800,000+ rows in one shot.
It takes a lot of memory, but on one specific server I get an "Out of memory" fatal error.
For performance purposes, I would like to split the SELECT into two SELECTs; more specifically, the first 50% of the rows, and then the remaining 50%.
I reuse this set of data to perform an INSERT afterwards.
How can I do that without losing/forgetting a single row?
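One way to split the work as described is LIMIT/OFFSET paging. The sketch below is a hypothetical example using the CodeIgniter 3 query builder; the table name `your_table`, the unique column `id`, and the chunk size are assumptions. The key point for not losing or duplicating rows is a stable `ORDER BY` on a unique column, since without it the database is free to return rows in a different order on each query.

```php
<?php
// Hypothetical sketch: process a large table in fixed-size chunks.
$chunk_size = 10000;
$offset     = 0;

do {
    $query = $this->db
        ->order_by('id', 'ASC')          // unique column => stable paging
        ->limit($chunk_size, $offset)
        ->get('your_table');
    $rows = $query->result_array();

    foreach ($rows as $row) {
        // process $row here
    }

    $offset += $chunk_size;
} while (count($rows) === $chunk_size);  // a short chunk means we reached the end
```

Note that OFFSET paging gets slower as the offset grows, because the database still has to scan past the skipped rows; keyset pagination (`WHERE id > $last_id`) avoids that if the table has a sequential key.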
Recommended answer
Aside from the fact that operations like this are highly tied to performance issues, you can use unbuffered_row().
Basically, if you have a job with data that large, you should use unbuffered_row(), which is provided by and integrated into the built-in query builder.
It is very well documented in the result rows section of the CodeIgniter documentation.
For example:
$query = $this->db->select('*')->from('your_table')->get();

// Fetch one row at a time instead of buffering the whole result set in memory
while ($row = $query->unbuffered_row())
{
    // do your job
}
This will avoid your memory problem.
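Since the question mentions reusing the rows for an INSERT, unbuffered_row() can be combined with CodeIgniter's insert_batch() so that only a small batch of rows is held in memory at any time. This is a hypothetical sketch: the table names `source_table` and `target_table` and the batch size of 500 are assumptions.

```php
<?php
// Hypothetical sketch: stream rows one at a time and insert them in batches,
// keeping memory usage bounded regardless of the table size.
$query = $this->db->select('*')->from('source_table')->get();
$batch = array();

while ($row = $query->unbuffered_row('array')) {
    $batch[] = $row;

    if (count($batch) >= 500) {
        $this->db->insert_batch('target_table', $batch);
        $batch = array();   // release the inserted rows
    }
}

if (! empty($batch)) {
    $this->db->insert_batch('target_table', $batch);  // flush the remainder
}
```

Batching the INSERTs also reduces round-trips to the database compared to inserting row by row.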