Memory leak in PHP when fetching large dataset from MySQL


Question

When I execute the following code for a user table of about 60,000 records:

mysql_connect("localhost", "root", "");
mysql_select_db("test");

$result = mysql_query("select * from users");

// Print the script's memory footprint after each row is fetched
while ($row = mysql_fetch_object($result)) {
  echo(convert(memory_get_usage(true))."\n");
}

// Format a byte count as a human-readable size
function convert($size) {
  $unit = array('b','kb','mb','gb','tb','pb');
  return @round($size/pow(1024,($i=floor(log($size,1024)))),2).' '.$unit[$i];
}

I get the following error:

PHP Fatal error:  Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes)

Any thoughts on how to avoid having the script take up additional memory with each pass through the loop? In my actual code I'm trying to provide a CSV download for a large dataset, with a little PHP pre-processing.

Please don't recommend increasing PHP's memory limit--it's a bad idea and, more importantly, will still create an upward bound on how large a dataset can be processed with this technique.

Answer

mysql_query buffers the entire result set into PHP memory. This is convenient and generally very fast, but you're experiencing a drawback to it.

mysql_unbuffered_query() exists. It doesn't grab the entire result set all at once; it pulls small pieces from the server as you fetch rows from the result set.
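
For illustration, here is a minimal sketch of how the question's loop might look with an unbuffered query, streaming the rows out as the CSV download the asker describes. The table, column handling, and filename are assumptions carried over from the question, and the mysql_* API shown matches the question's code but has since been removed from PHP (mysqli and PDO offer equivalent unbuffered modes):

mysql_connect("localhost", "root", "");
mysql_select_db("test");

// Unbuffered: rows are fetched from the server on demand,
// so the full 60,000-row result set never sits in PHP memory.
$result = mysql_unbuffered_query("select * from users");

// Stream CSV straight to the client instead of building it in memory.
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="users.csv"');
$out = fopen('php://output', 'w');

while ($row = mysql_fetch_assoc($result)) {
    // Any per-row pre-processing goes here before the row is written.
    fputcsv($out, $row);
}

fclose($out);

One documented trade-off: until every row has been read (or the result is freed with mysql_free_result()), no other query can be issued on the same connection, and functions like mysql_num_rows() won't work on the unbuffered result.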
