Writing a very large CSV file from DB output in PHP
Question
I have a DB of sensor data that is being collected every second. The client would like to be able to download 12-hour chunks in CSV format - this is all done.
The output is sadly not straight data and needs to be processed before the CSV can be created (parts are stored as JSON in the DB) - so I can't just dump the table.
So, to reduce load, I figured that the first time the file is downloaded, I would cache it to disk, and any further requests would just download that file.

If I don't try to write it (using file_put_contents with FILE_APPEND) and just echo every line, it is fine; but when writing it, the script runs out of memory even if I give it 512M.
So this works:
while($stmt->fetch()){
//processing code
$content = //CSV formatting
echo $content;
}
This does not:
while($stmt->fetch()){
//processing code
$content = //CSV formatting
file_put_contents($pathToFile, $content, FILE_APPEND);
}
It seems like, even though I am calling file_put_contents on every line, it is storing it all in memory.
Any suggestions?
Answer
The problem is that file_put_contents is trying to dump the entire thing at once. Instead, you should loop through your formatting and use fopen, fwrite, and fclose.
$file = fopen($pathToFile, 'a'); // open once, before the loop
while($stmt->fetch()){
    //processing code
    $content = //CSV formatting
    fwrite($file, $content);
}
fclose($file); // close once, after the loop
This will limit the amount of data being held in memory at any given time.
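As an aside, once each processed row is available as an array, PHP's built-in fputcsv() can take care of the quoting and escaping instead of hand-building each line. A minimal sketch; the path and the sample rows are illustrative, and in the real script the rows would come from $stmt->fetch() after decoding the JSON columns:

```php
<?php
// Illustrative path and rows; not from the original question.
$pathToFile = '/tmp/sensor_export.csv';
$rows = [
    ['timestamp', 'sensor_id', 'value'],
    ['2013-01-01 00:00:00', 'A1', '23.5'],
];

$file = fopen($pathToFile, 'w'); // open once, before the loop
foreach ($rows as $row) {
    fputcsv($file, $row);        // quoting/escaping handled by PHP
}
fclose($file);                   // close once, after the loop
```

Fields containing commas, quotes, or newlines are escaped automatically, which is easy to get wrong when concatenating strings by hand.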