Bad performance function in PHP. With large files memory blows up! How can I refactor?


Problem description


I have a function that strips lines out of files. I'm handling large files (more than 100 MB). My PHP memory limit is 256 MB, but the function that strips out the lines blows up with a 100 MB CSV file.

What the function must do is this:

Originally my CSV looks like this:

Copyright (c) 2007 MaxMind LLC. All Rights Reserved.
locId,country,region,city,postalCode,latitude,longitude,metroCode,areaCode
1,"O1","","","",0.0000,0.0000,,
2,"AP","","","",35.0000,105.0000,,
3,"EU","","","",47.0000,8.0000,,
4,"AD","","","",42.5000,1.5000,,
5,"AE","","","",24.0000,54.0000,,
6,"AF","","","",33.0000,65.0000,,
7,"AG","","","",17.0500,-61.8000,,
8,"AI","","","",18.2500,-63.1667,,
9,"AL","","","",41.0000,20.0000,,

When I pass the CSV file to this function, I get:

locId,country,region,city,postalCode,latitude,longitude,metroCode,areaCode
1,"O1","","","",0.0000,0.0000,,
2,"AP","","","",35.0000,105.0000,,
3,"EU","","","",47.0000,8.0000,,
4,"AD","","","",42.5000,1.5000,,
5,"AE","","","",24.0000,54.0000,,
6,"AF","","","",33.0000,65.0000,,
7,"AG","","","",17.0500,-61.8000,,
8,"AI","","","",18.2500,-63.1667,,
9,"AL","","","",41.0000,20.0000,,

It only strips out the first line, nothing more. The problem is the performance of this function with large files: it blows up the memory.
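For reference, a minimal sketch (not from the original post; the file path is a placeholder) that compares the configured memory_limit with the peak memory file() needs on a big CSV:

<?php
// Minimal sketch: show how much memory file() needs for a large CSV.
// 'large.csv' is a placeholder path.
echo "memory_limit: " . ini_get('memory_limit') . "\n";

$data = file('large.csv');   // loads every line of the file into an array
echo "lines loaded: " . count($data) . "\n";
echo "peak memory: " . round(memory_get_peak_usage(true) / 1048576) . " MB\n";
// Each line becomes a PHP string with its own overhead, so the peak is
// typically several times the size of the file on disk.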

The function is:

 public function deleteLine($line_no, $csvFileName) {

  // this function strips a specific line from a file
  // if a line is stripped, the function returns TRUE, else FALSE
  //
  // e.g.
  // deleteLine(-1, xyz.csv); // strip last line
  // deleteLine(1, xyz.csv); // strip first line

  // Assign the file name
  $filename = $csvFileName;

  $strip_return=FALSE;

  $data=file($filename);
  $pipe=fopen($filename,'w');
  $size=count($data);

  if($line_no==-1) $skip=$size-1;
  else $skip=$line_no-1;

  for($line=0;$line<$size;$line++)
   if($line!=$skip)
    fputs($pipe,$data[$line]);
   else
    $strip_return=TRUE;

  return $strip_return;
 }

Is it possible to refactor this function so that it does not blow up the 256 MB PHP memory limit?

Give me some clues.

Best Regards,

Solution

The cause of your blowout is the file() function, which brings the entire file into memory. To overcome this you need to read the file line by line, write all but the line to be deleted to a temporary file, and finally rename the temporary file over the original.

public function deleteLine($line_no, $csvFileName) {

        // get a temp file name in the current working directory..you can use
        // any other directory, say /tmp
        $tmpFileName = tempnam(".", "csv");

        $strip_return=FALSE;

        // open input file for reading.
        $readFD=fopen($csvFileName,'r');

        // temp file for writing.
        $writeFD=fopen($tmpFileName,'w');

        // check for fopen errors.
        if ($readFD === FALSE || $writeFD === FALSE) {
                return FALSE;
        }

        if($line_no==-1) {
                // deleting the last line: count the lines in a first pass,
                // then rewind, so the file is still never held in memory
                $size=0;
                while (fgets($readFD) !== false) {
                        $size++;
                }
                rewind($readFD);
                $skip=$size-1;
        } else {
                $skip=$line_no-1;
        }

        $line = 0;

        // read lines from input file one by one.
        // write all lines except the line to be deleted.
        while (($buffer = fgets($readFD)) !== false) {
                if($line!=$skip)
                        fputs($writeFD,$buffer);
                else
                        $strip_return=TRUE;
                $line++;
        }

        fclose($readFD);
        fclose($writeFD);

        // rename temp file to input file.
        rename($tmpFileName,$csvFileName);

        return $strip_return;
}
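A minimal usage sketch (the wrapper class name and the CSV file name are assumptions for illustration, not from the original answer):

// Hypothetical wrapper class holding the deleteLine() method above.
$csv = new CsvTool();

// Strip the copyright header (line 1) from a large CSV in place; the file
// is streamed through a temporary file instead of being loaded into memory.
if ($csv->deleteLine(1, 'GeoLiteCity-Location.csv')) {
    echo "First line removed.\n";
} else {
    echo "No line was removed.\n";
}

Because tempnam(".", "csv") creates the temporary file in the current working directory, it normally sits on the same filesystem as the target, so the final rename() is a cheap metadata operation; if the temporary file lived on a different filesystem, PHP would have to copy the data instead.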
