How to read a big file in PHP without hitting the memory limit

Problem description

I'm trying to read a file line by line. The problem is that the file is too big (over 500,000 lines) and I hit the memory limit. I wonder how to read the file without exceeding the memory limit.

I'm thinking about a multi-threaded solution (for example, splitting the file into smaller groups of 100,000 lines each and reading them in multiple threads), but I don't know how to do this in detail. Please help me. (Sorry for the bad English.)

Here is my code:

$fn = fopen("myfile.txt", "r");

while (!feof($fn)) {
    $result = fgets($fn); // reads a single line per iteration
    echo $result;
}

fclose($fn);

Recommended answer

You could use a generator to handle the memory usage. This is just an example written by a user on the documentation page:

function getLines($file)
{
    $f = fopen($file, 'r');

    if ($f === false) {
        throw new RuntimeException("Unable to open $file");
    }

    try {
        // Compare against false explicitly so a line containing only "0"
        // (which is falsy in PHP) does not end the loop early.
        while (($line = fgets($f)) !== false) {
            yield $line;
        }
    } finally {
        fclose($f);
    }
}

foreach (getLines("file.txt") as $n => $line) {
    // insert the line into db or do whatever you want with it.
}
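
For the database case mentioned in the comment above, the generator can be combined with a prepared statement so that rows are inserted as the file streams through. This is only a hedged sketch: the PDO DSN, credentials, the file_lines table, and the 10,000-row batch size are all hypothetical placeholders, not anything from the original answer.

$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$stmt = $pdo->prepare('INSERT INTO file_lines (content) VALUES (?)');

$pdo->beginTransaction();
foreach (getLines("myfile.txt") as $n => $line) {
    $stmt->execute([rtrim($line, "\r\n")]);
    // Commit periodically so the open transaction stays bounded.
    if (($n + 1) % 10000 === 0) {
        $pdo->commit();
        $pdo->beginTransaction();
    }
}
$pdo->commit();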

A generator allows you to write code that uses foreach to iterate over a set of data without needing to build an array in memory, which may cause you to exceed a memory limit, or require a considerable amount of processing time to generate. Instead, you can write a generator function, which is the same as a normal function, except that instead of returning once, a generator can yield as many times as it needs to in order to provide the values to be iterated over.
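
To see this in practice, here is a minimal sketch (reusing getLines() from above; the file name is a placeholder) that reports how much the streaming loop adds to peak memory, which should stay small regardless of file size:

$before = memory_get_peak_usage();

foreach (getLines("myfile.txt") as $line) {
    // Only the current line is held in memory at any point.
}

echo 'Peak memory added: ' . (memory_get_peak_usage() - $before) . " bytes\n";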
