Batching php's fgetcsv


Question


I have a fairly large csv file (at least for the web) that I don't have control of. It has about 100k rows in it, and will only grow larger.


I'm using the Drupal Module Feeds to create nodes based on this data, and their parser batches the parsing in groups of 50 lines. However, their parser doesn't handle quotation marks properly, and fails to parse about 60% of the csv file. fgetcsv works but doesn't batch things as far as I can tell.


While trying to read the entire file with fgetcsv, PHP eventually runs out of memory. Therefore I would like to be able to break things up into smaller chunks. Is this possible?

Answer


fgetcsv() works by reading one line at a time from a given file pointer. If PHP is running out of memory, perhaps you are trying to parse the whole file at once, putting it all into a giant array. The solution would be to process it line by line without storing it in a big array.
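As a minimal sketch of that line-by-line approach (the file name and the per-row handling are placeholders, not part of the original answer), only one row is ever held in memory at a time:

```php
<?php
// Minimal sketch: process a CSV one row at a time without buffering the whole file.
// 'data.csv' and the per-row handling are illustrative.
$handle = fopen('data.csv', 'r');
if ($handle === false) {
    die('Unable to open file');
}

while (($row = fgetcsv($handle)) !== false) {
    // $row is an array of fields for the current line; process it here
    // (e.g. create a node), then let it go out of scope.
}

fclose($handle);
```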


To answer the batching question more directly, read n lines from the file, then use ftell() to find the location in the file where you ended. Make a note of this point, and then you can return to it at some point in the future by calling fseek() before fgetcsv().
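A hedged sketch of that batching idea follows. The helper name readCsvBatch, the $offset/$batchSize parameters, and the returned array shape are illustrative assumptions, not part of Feeds or the original answer; the point is simply to pair fgetcsv() with ftell()/fseek() so each run resumes where the last one stopped:

```php
<?php
// Sketch: read up to $batchSize CSV rows starting at byte offset $offset,
// and return the rows together with the offset to resume from next time.
function readCsvBatch(string $path, int $offset, int $batchSize = 50): array
{
    $handle = fopen($path, 'r');
    if ($handle === false) {
        return ['rows' => [], 'offset' => $offset, 'done' => true];
    }

    fseek($handle, $offset);          // jump back to where the previous batch ended

    $rows = [];
    while (count($rows) < $batchSize && ($row = fgetcsv($handle)) !== false) {
        $rows[] = $row;
    }

    $newOffset = ftell($handle);      // remember where this batch ended
    $done = feof($handle);
    fclose($handle);

    return ['rows' => $rows, 'offset' => $newOffset, 'done' => $done];
}

// Usage sketch: persist the offset between runs (e.g. in your batch/queue state)
// so each pass picks up where the previous one stopped.
$state = ['offset' => 0, 'done' => false];
while (!$state['done']) {
    $state = readCsvBatch('data.csv', $state['offset']);
    foreach ($state['rows'] as $row) {
        // create a node / process the row here
    }
}
```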

