Is there a limit on PHP file_get_contents?
Problem Description
I am trying to read a large file (10M) using PHP file_get_contents:
$file = 'http://www.remoteserver.com/test.txt';
$data = file_get_contents( $file );
var_dump ( $data );
It returned:
string(32720)
and then the output showed only part of the file. Is there a limit somewhere on file_get_contents? I tried doing ini_set('memory_limit', '512M'), but that did not work.
** forgot to mention ** it's a remote file.
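One way to check whether the transfer itself is being cut short is to compare the byte count actually received against the server's Content-Length header. A minimal diagnostic sketch, assuming allow_url_fopen is enabled and reusing the example URL from above:

$file = 'http://www.remoteserver.com/test.txt';
$data = file_get_contents($file);
// get_headers() performs a request and returns the response headers
$headers = get_headers($file, true);
echo "bytes read: " . strlen($data) . "\n";
echo "Content-Length: " . (isset($headers['Content-Length']) ? $headers['Content-Length'] : 'unknown') . "\n";

If the two numbers disagree, the data is being truncated in transit or on receipt rather than by var_dump's display.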
PROBLEM RESOLVED:: Out of HDD space. Fixed that and now everything works.
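For reference, free disk space can also be checked from PHP itself; a quick sketch using disk_free_space() (the temp directory here is just an illustrative choice):

// disk_free_space() returns the number of free bytes on the
// filesystem that contains the given directory
$free = disk_free_space(sys_get_temp_dir());
echo "free bytes: " . $free . "\n";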
Recommended Answer
Assuming the contents of the file you want to load are logically separated by line breaks (e.g. not a binary file), then you might be better off reading it line by line:
$fp = fopen($path_to_file, "r");
$fileLines = array();
while (!feof($fp)) {
    // fgets() returns one line per call, including the trailing newline
    array_push($fileLines, fgets($fp));
}
fclose($fp);
You could always implode() (with your choice of line break character) the array back into a single string if you really need the file in one "chunk".
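For example, using the $fileLines array built above (note that fgets() keeps each line's trailing newline, so an empty glue string reproduces the original contents exactly):

// join the lines back into a single string
$fileContents = implode("", $fileLines);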