PHP | json_decode huge json file

Question

I am trying to decode a large JSON file, about 222 MB.

I understand I cannot use json_decode directly on the output of file_get_contents(), i.e. read the whole file and decode the whole string at once, because it consumes a lot of memory and returns nothing (which is what it has been doing so far).
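For reference, the failing naive approach looks roughly like this (the file name is just an illustration):

// Naive approach: reads the entire 222 MB file into memory and then builds the
// whole decoded structure in memory as well, which exhausts the memory limit.
$data = json_decode(file_get_contents(__DIR__ . '/huge.json'), true);
// $data ends up as NULL when decoding fails, or the script hits the memory limit.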

So I went to try out libraries. The one I tried most recently is JSONParser. What it does is read the objects in the JSON array one by one.

But due to the lack of documentation there, I want to ask here whether anyone has worked with this library.

This is the example test code from its GitHub page:

// initialise the parser object
$parser = new JSONParser();

// sets the callbacks
$parser->setArrayHandlers('arrayStart', 'arrayEnd');
$parser->setObjectHandlers('objStart', 'objEnd');
$parser->setPropertyHandler('property');
$parser->setScalarHandler('scalar');
/*
echo "Parsing top level object document...\n";
// parse the document
$parser->parseDocument(__DIR__ . '/data.json');*/

$parser->initialise();

//echo "Parsing top level array document...\n";
// parse the top level array

$parser->parseDocument(__DIR__ . '/array.json');
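The callback names in that snippet ('arrayStart', 'objStart', and so on) refer to plain functions you have to define yourself. The exact arguments JSONParser passes to them are not shown in the example, so the stubs below are only an assumption and simply accept whatever they receive:

// Hypothetical callback stubs -- the signatures are assumptions, so each one
// just accepts whatever arguments the parser passes to it.
function arrayStart(...$args) { /* a JSON array was opened */ }
function arrayEnd(...$args)   { /* a JSON array was closed */ }
function objStart(...$args)   { /* a JSON object was opened */ }
function objEnd(...$args)     { /* a complete JSON object has been parsed */ }
function property(...$args)   { /* a property name was encountered */ }
function scalar(...$args)     { /* a scalar value was encountered */ }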

How can I use a loop and save each object into a PHP variable that can easily be decoded into a PHP array for further use?

This would take some time, since it would process all the objects of the JSON array one by one, but the question stands: how do I loop over it using this library, or is there no such option?

Or are there any other better options or libraries for this sort of job?

Answer

Another alternative is to use halaxa/json-machine.

Usage when iterating over JSON is the same as with json_decode, but it will not hit the memory limit no matter how big your file is. There is nothing to implement; just write your foreach.

Example:

$users = \JsonMachine\JsonMachine::fromFile('500MB-users.json');

foreach ($users as $id => $user) {
    // process $user as usual
}
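If the items you want to iterate are nested under a key rather than sitting at the top level, the library's readme (for the version used above) also describes passing a JSON pointer as a second argument. A sketch, assuming a hypothetical file where the users live under a "results" key:

// Iterate only the items under the "results" key; the JSON pointer argument is
// an assumption based on the library's readme -- adjust it to your structure.
$users = \JsonMachine\JsonMachine::fromFile('500MB-users.json', '/results');

foreach ($users as $id => $user) {
    // each $user is decoded one at a time, so memory usage stays low
}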

See the GitHub readme for more details.
