Serial json_decode due to memory limit
Question
I have a large JSON file (7.3MB) that I try to json_decode, and it fails due to the memory limit (Fatal error: Allowed memory size of 134217728 bytes exhausted). Is there a way to decode the JSON file serially, one object/node at a time?
Answer
I suppose in theory you could write some logic to parse characters off the beginning and end of the string, iteratively shrinking the string in memory while building up the object/array representation, but that would be a serious pain.
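A more practical variant of that "one object at a time" idea, assuming you can convert the data to newline-delimited JSON (one top-level object per line), is a sketch like the following. The function name and file layout are illustrative, not part of the original answer:

```php
<?php
// Decode a newline-delimited JSON file one object at a time, so peak
// memory is bounded by the largest line rather than the whole file.
function decode_ndjson(string $path): \Generator
{
    $handle = fopen($path, 'r');
    if ($handle === false) {
        throw new RuntimeException("Cannot open $path");
    }
    try {
        while (($line = fgets($handle)) !== false) {
            $line = trim($line);
            if ($line === '') {
                continue; // skip blank lines
            }
            $obj = json_decode($line, true);
            if ($obj === null && json_last_error() !== JSON_ERROR_NONE) {
                throw new RuntimeException('Malformed line: ' . json_last_error_msg());
            }
            yield $obj; // hand back one decoded object at a time
        }
    } finally {
        fclose($handle);
    }
}
```

Because it yields a generator, iterating with `foreach (decode_ndjson('large.ndjson') as $obj)` decodes lazily; only one object's worth of decoded data is held at a time.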
Why not just increase your memory limit? Or, if this is JSON that needs to be decoded frequently (i.e. with each request to a web application), you ought to consider breaking it apart into smaller, more usable components.
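Raising the limit can be done per script or installation-wide; the `256M` value below is just an example, sized to comfortably exceed the file plus its decoded representation:

```php
<?php
// Raise the memory limit for this script only, at runtime.
ini_set('memory_limit', '256M');

// Or raise it for every script by editing php.ini instead:
//   memory_limit = 256M

// With the higher limit, the ordinary whole-file decode works again,
// e.g.: $data = json_decode(file_get_contents('large.json'), true);
```

Note that the decoded PHP array/object typically takes several times the size of the raw JSON text, which is why a 7.3MB file can exhaust a 128MB limit when combined with the rest of the application's memory use.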