Have I reached the limits of the size of objects JavaScript in my browser can handle?



I'm embedding a large array in <script> tags in my HTML, like this (nothing surprising):

<script>
    var largeArray = [/* lots of stuff in here */];
</script>

In this particular example, the array has 210,000 elements. That's well below the theoretical maximum of 2^31 - by 4 orders of magnitude. Here's the fun part: if I save the JS source for the array to a file, that file is >44 megabytes (46,573,399 bytes, to be exact).

If you want to see for yourself, you can download it from GitHub. (All the data in there is canned, so much of it is repeated. This will not be the case in production.)

Now, I'm really not concerned about serving that much data. My server gzips its responses, so it really doesn't take all that long to get the data over the wire. However, there is a really nasty tendency for the page, once loaded, to crash the browser. I'm not testing at all in IE (this is an internal tool). My primary targets are Chrome 8 and Firefox 3.6.

In Firefox, I can see a reasonably useful error in the console:

Error: script stack space quota is exhausted

In Chrome, I simply get the sad-tab page.

Cut to the chase, already

  • Is this really too much data for our modern, "high-performance" browsers to handle?
  • Is there anything I can do* to gracefully handle this much data?

Incidentally, I was able to get this to work (read: not crash the tab) on-and-off in Chrome. I really thought that Chrome, at least, was made of tougher stuff, but apparently I was wrong...


Edit 1

@Crayon: I wasn't looking to justify why I'd like to dump this much data into the browser at once. Short version: either I solve this one (admittedly not-that-easy) problem, or I have to solve a whole slew of other problems. I'm opting for the simpler approach for now.

@various: right now, I'm not especially looking for ways to actually reduce the number of elements in the array. I know I could implement Ajax paging or what-have-you, but that introduces its own set of problems for me in other regards.

@Phrogz: each element looks something like this:

{dateTime:new Date(1296176400000),
 terminalId:'terminal999',
 'General___BuildVersion':'10.05a_V110119_Beta',
 'SSM___ExtId':26680,
 'MD_CDMA_NETLOADER_NO_BCAST___Valid':'false',
 'MD_CDMA_NETLOADER_NO_BCAST___PngAttempt':0}
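As a rough sanity check on the figures quoted above, the source size per element works out to a bit over 200 bytes, which is consistent with object literals of this shape:

```javascript
// Back-of-the-envelope: bytes of JS source text per array element.
var totalBytes = 46573399;   // size of the saved source file
var elementCount = 210000;   // array length
var bytesPerElement = totalBytes / elementCount; // roughly 222 bytes each
```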

@Will: but I have a computer with a 4-core processor, 6 gigabytes of RAM, over half a terabyte of disk space ...and I'm not even asking for the browser to do this quickly - I'm just asking for it to work at all!


Edit 2

Mission accomplished!

With the spot-on suggestions from Juan as well as Guffa, I was able to get this to work! It would appear that the problem was just in parsing the source code, not actually working with it in memory.

To summarize the comment quagmire on Juan's answer: I had to split up my big array into a series of smaller ones, and then Array#concat() them, but that wasn't enough. I also had to put them into separate var statements. Like this:

var arr0 = [...];
var arr1 = [...];
var arr2 = [...];
/* ... */
var bigArray = arr0.concat(arr1, arr2, ...);
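This fix lends itself to automation on the server side. Below is a minimal Node.js sketch (the helper name `emitChunkedSource` is hypothetical, and it assumes the data is JSON-serializable and non-empty; the `Date` values in the real payload would need a custom serializer):

```javascript
// Serialize a big array into several `var` statements, one chunk per
// statement, then a final line that concats them back together.
// Hypothetical helper; assumes JSON-serializable, non-empty data.
function emitChunkedSource(data, chunkSize) {
  var parts = [];
  var names = [];
  for (var i = 0; i * chunkSize < data.length; i++) {
    var name = 'arr' + i;
    names.push(name);
    var chunk = data.slice(i * chunkSize, (i + 1) * chunkSize);
    // One separate `var` statement per chunk -- this is what mattered.
    parts.push('var ' + name + ' = ' + JSON.stringify(chunk) + ';');
  }
  parts.push('var bigArray = ' + names[0] + '.concat(' +
             names.slice(1).join(', ') + ');');
  return parts.join('\n');
}
```

Emitting the generated string into the page's `<script>` tag reproduces the pattern above without hand-splitting the data.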

To everyone who contributed to solving this: thank you. The first round is on me!


*other than the obvious: sending less data to the browser

Solution

Here's what I would try: you said it's a 44MB file. That surely takes more than 44MB of memory, I'm guessing this takes much over 44MB of RAM, maybe half a gig. Could you just cut down the data until the browser doesn't crash and see how much memory the browser uses?

Even apps that run only on the server would be well served to not read a 44MB file and keep it in memory. Having said all that, I believe the browser should be able to handle it, so let me run some tests.

(Using Windows 7, 4GB of memory)

First Test I cut the array in half; no problems, uses 80MB, no crash.

Second Test I split the array into two separate arrays that together still contain all the data; uses 160MB, no crash.

Third Test Since Firefox said it ran out of stack, the problem is probably that it can't parse the array all at once. I created two separate arrays, arr1 and arr2, then did arr3 = arr1.concat(arr2); it ran fine and used only slightly more memory, around 165MB.

Fourth Test I created 7 of those arrays (22MB each) and concatenated them to test browser limits. It takes about 10 seconds for the page to finish loading. Memory goes up to 1.3GB, then drops back down to 500MB. So yes, Chrome can handle it. It just can't parse it all at once, because the parser apparently uses some kind of recursion, as the console's error message suggests.

Answer Create separate arrays (less than 20MB each) and then concat them. Each array should be in its own var statement, instead of doing multiple declarations with a single var.
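The answer in miniature (tiny chunks here instead of the ~20MB ones):

```javascript
// Each chunk in its own `var` statement; a single combined
// `var a = [...], b = [...]` declaration still crashed the parser.
var arr0 = ['a', 'b'];
var arr1 = ['c', 'd'];
var arr2 = ['e', 'f'];
// The join happens at runtime, after each small literal has been parsed.
var bigArray = arr0.concat(arr1, arr2);
```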

I would still consider fetching only the necessary part, since this much data may make the browser sluggish. However, if it's an internal tool, this should be fine.

Last point: You're not at maximum memory levels, just max parsing levels.
