How to handle loading of LARGE JSON files

Problem description

I've been working on a WebGL application that requires a tremendous amount of point data to draw to the screen. Currently, that point and surface data is stored on the webserver I am using, and ALL 120MB of JSON is being downloaded by the browser on page load. This takes well over a minute on my network, which is not optimal. I was wondering if anyone has any experience/tips about loading data this large. I've tried eliminating as much whitespace as possible, but that barely made a dent in file size.

Is there any way to either compress this file immensely, or otherwise a better way to download such a large amount of data? Any help would be great!

Recommended answer

A few things you might consider:

1. Is your server compressing the file before sending? (See the compression sketch after this list.)

2. Does this data change often? If it does not, you could set your expires header to a very long time, so the browser can keep it in cache. It won't help on the first page visit, but on subsequent ones the file won't have to be downloaded again. (See the caching sketch after this list.)

3. Is there a lot of repeating stuff in your JSON file? For instance, if your object keys are long, you could replace them with shorter ones, send, and replace them again in the browser. The benefit will not be as great if the file is compressed (see item 1), but depending on your file it might help a little. (See the key-expansion sketch after this list.)

4. Is all this data consumed by the browser at once? If it's not, you could try breaking it down into smaller pieces and start processing the first parts while the others load. (See the chunked-loading sketch after this list.)
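
Regarding item 1, here is a minimal sketch of serving the file with gzip enabled, assuming a Node/Express server with the `compression` middleware; the port and the `public/` directory are placeholders for your own setup.

```typescript
import express from "express";
import compression from "compression";

const app = express();

// Gzip every compressible response above ~1 KB; plain-text JSON
// usually shrinks dramatically under gzip.
app.use(compression({ threshold: 1024 }));

// Serve the static data files (e.g. public/points.json) compressed.
app.use(express.static("public"));

app.listen(8080, () => console.log("listening on :8080"));
```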
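
For item 2, a sketch of long-lived caching with the same hypothetical Express setup; versioning the filename (e.g. points.v2.json) lets you invalidate the cache whenever the data does change.

```typescript
import express from "express";

const app = express();

// Tell the browser it may keep the files for a year without revalidating.
app.use(
  express.static("public", {
    maxAge: "365d",  // sets Cache-Control: max-age
    immutable: true, // skip revalidation while the cached entry is fresh
  })
);

app.listen(8080);
```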
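
For item 3, a sketch of expanding shortened keys back into long ones in the browser; the key map and the /points.min.json URL are hypothetical.

```typescript
// Hypothetical mapping from the short keys sent over the wire
// to the long keys the rest of the application expects.
const KEY_MAP: Record<string, string> = {
  x: "positionX",
  y: "positionY",
  z: "positionZ",
  c: "color",
};

// Restore the long keys after the download finishes.
function expandKeys(points: Record<string, unknown>[]): Record<string, unknown>[] {
  return points.map((p) => {
    const expanded: Record<string, unknown> = {};
    for (const [short, long] of Object.entries(KEY_MAP)) {
      if (short in p) expanded[long] = p[short];
    }
    return expanded;
  });
}

// Usage: const points = expandKeys(await (await fetch("/points.min.json")).json());
```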
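
For item 4, a sketch of fetching the data as several smaller files and handing each chunk to the renderer as soon as it arrives; the chunk URLs and the processChunk callback are hypothetical.

```typescript
async function loadInChunks(
  chunkCount: number,
  processChunk: (points: number[]) => void
): Promise<void> {
  for (let i = 0; i < chunkCount; i++) {
    const res = await fetch(`/points/chunk-${i}.json`);
    if (!res.ok) throw new Error(`chunk ${i} failed: ${res.status}`);
    // Hand the chunk off immediately (e.g. upload it to a GPU buffer)
    // instead of waiting for the whole data set to arrive.
    processChunk(await res.json());
  }
}

// Usage: await loadInChunks(12, (pts) => addPointsToScene(pts));
```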

But most importantly: are you sure JSON is the right tool for this job? A general-purpose compression tool can only go so far, but if you exploit the particular characteristics of your data you might be able to achieve better results. If you give more details on the format you're using, we may be able to help you more.
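
As one possible alternative to JSON for point data, here is a sketch of downloading the coordinates as raw 32-bit floats and passing them straight to WebGL; the /points.f32 file is hypothetical and assumed to contain floats in the platform's (little-endian) byte order.

```typescript
async function loadPointBuffer(gl: WebGLRenderingContext): Promise<WebGLBuffer> {
  const res = await fetch("/points.f32");
  // View the raw bytes as 32-bit floats; no JSON parsing step at all.
  const positions = new Float32Array(await res.arrayBuffer());

  const buffer = gl.createBuffer();
  if (!buffer) throw new Error("failed to create WebGL buffer");
  gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
  gl.bufferData(gl.ARRAY_BUFFER, positions, gl.STATIC_DRAW);
  return buffer;
}
```

Binary floats are much more compact than the same numbers written out as decimal text, and they avoid parsing 120 MB of JSON in the browser entirely.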
