Is there a limit on the size of a string in JSON with Node.js?

Question

A section of my Node.js application involves receiving a string as input from the user and storing it in a JSON file. JSON itself obviously has no limit on this, but is there any upper bound on the amount of text that Node can process into JSON?

Note that I am not using MongoDB or any other technology for the actual insertion - this is native stringification and saving to a .json file using fs.

Answer

V8 (the JavaScript engine node is built upon) until very recently had a hard limit on heap size of about 1.9 GB.

Node v0.10 is stuck on an older version of V8 (3.14) due to breaking V8 API changes around native addons. Node 0.12 will update to the newest V8 (3.26), which will break many native modules, but opens the door for the 1.9 GB heap limit to be raised.

So as it stands, a single node process can keep no more than 1.9 GB of JavaScript code, objects, strings, etc combined. That means the maximum length of a string is under 1.9 GB.

You can get around this by using Buffers, which store data outside of the V8 heap (but still in your process's heap). A 64-bit build of node can pretty much fill all your RAM as long as you never have more than 1.9 GB of data in JavaScript variables.
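A quick sketch of the distinction: Node reports Buffer memory separately from the JavaScript heap, so you can see that a large Buffer allocation does not count against the heap limit.

```javascript
// Buffers are allocated outside the V8 heap, so a large allocation
// shows up under `external` in process.memoryUsage(), not `heapUsed`.
const big = Buffer.alloc(64 * 1024 * 1024); // 64 MB, off the V8 heap

const { heapUsed, external } = process.memoryUsage();
// The 64 MB Buffer is accounted for as external memory, not JS heap.
console.log(external >= big.length);
```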

All that said, you should never come anywhere near this limit. When dealing with this much data, you must deal with it as a stream. You should never have more than a few megabytes (at most) in memory at one time. The good news is node is especially well-suited to dealing with streaming data.

You should ask yourself some questions:

  • What kind of data are you actually receiving from the user?
  • Why do you want to store it in JSON format?
  • Is it really a good idea to stuff gigabytes into JSON? (The answer is no.)
  • What will happen with the data later, after it is stored? Will your code read it? Something else?

The question you've posted is actually quite vague about what you're trying to accomplish. For more specific advice, update your question with more information.

If you expect the data to never be all that big, just throw a reasonable limit of 10 MB or something on the input, buffer it all, and use JSON.stringify.
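That approach can be sketched as follows. The names (`bufferInput`, `MAX_BYTES`) are illustrative, not from the answer; `stream` is any Readable, such as an incoming HTTP request.

```javascript
const MAX_BYTES = 10 * 1024 * 1024; // a reasonable cap, e.g. 10 MB

// Collect a Readable stream into a single string, rejecting input that
// exceeds `limit` so one oversized upload can't exhaust memory.
function bufferInput(stream, limit = MAX_BYTES) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    let size = 0;
    stream.on('data', (chunk) => {
      const buf = Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk);
      size += buf.length;
      if (size > limit) {
        stream.destroy();
        return reject(new Error('input exceeds size limit'));
      }
      chunks.push(buf);
    });
    stream.on('end', () => resolve(Buffer.concat(chunks).toString('utf8')));
    stream.on('error', reject);
  });
}
```

Once the input is buffered, `JSON.stringify` the result and write it out, e.g. `fs.promises.writeFile('input.json', JSON.stringify({ text }))`.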

If you expect to deal with any larger data, you need to stream the input straight to disk. Look into transform streams if you need to process or modify the data before it goes to disk. For example, there are modules that deal with streaming JSON.
