How to use streams to JSON stringify large nested objects in Node.js?
Question
I have a large JavaScript object that I want to convert to JSON and write to a file. I thought I could do this using streams, like so:
var fs = require('fs');
var JSONStream = require('JSONStream');

var st = JSONStream.stringifyObject()
  .pipe(fs.createWriteStream('./output_file.js'));
st.write(large_object);
When I try this I get an error:
stream.js:94
throw er; // Unhandled stream error in pipe.
^
TypeError: Invalid non-string/buffer chunk
at validChunk (_stream_writable.js:153:14)
at WriteStream.Writable.write (_stream_writable.js:182:12)
So apparently I can't just write an object to this stringifyObject. I'm not sure what the next step is. Do I need to convert the object to a buffer? Run the object through some conversion stream and pipe it to stringifyObject?
Answer
JSONStream doesn't work that way, and since your large object is already loaded into memory, streaming it buys you nothing anyway.
var fs = require('fs-extra');
var file = '/tmp/this/path/does/not/exist/file.txt';

// outputJson stringifies the object and also creates any
// missing parent directories before writing.
fs.outputJson(file, {name: 'JP'}, function (err) {
  console.log(err); // => null
});
That will write out the JSON.
If you want to use JSONStream you could do something like this:
var fs = require('fs');
var jsonStream = require('JSONStream');

var fl = fs.createWriteStream('dat.json');
var out = jsonStream.stringifyObject();
out.pipe(fl);

var obj = { test: 10, ok: true };
// stringifyObject expects [key, value] pairs, one write per property.
for (var key in obj) out.write([key, obj[key]]);
out.end();