Decode large stream JSON
Question
I have a massive JSON array stored in a file ("file.json"). I need to iterate through the array and do some operation on each element.
err = json.Unmarshal(dat, &all_data)
This causes an out-of-memory error - I'm guessing because it loads everything into memory first.
Is there a way to stream the JSON element by element?
Answer
There is an example of this sort of thing here: https://golang.org/pkg/encoding/json/#example_Decoder_Decode_stream.
package main

import (
    "encoding/json"
    "fmt"
    "log"
    "strings"
)

func main() {
    const jsonStream = `
    [
        {"Name": "Ed", "Text": "Knock knock."},
        {"Name": "Sam", "Text": "Who's there?"},
        {"Name": "Ed", "Text": "Go fmt."},
        {"Name": "Sam", "Text": "Go fmt who?"},
        {"Name": "Ed", "Text": "Go fmt yourself!"}
    ]
    `
    type Message struct {
        Name, Text string
    }

    dec := json.NewDecoder(strings.NewReader(jsonStream))

    // read open bracket
    t, err := dec.Token()
    if err != nil {
        log.Fatal(err)
    }
    fmt.Printf("%T: %v\n", t, t)

    // while the array contains values
    for dec.More() {
        var m Message
        // decode an array value (Message)
        err := dec.Decode(&m)
        if err != nil {
            log.Fatal(err)
        }
        fmt.Printf("%v: %v\n", m.Name, m.Text)
    }

    // read closing bracket
    t, err = dec.Token()
    if err != nil {
        log.Fatal(err)
    }
    fmt.Printf("%T: %v\n", t, t)
}