Working with extra large JSON files: always running out of memory


Problem description

How do I use the Jackson streaming API for JSON? See my code below:

    ObjectMapper mapper = new ObjectMapper();

    Map<String, Object> map = new HashMap<String, Object>();

    List<Object> list = new ArrayList<Object>();

    // Get images in database
    try {
            Class.forName(DRIVER);
            connection = DriverManager.getConnection(URL, USER, PASSWORD);

            Statement s = connection.createStatement();
            ResultSet r = s.executeQuery("select * from images");

            while (r.next()) {

                byte[] imageBytes = r.getBytes("image");
                String imageBase64 = DatatypeConverter.printBase64Binary(imageBytes);
                list.add(imageBase64);
            }

    } catch (SQLException e) {

    }

    map.put("images", list);

    // Stream Json API
    try {
            mapper.writeValue(new File("c:\\images.json"), map);
    } catch (JsonGenerationException e) {
            e.printStackTrace();
    } catch (JsonMappingException e) {
            e.printStackTrace();
    } catch (IOException e) {
            e.printStackTrace();
    }

It always runs out of memory. I don't know how to use streaming with Jackson. I am working with extra large JSON: on average 2000 images, each one stored as a Base64 string. What am I doing wrong?

Recommended answer

Instead of keeping all images in memory, just read and write them incrementally. An example of the Jackson Streaming API can be found here ("Reading and Writing Event Streams").

This shouldn't be too hard to figure out, folks... but here's a skeletal example:

    // typed from memory, some methods may be off a bit
    JsonFactory f = objectMapper.getFactory();
    JsonGenerator gen = f.createGenerator(new File("c:\\images.json"), JsonEncoding.UTF8);
    gen.writeStartArray(); // to get an array of objects
    // get the DB connection etc
    while (r.next()) {
        gen.writeStartObject();
        gen.writeFieldName("image");
        InputStream in = r.getBinaryStream("image");
        gen.writeBinary(in, -1); // length is optional for JSON; Jackson Base64-encodes the stream
        in.close();
        gen.writeEndObject();
    }
    gen.writeEndArray();
    gen.close();

And that should do the trick.
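
The "Reading and Writing Event Streams" entry linked above also covers the reading side. As a complement (not part of the original answer), here is a minimal sketch of parsing such a file back incrementally with JsonParser, assuming the array-of-objects layout written above; the class name and the output file names are just placeholders:

    import com.fasterxml.jackson.core.JsonFactory;
    import com.fasterxml.jackson.core.JsonParser;
    import com.fasterxml.jackson.core.JsonToken;

    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.OutputStream;

    public class ImageExporter {
        public static void main(String[] args) throws Exception {
            JsonFactory f = new JsonFactory();
            try (JsonParser p = f.createParser(new File("c:\\images.json"))) {
                if (p.nextToken() != JsonToken.START_ARRAY) {
                    throw new IllegalStateException("Expected a JSON array");
                }
                int i = 0;
                // one object per image: { "image": "<base64>" }
                while (p.nextToken() == JsonToken.START_OBJECT) {
                    while (p.nextToken() != JsonToken.END_OBJECT) {
                        String field = p.getCurrentName();
                        p.nextToken(); // advance to the field value
                        if ("image".equals(field)) {
                            // decode the Base64 value straight to disk, one image at a time
                            try (OutputStream out = new FileOutputStream("image-" + (i++) + ".bin")) {
                                p.readBinaryValue(out);
                            }
                        } else {
                            p.skipChildren(); // ignore any other fields
                        }
                    }
                }
            }
        }
    }

This way neither the writer nor the reader ever holds more than one image in memory at a time.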

