JDBC Batch Insert OutOfMemoryError


Question


I have written a method insert() in which I am trying to use JDBC Batch for inserting half a million records into a MySQL database:

public void insert(int nameListId, String[] names) {
    String sql = "INSERT INTO name_list_subscribers (name_list_id, name, date_added)" +
                 " VALUES (?, ?, NOW())";
    Connection conn = null;
    PreparedStatement ps = null;

    try {
        conn = getConnection();
        ps = conn.prepareStatement(sql);

        for (String s : names) {
            ps.setInt(1, nameListId);
            ps.setString(2, s);
            ps.addBatch();
        }

        ps.executeBatch();

    } catch (SQLException e) {
        throw new RuntimeException(e);
    } finally {
        closeDbResources(ps, null, conn);
    }
}


But whenever I try to run this method, I get the following error:

java.lang.OutOfMemoryError: Java heap space
    com.mysql.jdbc.ServerPreparedStatement$BatchedBindValues.<init>(ServerPreparedStatement.java:72)
    com.mysql.jdbc.ServerPreparedStatement.addBatch(ServerPreparedStatement.java:330)
    org.apache.commons.dbcp.DelegatingPreparedStatement.addBatch(DelegatingPreparedStatement.java:171)


If I replace ps.addBatch() with ps.executeUpdate() and remove ps.executeBatch(), it works fine, though it takes some time. Please let me know whether using a batch is appropriate in this situation, and if it is, why it gives an OutOfMemoryError.

Thanks

Answer


addBatch and executeBatch give you the mechanism to perform batch inserts, but you still need to drive the batching yourself.


If you simply pile every statement into the same batch, as you are doing, then you'll run out of memory. You need to execute and clear the batch every n records. The value of n is up to you; JDBC can't make that decision for you. The larger the batch size, the faster things will go, but if it's too large you'll starve memory and things will slow down or fail. It depends on how much memory you have.


Start off with a batch size of 1000, for example, and experiment with different values from there.

final int batchSize = 1000;
int count = 0;
for (String s : names) {
    ps.setInt(1, nameListId);
    ps.setString(2, s);
    ps.addBatch();

    if (++count % batchSize == 0) {
        ps.executeBatch();
        ps.clearBatch(); // not sure if this is necessary
    }
}
ps.executeBatch(); // flush the last few records
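
For completeness, here is a minimal sketch of the original insert() method with the chunked batching folded in. It assumes the same getConnection() and closeDbResources() helpers from the question, and batchSize is just a starting point to tune against your heap.

public void insert(int nameListId, String[] names) {
    String sql = "INSERT INTO name_list_subscribers (name_list_id, name, date_added)" +
                 " VALUES (?, ?, NOW())";
    final int batchSize = 1000; // starting point; experiment as described above

    Connection conn = null;
    PreparedStatement ps = null;
    try {
        conn = getConnection();
        ps = conn.prepareStatement(sql);

        int count = 0;
        for (String s : names) {
            ps.setInt(1, nameListId);
            ps.setString(2, s);
            ps.addBatch();

            // Flush every batchSize records so the driver never holds
            // all half a million bound rows in memory at once.
            if (++count % batchSize == 0) {
                ps.executeBatch();
                ps.clearBatch();
            }
        }
        ps.executeBatch(); // flush any remaining records
    } catch (SQLException e) {
        throw new RuntimeException(e);
    } finally {
        closeDbResources(ps, null, conn);
    }
}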
