Android java: OutOfMemory while inserting a huge List into the database SQL

Problem description

I'm just working on an Android Project where an ORM greendao is used. It allows me to insert a number of entities (which are obviously objects) into the database at once (in one transaction). In practice there's a method

insertOrReplaceInTx(...)

which takes a collection as a parameter, given a List of objects. The problem is that my list has around 12,000 objects, and the insert leads from time to time to an OutOfMemory exception. I'm thinking of a smart way to solve the problem and prevent future OOMs. The only idea that comes to my mind is to split the huge collection into subcollections (let's say 500 elements each) and commit in a loop, making several small commits instead of one huge one. The bad thing about that is that it takes longer to insert the records, and time matters here. In the end I'm not sure whether making commit after commit won't kill the heap anyway. Maybe call sleep() in between to let the GC clear the heap... but that looks like an ugly solution to me.

Any smart ideas on your side? Thanks in advance.
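The splitting idea from the question can be sketched in plain Java. `insertOrReplaceInTx` is greendao's real batch method, but the `myEntityDao`/`daoSession` names and the batch size of 500 are assumptions for illustration:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchInsert {

    /** Split a large list into consecutive chunks of at most batchSize elements. */
    public static <T> List<List<T>> partition(List<T> source, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < source.size(); i += batchSize) {
            batches.add(source.subList(i, Math.min(i + batchSize, source.size())));
        }
        return batches;
    }

    // Hypothetical usage with a greendao DAO (not runnable without greendao):
    //
    // for (List<MyEntity> batch : partition(allEntities, 500)) {
    //     myEntityDao.insertOrReplaceInTx(batch);   // one transaction per batch
    //     daoSession.clear();                       // drop the session cache between batches
    // }
}
```

Each batch commits on its own, so the heap only ever has to hold one batch's worth of cached entities plus the source list itself.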

Recommended answer

I am working on a project using greendao 1.3.1. Some of the tables contain about 200,000 entities (which don't have many properties).

I read the entities from CSV, and to speed things up I developed a small solution, which might also help with your OOM issue.

Explanation:

greendao uses a cache, and after each insert it updates the entity to get the row ID and probably inserts the entity into its cache. On top of that, greendao starts a transaction if you call an insert or update method and there isn't already a transaction. This slows down "bulk" inserts and increases memory usage.

What I did:

To speed things up I started a transaction before doing any inserts. This way greendao will not start a transaction on every insert, and all inserts and updates run in the same transaction, which has the additional benefit of data consistency. You can use code like this:

SQLiteDatabase db = dao.getDatabase();
db.beginTransaction();

try {
    // Do all your inserts and updates here.
    db.setTransactionSuccessful();
} catch (Exception ex) {
    // Don't swallow the exception silently; at least log it.
    Log.e("BulkInsert", "bulk insert failed", ex);
} finally {
    db.endTransaction();
}

But this alone won't help with your OOM problem yet.

Solution 1

If you don't want to mess with the greendao code, you can issue a DaoSession.clear() every once in a while. This is definitely the simpler solution, but it will be less performant than solution 2.
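A minimal sketch of what the periodic clear buys you, using a stand-in for the session. The SessionStub class and the threshold of 500 are assumptions for illustration; the real call is greendao's `DaoSession.clear()`:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class PeriodicClear {

    /** Stand-in for a DaoSession that caches every inserted entity by row ID. */
    static class SessionStub {
        final Map<Long, Object> identityScope = new HashMap<>();
        long nextRowId = 1;

        long insert(Object entity) {
            long rowId = nextRowId++;
            identityScope.put(rowId, entity);  // greendao caches the entity after each insert
            return rowId;
        }

        void clear() {
            identityScope.clear();             // what DaoSession.clear() does to the cache
        }
    }

    /** Inserts all entities, clearing the cache every clearEvery inserts; returns the peak cache size. */
    static int insertAll(SessionStub session, List<Object> entities, int clearEvery) {
        int peakCacheSize = 0;
        int count = 0;
        for (Object e : entities) {
            session.insert(e);
            peakCacheSize = Math.max(peakCacheSize, session.identityScope.size());
            if (++count % clearEvery == 0) {
                session.clear();
            }
        }
        return peakCacheSize;
    }
}
```

With the periodic clear, the cache holds at most 500 entities at a time instead of all 12,000, which is exactly the bound the OOM needs.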

Solution 2

To prevent greendao from updating the entity and inserting it into its cache, you can replace the method private long executeInsert(T entity, SQLiteStatement stmt) in AbstractDao.java with this code:

/**
 * Insert or replace an entity in the table associated with a concrete DAO.
 *
 * @param update whether to update the entity's key and attach it to the session after the insert
 * @return row ID of the newly inserted entity
 */
public long insertOrReplace(T entity, boolean update) {
    return executeInsert(entity, statements.getInsertOrReplaceStatement(), update);
}

private long executeInsert(T entity, SQLiteStatement stmt) {
    return executeInsert(entity, stmt, true);
}

private long executeInsert(T entity, SQLiteStatement stmt, boolean update) {
    long rowId;
    if (db.isDbLockedByCurrentThread()) {
        synchronized (stmt) {
            bindValues(stmt, entity);
            rowId = stmt.executeInsert();
        }
    } else {
        // Do TX to acquire a connection before locking the stmt to avoid deadlocks
        db.beginTransaction();
        try {
            synchronized (stmt) {
                bindValues(stmt, entity);
                rowId = stmt.executeInsert();
            }
            db.setTransactionSuccessful();
        } finally {
            db.endTransaction();
        }
    }
    if (update) {
        updateKeyAfterInsertAndAttach(entity, rowId, true);
    }
    return rowId;
}
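A caller would then pass `update = false` during the bulk insert so the session cache is never touched. The sketch below mimics that behavior with a stub DAO (DaoStub and its fields are assumptions, not greendao classes) to show why skipping `updateKeyAfterInsertAndAttach` keeps the cache empty:

```java
import java.util.HashMap;
import java.util.Map;

public class BulkInsertNoCache {

    /** Stand-in for a DAO with the modified insertOrReplace(T, boolean) from solution 2. */
    static class DaoStub<T> {
        final Map<Long, T> sessionCache = new HashMap<>();
        long nextRowId = 1;

        long insertOrReplace(T entity, boolean update) {
            long rowId = nextRowId++;          // stmt.executeInsert() in the real DAO
            if (update) {
                // updateKeyAfterInsertAndAttach(entity, rowId, true) in greendao:
                // sets the entity's key and attaches it to the identity scope.
                sessionCache.put(rowId, entity);
            }
            return rowId;
        }
    }

    /** Bulk-inserts without touching the cache; returns the number of rows written. */
    static <T> int bulkInsert(DaoStub<T> dao, Iterable<T> entities) {
        int rows = 0;
        // In the real code, wrap this loop in db.beginTransaction()/endTransaction()
        // as shown earlier in the answer.
        for (T e : entities) {
            dao.insertOrReplace(e, false);     // skip the cache update per solution 2
            rows++;
        }
        return rows;
    }
}
```

The trade-off is that entities inserted with `update = false` don't get their row IDs set and aren't attached to the session, so this only suits write-once bulk loads where you reload the data afterwards.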
