Memory leak with large Core Data batch insert in Swift

Question

I am inserting tens of thousands of objects into my Core Data entity. I have a single NSManagedObjectContext and I call save() on it every time I add an object. It works, but while it runs, memory usage climbs from about 27 MB to 400 MB, and it stays at 400 MB even after the import finishes.

There are a number of SO questions about batch insert and everyone says to read Efficiently Importing Data, but it's in Objective-C and I am having trouble finding real examples in Swift that solve this problem.
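For reference, the per-object-save pattern the question describes looks roughly like this (a minimal sketch; "MyEntity", the attribute names, the items array, and the AppDelegate.managedObjectContext property are placeholders, not code from the question):

import UIKit
import CoreData

// Sketch of the problematic approach: one save() per inserted object.
// Assumes the classic Xcode template where AppDelegate exposes a
// managedObjectContext; `items` stands in for the parsed source data.
let context = (UIApplication.sharedApplication().delegate as! AppDelegate).managedObjectContext
let items: [(whatever: String, whoever: String, whenever: NSDate)] = [] // parsed input

for item in items {
    let newObject = NSEntityDescription.insertNewObjectForEntityForName("MyEntity",
        inManagedObjectContext: context) as! MyManagedObject
    newObject.attribute1 = item.whatever
    newObject.attribute2 = item.whoever
    newObject.attribute3 = item.whenever
    try! context.save() // one save per object: the context keeps every
                        // inserted object registered, so memory balloons
}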

Solution

There are a few things you should change:

  • Create a separate NSPrivateQueueConcurrencyType managed object context and do your inserts asynchronously in it.
  • Don't save after inserting every single entity object. Insert your objects in batches and then save each batch. A batch size might be something like 1000 objects.
  • Use autoreleasepool and reset() to release the in-memory objects after each batch is inserted and saved.

Here is how this might work:

import UIKit
import CoreData

// Use a private-queue context so the heavy import work stays off the main thread.
let managedObjectContext = NSManagedObjectContext(concurrencyType: .PrivateQueueConcurrencyType)
managedObjectContext.persistentStoreCoordinator = (UIApplication.sharedApplication().delegate as! AppDelegate).persistentStoreCoordinator // or wherever your coordinator is

managedObjectContext.performBlock { // runs asynchronously on the context's private queue

    // Fetch each batch outside the autoreleasepool closure: you cannot
    // `break` out of the surrounding loop from inside a closure, so the
    // loop ends when getNextBatchOfObjects() returns nil.
    while let array = getNextBatchOfObjects() {

        autoreleasepool {
            for item in array {
                let newObject = NSEntityDescription.insertNewObjectForEntityForName("MyEntity",
                    inManagedObjectContext: managedObjectContext) as! MyManagedObject
                newObject.attribute1 = item.whatever
                newObject.attribute2 = item.whoever
                newObject.attribute3 = item.whenever
            }
        }

        // Save only once per batch, not once per object.
        do {
            try managedObjectContext.save()
        } catch {
            print(error)
        }

        // Drop the context's strong references to the saved objects so
        // their memory can be reclaimed.
        managedObjectContext.reset()
    }
}

Applying these principles kept my memory usage low and also made the mass insert faster.
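The answer leaves getNextBatchOfObjects() undefined. One hypothetical way to implement it, assuming the source records were parsed into an in-memory array beforehand (the SourceItem type and allItems array are assumptions, not part of the original answer):

import Foundation

// Placeholder source record. Any plain value type carrying the fields you
// copy into the entity works the same way.
struct SourceItem {
    let whatever: String
    let whoever: String
    let whenever: NSDate
}

let allItems: [SourceItem] = [] // assumption: filled from your parsed input
let batchSize = 1000            // matches the batch size suggested above
var offset = 0

// Returns the next slice of source records, or nil when exhausted so the
// `while let` loop above terminates.
func getNextBatchOfObjects() -> [SourceItem]? {
    guard offset < allItems.count else { return nil }
    let end = min(offset + batchSize, allItems.count)
    let batch = Array(allItems[offset..<end])
    offset = end
    return batch
}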

Further reading

  • Efficiently Importing Data (the old Apple docs link is broken; if you can find it, please help me add it)
  • Core Data Performance
  • Core Data (General Assembly post)

Update

The above answer has been completely rewritten. Thanks to @Mundi and @MartinR in the comments for pointing out a mistake in my original answer. And thanks to @JodyHagins in this answer for helping me understand and solve the problem.
