Swift saving multiple images to the filesystem at once, High CPU

Question
I have the following func; I need to clear out all the picture columns in a DB and move the images to the file system. When I did this all in one go, it used too much memory and crashed, so I switched to a recursive function that does the writes in batches of 20.
There are about 6 tables I need to do this for. There are 2.5 GB of data in my Realm DB; this shrinks to 40 MB after I call my 6 recursive functions, taking the images out and compacting Realm.
I can see very high memory usage as my functions are called, and phones with less RAM would not be able to handle it.
How can I free up memory and CPU in between each function call?
public static func clearEqCatPics() {
    let docsDir = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
    let eqcatPicDir = docsDir.appendingPathComponent(util_Constants.DIR_EQCAT_PICS)
    do {
        var realm: Realm? = try Realm()
        let predicate = NSPredicate(format: "icon != %@", "")
        let categories = realm!.objects(STD_EQ_category.self)
            .filter(predicate)
            .sorted(by: [SortDescriptor(keyPath: "displayorder", ascending: true),
                         SortDescriptor(keyPath: "_id", ascending: true)])
        if categories.count > 0 {
            realm?.beginWrite()
            let upper = categories.count > 20 ? 20 : categories.count
            var actualCounter = upper
            for _ in 0..<upper {
                autoreleasepool {
                    if let proPicData = Data(base64Encoded: categories[actualCounter - 1].icon,
                                             options: .ignoreUnknownCharacters) {
                        let filename = eqcatPicDir.appendingPathComponent(categories[actualCounter - 1]._id.description + ".jpg")
                        (proPicData as NSData).writeToURL2(named: filename, completion: { (result, url) -> Void in
                        })
                        categories[actualCounter - 1].icon = ""
                    } else {
                        categories[actualCounter - 1].icon = ""
                    }
                }
                actualCounter -= 1
            }
            try realm?.commitWrite()
            let eqcatNew = realm!.objects(STD_EQ_category.self).filter(predicate)
            print("$$$$$$$$$$$$$$$$$$$$ 2. eqcatNew COUNT : \(eqcatNew.count) $$$$$$$$$$$$$$$$$$$$")
            realm = nil
            if eqcatNew.count > 0 {
                clearEqCatPics()
            }
        }
        realm = nil
    } catch let error as NSError {
        print("error realm \(error.localizedDescription)")
    }
}
where writeToURL2 is:
I needed to get rid of the weak self in my extension because, for multiple items, execution never got past the guard let and those writes were being skipped:
extension NSData {
    func writeToURL2(named: URL, completion: @escaping (_ result: Bool, _ url: NSURL?) -> Void) {
        let tmpURL = named as NSURL
        // [weak self] and its guard were removed: with a weak self, items
        // whose closure ran after deallocation never got past the guard
        // and their writes were skipped.
        DispatchQueue.global(qos: .background).async { () -> Void in
            self.write(to: tmpURL as URL, atomically: true)
            var error: NSError?
            if tmpURL.checkResourceIsReachableAndReturnError(&error) {
                print("We have it")
                completion(true, tmpURL)
            } else {
                print("We don't have it: \(error?.localizedDescription ?? "unknown error")")
                completion(false, tmpURL)
            }
        }
    }
}
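For comparison, here is a sketch of what this helper might look like built on Data rather than NSData; the writeAsync name and the .utility QoS are my own choices, not from the original code. Because Data.write(to:options:) throws, a failed write is reported directly through the catch instead of a reachability check after the fact.

```swift
import Foundation

// Hypothetical Data-based replacement for writeToURL2 (the name writeAsync
// is illustrative). The throwing write reports failure directly, so no
// post-write reachability check is needed.
extension Data {
    func writeAsync(to url: URL, completion: @escaping (Bool, URL) -> Void) {
        DispatchQueue.global(qos: .utility).async {
            do {
                // Atomic write: the file appears fully written or not at all.
                try self.write(to: url, options: .atomic)
                completion(true, url)
            } catch {
                print("write failed: \(error.localizedDescription)")
                completion(false, url)
            }
        }
    }
}
```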
EDIT:

I changed the writeToURL call in my for loop to the following:
do {
    try proPicData.write(to: filename, options: [.atomic])
} catch let err as NSError {
    print("err : \(err.localizedDescription)")
}
It helped with memory, but sometimes I get Thread 1: EXC_BAD_ACCESS pointing at the line try proPicData.write....
I still have very high CPU usage. Is there any way to reduce CPU usage in between each function call?
Answer
You are fetching all of the objects in your Realm at the same time; that is what is using up so much memory:
let categories = realm!.objects(STD_EQ_category.self)
    .filter(predicate)
    .sorted(by: [SortDescriptor(keyPath: "displayorder", ascending: true),
                 SortDescriptor(keyPath: "_id", ascending: true)])
Writing to the files is not what is using up the memory, although it might have something to do with the CPU usage.
I would recommend putting a limit on the categories fetch so that you don't load all categories and their images into memory. Otherwise, try to come up with a fetch predicate that limits the data in a sensible way.
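Setting the Realm specifics aside, the limited-batch idea itself can be sketched with plain Foundation: take at most a fixed-size slice per pass, drain it inside an autoreleasepool so temporaries can be reclaimed, and repeat until nothing is left. The processBatches function and its parameter names are illustrative, not part of any API.

```swift
import Foundation

// Illustrative, Realm-free sketch of batch-limited processing: handle at
// most `batchSize` items per pass instead of materializing everything.
// Returns the number of passes it took to drain the collection.
func processBatches<T>(_ items: inout [T], batchSize: Int, handle: (T) -> Void) -> Int {
    var passes = 0
    while !items.isEmpty {
        passes += 1
        autoreleasepool {
            // Only this slice is touched in the current pass; temporaries
            // created by `handle` can be released before the next pass.
            let batch = items.prefix(batchSize)
            batch.forEach(handle)
            items.removeFirst(batch.count)
        }
    }
    return passes
}
```

With Realm, the analogous move is to iterate only `results.prefix(batchSize)` inside each write transaction; since `Results` is lazily evaluated, only those objects (and their `icon` blobs) get materialized per pass.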