UIImageJPEGRepresentation received memory warning


Question

I receive a memory warning when using UIImageJPEGRepresentation; is there any way to avoid this? It doesn't crash the app, but I'd like to avoid it if possible. It also intermittently fails to run [[UIApplication sharedApplication] openURL:url];

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = info[UIImagePickerControllerOriginalImage];
    NSData *imageToUpload = UIImageJPEGRepresentation(image, 1.0);

    // code that sends the image to a web service (omitted)
    // on success from the service
    // this sometimes does not get run; I assume it has to do with the memory warning?
    [[UIApplication sharedApplication] openURL:url];
}

Answer

Using UIImageJPEGRepresentation (in which you are round-tripping the asset through a UIImage) can be problematic, because using a compressionQuality of 1.0, the resulting NSData can actually be considerably larger than the original file. (Plus, you're holding a second copy of the image in the UIImage.)

For example, I just picked a random image from my iPhone's photo library: the original asset was 1.5 MB, but the NSData produced by UIImageJPEGRepresentation with a compressionQuality of 1.0 required 6.2 MB. And holding the image in a UIImage can itself take even more memory (because, uncompressed, it can require, for example, four bytes per pixel).

Instead, you can get the original asset using ALAssetRepresentation's getBytes:fromOffset:length:error: method:

static NSInteger kBufferSize = 1024 * 10;

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSURL *url = info[UIImagePickerControllerReferenceURL];

    [self.library assetForURL:url resultBlock:^(ALAsset *asset) {
        ALAssetRepresentation *representation = [asset defaultRepresentation];
        long long remaining = representation.size;
        NSString *filename  = representation.filename;

        long long representationOffset = 0ll;
        NSError *error;
        NSMutableData *data = [NSMutableData data];

        uint8_t buffer[kBufferSize];

        while (remaining > 0ll) {
            NSUInteger bytesRetrieved = [representation getBytes:buffer fromOffset:representationOffset length:sizeof(buffer) error:&error];
            if (bytesRetrieved == 0) {
                NSLog(@"failed getBytes: %@", error);
                return;
            } else {
                remaining -= bytesRetrieved;
                representationOffset += bytesRetrieved;
                [data appendBytes:buffer length:bytesRetrieved];
            }
        }

        // you can now use the `NSData`

    } failureBlock:^(NSError *error) {
        NSLog(@"assetForURL error = %@", error);
    }];
}

This avoids staging the image in a UIImage, and the resulting NSData can be (for photos, anyway) considerably smaller. Note that this approach has a further advantage: it also preserves the metadata associated with the image.

By the way, while the above represents a significant memory improvement, there is an even more dramatic opportunity: rather than loading the entire asset into an NSData at once, you can stream the asset (subclass NSInputStream to use this getBytes routine to fetch bytes as they're needed, instead of loading the whole thing into memory at one time). There are some annoyances involved in this process (see BJ Homer's article on the topic), but if you're looking for a dramatic reduction in memory footprint, that's the way to do it. There are a couple of approaches (BJ's, staging the data in a temporary file and streaming from that, etc.), but the key is that streaming can dramatically reduce your memory footprint.
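The core of any such streaming approach is the same chunked-read loop, sketched here in plain C (stream_file and the demo consumer are hypothetical names for illustration; a real iOS implementation would wrap ALAssetRepresentation's getBytes:fromOffset:length:error: in an NSInputStream subclass instead of using fread). The point is that only one fixed-size buffer is ever resident, no matter how large the asset is:

```c
#include <stdio.h>

/* Demo consumer: just counts the bytes it is handed. A real consumer
 * would write each chunk to a socket or an upload body instead. */
static long g_counted = 0;
static void count_consumer(const char *chunk, size_t len)
{
    (void)chunk;
    g_counted += (long)len;
}

/* Stream the file at `path` to `consume` in fixed-size chunks, so only
 * one buffer's worth of data is in memory at a time.
 * Returns the total number of bytes streamed, or -1 on error. */
static long stream_file(const char *path,
                        void (*consume)(const char *chunk, size_t len))
{
    enum { kChunkSize = 10 * 1024 };  /* mirrors the 10 KB buffer above */
    char buffer[kChunkSize];
    FILE *fp = fopen(path, "rb");
    if (fp == NULL)
        return -1;

    long total = 0;
    size_t n;
    while ((n = fread(buffer, 1, sizeof(buffer), fp)) > 0) {
        consume(buffer, n);           /* hand off; nothing accumulates here */
        total += (long)n;
    }
    fclose(fp);
    return total;
}
```

Compare this with the getBytes loop above: that version appends every chunk to an NSMutableData (so the whole asset still ends up in memory once), whereas here each chunk is handed off and discarded, which is what keeps the footprint flat.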

But by avoiding UIImage and UIImageJPEGRepresentation (which avoids both the memory taken up by the image and the larger NSData that UIImageJPEGRepresentation yields), you might be able to make considerable headway. Also, you might want to make sure that you don't have redundant copies of this image data in memory at one time (e.g. don't load the image data into an NSData and then build a second NSData for the HTTPBody ... see if you can do it in one fell swoop). And if worst comes to worst, you can pursue streaming approaches.
