iOS - CVPixelBufferCreate memory cannot be released correctly when making images into video


Problem description

I am making images into a video, but it always crashes with a memory warning because of too much allocation in CVPixelBufferCreate. I don't know how to handle it correctly. I've seen a lot of similar topics, and none of them solved my problem.

Here is my code:

- (void) writeImagesArray:(NSArray*)array asMovie:(NSString*)path
{
    NSError *error  = nil;
    UIImage *first = [array objectAtIndex:0];
    CGSize frameSize = first.size;
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                                  [NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
                                                              error:&error];
    NSParameterAssert(videoWriter);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithDouble:frameSize.width],AVVideoWidthKey,
                                   [NSNumber numberWithDouble:frameSize.height], AVVideoHeightKey,
                                   nil];

    AVAssetWriterInput* writerInput = [AVAssetWriterInput
                                       assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings];

    self.adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                     sourcePixelBufferAttributes:nil];

    [videoWriter addInput:writerInput];

    //Start Session
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    int frameCount = 0;
    CVPixelBufferRef buffer = NULL;
    for(UIImage *img in array)
    {
        buffer = [self newPixelBufferFromCGImage:[img CGImage] andFrameSize:frameSize];
        if (self.adaptor.assetWriterInput.readyForMoreMediaData)
        {
            CMTime frameTime =  CMTimeMake(frameCount,FPS);
            [self.adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
        }
        if(buffer)
            CVPixelBufferRelease(buffer);

        frameCount++;
    }

    [writerInput markAsFinished];
    [videoWriter finishWritingWithCompletionHandler:^{

        if (videoWriter.status == AVAssetWriterStatusFailed) {

            NSLog(@"Movie save failed.");

        }else{

            NSLog(@"Movie saved.");
        }
    }];

    NSLog(@"Finished.");
}


- (CVPixelBufferRef)newPixelBufferFromCGImage: (CGImageRef) image andFrameSize:(CGSize)frameSize
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];

    CVPixelBufferRef pxbuffer = NULL;

    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          frameSize.width,
                                          frameSize.height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef) options,
                                          &pxbuffer);

    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGBitmapInfo bitmapInfo = (CGBitmapInfo) kCGImageAlphaNoneSkipFirst;
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata,
                                                 frameSize.width,
                                                 frameSize.height,
                                                 8,
                                                 4*frameSize.width,
                                                 rgbColorSpace,
                                                 bitmapInfo);

    NSParameterAssert(context);
    CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
                                           CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}
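As an aside to the code above (this is not from the original post): the loop silently drops a frame whenever `readyForMoreMediaData` is `NO`, and pushing frames synchronously is what blocks the calling thread. AVFoundation offers a pull-style alternative for exactly this situation. A minimal sketch, assuming the same `FPS` constant and an `images` array as in the question:

```objc
// Sketch only: let the writer input pull frames on a background queue
// instead of pushing them in a tight loop on the main thread.
dispatch_queue_t queue = dispatch_queue_create("video.writer", DISPATCH_QUEUE_SERIAL);
__block NSInteger frameCount = 0;
[writerInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
    // The block is re-invoked whenever the input can accept more data,
    // so no frame is dropped and the main thread stays free.
    while (writerInput.readyForMoreMediaData && frameCount < images.count) {
        UIImage *img = images[frameCount];
        CVPixelBufferRef buffer =
            [self newPixelBufferFromCGImage:[img CGImage] andFrameSize:frameSize];
        [self.adaptor appendPixelBuffer:buffer
                   withPresentationTime:CMTimeMake((int)frameCount, FPS)];
        CVPixelBufferRelease(buffer);
        frameCount++;
    }
    if (frameCount >= images.count) {
        [writerInput markAsFinished];
        [videoWriter finishWritingWithCompletionHandler:^{ /* ... */ }];
    }
}];
```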

Update:

I split my video into small segments. After adding a [NSThread sleepForTimeInterval:0.00005]; in the loop, the memory was just magically released.

But this makes my UI freeze for a few seconds because of that line. Is there a better solution?

for(UIImage *img in array)
{
    buffer = [self newPixelBufferFromCGImage:[img CGImage] andFrameSize:frameSize];
    //CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, adaptor.pixelBufferPool, &buffer);
    if (adaptor.assetWriterInput.readyForMoreMediaData)
    {
        CMTime frameTime =  CMTimeMake(frameCount,FPS);
        [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
    }

    if(buffer)
        CVPixelBufferRelease(buffer);

    frameCount++;

    [NSThread sleepForTimeInterval:0.00005];
}
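The commented-out `CVPixelBufferPoolCreatePixelBuffer` line above points at what is usually the cleaner fix: once `startWriting` has been called, the adaptor exposes a `pixelBufferPool`, and recycling buffers from that pool avoids allocating a fresh pixel buffer per frame. A hedged sketch (not the poster's code), which assumes the adaptor was created with non-nil `sourcePixelBufferAttributes` specifying `kCVPixelFormatType_32ARGB`, since the pool is `NULL` otherwise:

```objc
// Sketch: reuse a buffer from the adaptor's pool instead of CVPixelBufferCreate.
// adaptor.pixelBufferPool is only non-NULL after -startWriting and only if
// sourcePixelBufferAttributes were supplied when creating the adaptor.
CVPixelBufferRef buffer = NULL;
CVReturn status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                     adaptor.pixelBufferPool,
                                                     &buffer);
if (status == kCVReturnSuccess && buffer != NULL) {
    CVPixelBufferLockBaseAddress(buffer, 0);
    // ... draw the CGImage into CVPixelBufferGetBaseAddress(buffer),
    //     exactly as newPixelBufferFromCGImage:andFrameSize: does ...
    CVPixelBufferUnlockBaseAddress(buffer, 0);
    [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
    CVPixelBufferRelease(buffer);  // returns the buffer to the pool
}
```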

Here is the memory:

Recommended answer

From a quick review of your code, I can't see anything wrong in the management of the CVBuffer itself.
What I think could be the source of your issue is the array of UIImages.
UIImage has this behavior: until you request the CGImage property or draw the image, the attached image data is not decoded into memory, so the memory impact of unused images is low.
Your enumeration calls the CGImage property on each image and you never get rid of them, which can explain the continuous increase in memory allocation.
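Building on that diagnosis (this pattern is not spelled out in the answer itself, but is the usual remedy): wrapping each loop iteration in `@autoreleasepool` lets the decoded image data and other per-frame temporaries be freed as soon as the frame has been appended, instead of accumulating until the whole method returns:

```objc
// Sketch: drain per-frame temporaries eagerly inside the writing loop.
for (UIImage *img in array) {
    @autoreleasepool {
        // Anything autoreleased here (including decoded image backing data)
        // is released at the end of this iteration, not at method exit.
        CVPixelBufferRef buffer =
            [self newPixelBufferFromCGImage:[img CGImage] andFrameSize:frameSize];
        if (self.adaptor.assetWriterInput.readyForMoreMediaData) {
            [self.adaptor appendPixelBuffer:buffer
                       withPresentationTime:CMTimeMake(frameCount, FPS)];
        }
        if (buffer)
            CVPixelBufferRelease(buffer);
        frameCount++;
    }
}
```

This also removes the need for the `sleepForTimeInterval:` workaround, which only worked by giving the run loop a chance to drain the pool.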

