Applying Effect to iPhone Camera Preview "Video"


Problem Description

My goal is to write a custom camera view controller that:


  1. Can take photos in all four interface orientations, using the front camera if available.

  2. Properly rotates and scales the preview "video" as well as the full-resolution photo.

  3. Allows (simple) effects to be applied to both the preview "video" and the full-resolution photo.

Implementation (on iOS 4.2 / Xcode 3.2.5):

Due to requirement (3), I needed to drop down to AVFoundation.

I started with Technical Q&A QA1702 and made these changes:


  1. Changed the sessionPreset to AVCaptureSessionPresetPhoto.

  2. Added an AVCaptureStillImageOutput as an additional output before starting the session.
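
A sketch of how those two changes might fit into the QA1702-style setup. This is not the asker's actual code; the method and ivar handling are assumptions, and error handling is minimal:

```objc
// Hypothetical setupCaptureSession with the two changes applied (sketch only).
- (void)setupCaptureSession
{
    // In real code, store the session in an ivar so it isn't leaked/deallocated.
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Change (1): photo preset, so the still output captures full resolution.
    session.sessionPreset = AVCaptureSessionPresetPhoto;

    NSError *error = nil;
    AVCaptureDevice *device =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (input) [session addInput:input];

    // Video data output delivers the preview frames to the delegate.
    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    dispatch_queue_t queue = dispatch_queue_create("previewQueue", NULL);
    [videoOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);
    [session addOutput:videoOutput];
    [videoOutput release];

    // Change (2): add the still image output before starting the session.
    AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
    [session addOutput:stillOutput];
    [stillOutput release];

    [session startRunning];
}
```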

The issue that I am having is with the performance of processing the preview image (a frame of the preview "video").

First, I get the UIImage result of imageFromSampleBuffer: on the sample buffer from captureOutput:didOutputSampleBuffer:fromConnection:. Then, I scale and rotate it for the screen using a Core Graphics context.
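
One way that scale-and-rotate step might look (a sketch, not the asker's code): draw the UIImage into an image context at the target size, rotated to match the interface orientation.

```objc
// Hypothetical helper: scale and rotate an image for on-screen display.
- (UIImage *)scaledRotatedImage:(UIImage *)image
                         toSize:(CGSize)size
                          angle:(CGFloat)radians
{
    UIGraphicsBeginImageContext(size);
    CGContextRef ctx = UIGraphicsGetCurrentContext();

    // Rotate about the center of the target rect.
    CGContextTranslateCTM(ctx, size.width / 2, size.height / 2);
    CGContextRotateCTM(ctx, radians);
    CGContextTranslateCTM(ctx, -size.width / 2, -size.height / 2);

    [image drawInRect:CGRectMake(0, 0, size.width, size.height)];

    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}
```

Each preview frame therefore does a full redraw into a bitmap, which is the per-frame cost the question is measuring.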

At this point, the frame rate is already under the 15 FPS specified in the session's video output, and when I add in the effect it drops to around 10 or below. The app quickly crashes due to low memory.

I have had some success with dropping the frame rate to 9 FPS on the iPhone 4 and 8 FPS on the iPod Touch (4th gen).

I have also added in some code to "flush" the dispatch queue, but I am not sure how much it is actually helping. Basically, every 8-10 frames, a flag is set that signals captureOutput:didOutputSampleBuffer:fromConnection: to return right away rather than process the frame. The flag is reset after a sync operation on the output dispatch queue finishes.
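
The flag scheme described above might be sketched as follows. skipFrames, frameCount, and outputQueue are hypothetical ivars (outputQueue being the serial queue passed to setSampleBufferDelegate:queue:); this approximates the asker's scheme rather than reproducing it:

```objc
// Sketch: drop frames periodically to let the serial output queue drain.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (skipFrames) {
        return; // drop frames while the queue drains
    }

    if (++frameCount % 9 == 0) {
        skipFrames = YES;
        // The queue is serial, so this block runs only after every callback
        // already enqueued has returned; then processing resumes.
        dispatch_async(outputQueue, ^{
            skipFrames = NO;
        });
        return;
    }

    // ... per-frame processing (convert, scale, rotate, apply effect) ...
}
```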

At this point I don't even mind the low frame rates, but obviously we can't ship with the low memory crashes. Anyone have any idea how to take action to prevent the low memory conditions in this case (and/or a better way to "flush" the dispatch queue)?

Recommended Answer

To prevent the memory issues, simply create an autorelease pool in captureOutput:didOutputSampleBuffer:fromConnection:.

This makes sense since imageFromSampleBuffer: returns an autoreleased UIImage object. Plus it frees up any autoreleased objects created by image processing code right away.

// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput 
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
fromConnection:(AVCaptureConnection *)connection
{ 
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

    < Add your code here that uses the image >

    [pool release];
}

My testing has shown that this will run without memory warnings on an iPhone 4 or iPod Touch (4th gen) even if requested FPS is very high (e.g. 60) and image processing is very slow (e.g. 0.5+ secs).

Older Solution

As Brad pointed out, Apple recommends image processing be on a background thread so as to not interfere with the UI responsiveness. I didn't notice much lag in this case, but best practices are best practices, so use the above solution with autorelease pool instead of running this on the main dispatch queue / main thread.

To prevent the memory issues, simply use the main dispatch queue instead of creating a new one.

This also means that you don't have to switch to the main thread in captureOutput:didOutputSampleBuffer:fromConnection: when you want to update the UI.
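
For example, with the delegate running on the main queue, the preview view can be updated directly in the callback (imagePreviewView here is a hypothetical UIImageView ivar):

```objc
// Delegate runs on the main queue, so UIKit calls are safe here.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    imagePreviewView.image = image; // no performSelectorOnMainThread: needed
}
```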

In setupCaptureSession, change FROM:

// Configure your output.
dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);

TO:

// we want our dispatch to be on the main thread
[output setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

