How can I do fast image processing from the iPhone camera?


Problem description

I am trying to write an iPhone application which will do some real-time camera image processing. I used the example presented in the AVFoundation docs as a starting point: setting up a capture session, making a UIImage from the sample buffer data, then drawing that image via -setNeedsDisplay, which I call on the main thread.
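
For reference, a minimal sketch of that capture path, following the standard AVFoundation sample-buffer-to-UIImage pattern; the property and view names (currentImage, previewView) are illustrative rather than taken from the question:

    // Convert a camera sample buffer into a UIImage, assuming the video data output
    // is configured for kCVPixelFormatType_32BGRA.
    - (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
    {
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);

        void *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
        size_t width = CVPixelBufferGetWidth(pixelBuffer);
        size_t height = CVPixelBufferGetHeight(pixelBuffer);

        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                     colorSpace,
                                                     kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGImageRef quartzImage = CGBitmapContextCreateImage(context);

        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);

        UIImage *image = [UIImage imageWithCGImage:quartzImage];
        CGImageRelease(quartzImage);
        return image;
    }

    // Capture delegate callback: build the UIImage and redraw on the main thread.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.currentImage = image;          // hypothetical property drawn in -drawRect:
            [self.previewView setNeedsDisplay]; // the call where much of the time is spent
        });
    }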

This works, but it is fairly slow (50 ms per frame, measured between -drawRect: calls, for a 192 x 144 preset) and I've seen applications on the App Store which work faster than this.
About half of my time is spent in -setNeedsDisplay.

How can I speed up this image processing?

Recommended answer

As Steve points out, in my answer here I encourage people to look at OpenGL ES for the best performance when processing and rendering images to the screen from the iPhone's camera. The reason for this is that using Quartz to continually update a UIImage onto the screen is a fairly slow way to send raw pixel data to the display.

If possible, I encourage you to look to OpenGL ES to do your actual processing, because of how well-tuned GPUs are for this kind of work. If you need to maintain OpenGL ES 1.1 compatibility, your processing options are much more limited than with 2.0's programmable shaders, but you can still do some basic image adjustment.
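
As an illustration of the kind of basic adjustment an ES 2.0 fragment shader makes easy, here is a hedged sketch of a per-pixel saturation tweak applied to the camera texture; the varying and uniform names are assumptions, not taken from any particular sample project:

    // GLSL fragment shader (OpenGL ES 2.0), stored as a string for compilation at runtime.
    static NSString *const kSaturationFragmentShader =
        @"varying highp vec2 textureCoordinate;\n"
        @"uniform sampler2D inputImageTexture;\n"
        @"uniform lowp float saturation;\n"
        @"\n"
        @"const mediump vec3 luminanceWeighting = vec3(0.2125, 0.7154, 0.0721);\n"
        @"\n"
        @"void main()\n"
        @"{\n"
        @"    lowp vec4 color = texture2D(inputImageTexture, textureCoordinate);\n"
        @"    lowp float luminance = dot(color.rgb, luminanceWeighting);\n"
        @"    gl_FragColor = vec4(mix(vec3(luminance), color.rgb, saturation), color.a);\n"
        @"}";

Under ES 1.1 you are restricted to what the fixed-function pipeline can express, which is why the programmable 2.0 path is preferable where the hardware supports it.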

Even if you're doing all of your image processing using the raw data on the CPU, you'll still be much better off by using an OpenGL ES texture for the image data, updating that with each frame. You'll see a jump in performance just by switching to that rendering route.
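
A minimal sketch of that rendering route, assuming a 32BGRA pixel buffer and a texture and EAGLContext created elsewhere; cameraTexture and drawTexturedQuad are illustrative names, and error handling is omitted:

    // Upload the latest camera frame into an existing OpenGL ES texture instead of
    // going through UIImage and Quartz.
    - (void)uploadFrameToTexture:(CMSampleBufferRef)sampleBuffer
    {
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);

        int width  = (int)CVPixelBufferGetWidth(pixelBuffer);
        int height = (int)CVPixelBufferGetHeight(pixelBuffer);

        glBindTexture(GL_TEXTURE_2D, cameraTexture);  // texture created once, elsewhere
        // Re-specify the texture contents from the frame's raw BGRA bytes
        // (GL_BGRA comes from Apple's BGRA texture-format extension on iOS).
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                     GL_BGRA, GL_UNSIGNED_BYTE, CVPixelBufferGetBaseAddress(pixelBuffer));

        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

        [self drawTexturedQuad];  // hypothetical method: draw a full-screen quad and present
    }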

(Update: 2/18/2012) As I describe in my update to the above-linked answer, I've made this process much easier with my new open source GPUImage framework. This handles all of the OpenGL ES interaction for you, so you can just focus on applying the filters and other effects that you'd like to on your incoming video. It's anywhere from 5-70X faster than doing this processing using CPU-bound routines and manual display updates.
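
For illustration, a sketch of roughly what the GPUImage route looks like; the filter choice and session preset are placeholders, so check the project's README for the current API:

    #import "GPUImage.h"

    // Camera -> filter -> on-screen view, all handled on the GPU by the framework.
    GPUImageVideoCamera *videoCamera =
        [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                            cameraPosition:AVCaptureDevicePositionBack];
    videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

    GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
    GPUImageView *filteredView = (GPUImageView *)self.view; // assumes the view is a GPUImageView

    [videoCamera addTarget:sepiaFilter];
    [sepiaFilter addTarget:filteredView];

    [videoCamera startCameraCapture];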
