How do I convert the live video feed from the iPhone camera to grayscale?
Question
How would I take the live frames from the iPhone camera, convert them to grayscale, and then display them on the screen in my application?
Answer
To expand upon what Tommy said, you'll want to use AVFoundation in iOS 4.0 to capture the live camera frames. However, I'd recommend using OpenGL directly to do the image processing because you won't be able to achieve realtime results on current hardware otherwise.
For OpenGL ES 1.1 devices, I'd look at using Apple's GLImageProcessing sample application as a base (it has an OpenGL greyscale filter within it) and running your live video frames through that.
For OpenGL ES 2.0, you might want to use a programmable shader to achieve this effect. I show how to process live iPhone camera data through various filters in this sample application using shaders, with a writeup on how that works here.
In my benchmarks, the iPhone 4 can do this kind of processing at 60 FPS using programmable shaders.