How to apply "filters" to AVCaptureVideoPreviewLayer

Problem description

My app is currently using AVFoundation to take the raw camera data from the rear camera of an iPhone and display it on an AVCaptureVideoPreviewLayer in real time.

My goal is to conditionally apply simple image filters to the preview layer. The images aren't saved, so I do not need to capture the output. For example, I would like to toggle a setting that converts the video coming in on the preview layer to Black & White.

I found a question here that seems to accomplish something similar by capturing the individual video frames in a buffer, applying the desired transformations, then displaying each frame as a UIImage. For several reasons, this seems like overkill for my project and I'd like to avoid any performance issues this may cause.
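For context, the approach from that question boils down to something like the sketch below (shown here in modern Swift, using Core Image for the filter; the class name `FilteredPreview` and the use of `CIPhotoEffectMono` are just illustrative choices, not from that question):

```swift
import AVFoundation
import UIKit

// Receives each camera frame, filters it on the CPU/Core Image side,
// and pushes the result into a UIImageView -- one UIImage per frame.
class FilteredPreview: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let imageView = UIImageView()
    let context = CIContext()

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Black-and-white via a built-in Core Image filter.
        let filtered = CIImage(cvPixelBuffer: pixelBuffer)
            .applyingFilter("CIPhotoEffectMono", parameters: [:])
        guard let cgImage = context.createCGImage(filtered, from: filtered.extent) else { return }
        DispatchQueue.main.async {
            self.imageView.image = UIImage(cgImage: cgImage)
        }
    }
}
```

Doing this per frame allocates a new image every callback, which is part of why it feels like overkill for a live preview.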

Is this the only way to accomplish my goal?

As I mentioned, I am not looking to capture any of the AVCaptureSession's video, merely preview it.

Answer

Probably the most performant way of handling this would be to use OpenGL ES for filtering and display of these video frames. You won't be able to do much with an AVCaptureVideoPreviewLayer directly, aside from adjusting its opacity when overlaid with another view or layer.

I have a sample application here where I grab frames from the camera and apply OpenGL ES 2.0 shaders to process the video in real time for display. In this application (explained in detail here), I was using color-based filtering to track objects in the camera view, but others have modified this code to do some neat video processing effects. All GPU-based filters in this application that display to the screen run at 60 FPS on my iPhone 4.
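For example, the Black & White conversion you describe comes down to a tiny OpenGL ES 2.0 fragment shader that runs once per pixel on the GPU. A minimal sketch (the `textureCoordinate` and `inputImageTexture` names are just common conventions, assumed here, not tied to any particular project):

```glsl
varying highp vec2 textureCoordinate;   // interpolated coordinate from the vertex shader
uniform sampler2D inputImageTexture;    // the camera frame, uploaded as a texture

void main()
{
    lowp vec4 color = texture2D(inputImageTexture, textureCoordinate);
    // Weighted average of the channels (Rec. 709 luminance weights)
    lowp float luminance = dot(color.rgb, vec3(0.2126, 0.7152, 0.0722));
    gl_FragColor = vec4(vec3(luminance), color.a);
}
```

Because the whole filter is a few GPU instructions per pixel, this kind of shader easily keeps up with the camera's frame rate.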

The only iOS device out there that supports video, yet doesn't have an OpenGL ES 2.0 capable GPU, is the iPhone 3G. If you need to target that device as well, you might be able to take the base code for video capture and generation of OpenGL ES textures, and then use the filter code from Apple's GLImageProcessing sample application. That application is built around OpenGL ES 1.1, support for which is present on all iOS devices.

However, I highly encourage looking at the use of OpenGL ES 2.0 for this, because you can pull off many more kinds of effects using shaders than you can with the fixed-function OpenGL ES 1.1 pipeline.

(Edit: 2/13/2012) As an update on the above, I've now created an open source framework called GPUImage that encapsulates this kind of custom image filtering. It also handles capturing video and displaying it to the screen after being filtered, requiring as few as six lines of code to set all of this up. For more on the framework, you can read my more detailed announcement.
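As a rough sketch of what that setup looks like (written here in Swift against GPUImage's Objective-C API; the class names `GPUImageVideoCamera`, `GPUImageGrayscaleFilter`, and `GPUImageView` come from the framework, but treat the exact bridged signatures as illustrative):

```swift
import UIKit
import AVFoundation
import GPUImage

class CameraViewController: UIViewController {
    var videoCamera: GPUImageVideoCamera!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Capture from the rear camera at 640x480.
        videoCamera = GPUImageVideoCamera(sessionPreset: AVCaptureSession.Preset.vga640x480.rawValue,
                                          cameraPosition: .back)
        videoCamera.outputImageOrientation = .portrait

        // The Black & White filter from the question.
        let filter = GPUImageGrayscaleFilter()

        // OpenGL ES-backed view that displays the filtered frames.
        let filteredView = GPUImageView(frame: view.bounds)
        view.addSubview(filteredView)

        // Wire camera -> filter -> view, then start capturing.
        videoCamera.addTarget(filter)
        filter.addTarget(filteredView)
        videoCamera.startCapture()
    }
}
```

Toggling the filter off is just a matter of re-targeting the camera directly at the view.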
