Are the Core Image filters in iOS 5.0 fast enough for realtime video processing?


Question

Now that Apple has ported the Core Image framework over to iOS 5.0, I'm wondering: is Core Image fast enough to apply live filters and effects to camera video?

Also, what would be a good starting point to learn the Core Image framework for iOS 5.0?

Answer

Now that Core Image has been out on iOS for a while, we can talk about some hard performance numbers. I created a benchmark application as part of the testing for my GPUImage framework, and profiled the performance of raw CPU-based filters, Core Image filters, and GPUImage filters with live video feeds. The following were the times (in milliseconds) each took to apply a single gamma filter on a 640x480 video frame from the iPhone's camera (for two different hardware models running two different OS versions):

             iPhone 4 (iOS 5)   | iPhone 4S (iOS 6)
------------------------------------------------
CPU          458 ms (2.2 FPS)     183 ms (5.5 FPS)
Core Image   106 ms (6.7 FPS)     8.2 ms (122 FPS)
GPUImage     2.5 ms (400 FPS)     1.8 ms (555 FPS)

For Core Image, this translates into a maximum of 9.4 FPS for a simple gamma filter on iPhone 4, but well over 60 FPS for the same on an iPhone 4S. This is about the simplest Core Image filter case you can set up, so performance will certainly vary with more complex operations. This would seem to indicate that Core Image cannot do live processing fast enough to match the iPhone's camera rate on the iPhone 4 running iOS 5, but as of iOS 6, it processes video more than fast enough to do live filtering on iPhone 4S and above.
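To make the realtime threshold concrete, the per-frame times above can be checked against the camera's frame budget. This is a back-of-the-envelope illustration (not part of the original benchmark code), assuming a nominal 30 FPS camera feed:

```python
# Rough check: a filter keeps up with a live camera feed only if its
# per-frame processing time fits inside the camera's frame budget.
# At 30 FPS that budget is 1000 / 30 ≈ 33.3 ms per frame.

def keeps_up(filter_ms: float, camera_fps: float = 30.0) -> bool:
    """True if the per-frame filter time fits inside the frame budget."""
    frame_budget_ms = 1000.0 / camera_fps
    return filter_ms <= frame_budget_ms

# Per-frame times from the table above:
print(keeps_up(106.0))  # Core Image, iPhone 4 (iOS 5)  -> False (drops frames)
print(keeps_up(8.2))    # Core Image, iPhone 4S (iOS 6) -> True
print(keeps_up(2.5))    # GPUImage,   iPhone 4          -> True
```

This is why the 106 ms iPhone 4 result fails realtime processing while the 8.2 ms iPhone 4S result clears it with room to spare.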

The source for these benchmarks can be found in my GitHub repository, if you wish to see where I got these numbers from.

I've updated this answer from my original, which was too critical of Core Image's performance. The sepia tone filter I was using as a basis of comparison was not performing the same operation as my own, so it was a poor benchmark. The performance of Core Image filters also improved significantly in iOS 6, which helped make them more than fast enough to process live video on iPhone 4S and up. Also, I've since found several cases, like large-radius blurs, where Core Image significantly outperforms my GPUImage framework.

Previous answer, for posterity:

As with any performance-related question, the answer will depend on the complexity of your filters, the image size being filtered, and the performance characteristics of the device you're running on.

Because Core Image has been available for a while on the Mac, I can point you to the Core Image Programming Guide as a resource for learning the framework. I can't comment on the iOS-specific elements, given the NDA, but I highly recommend watching the video for WWDC 2011 Session 422 - Using Core Image on iOS and Mac OS X.

Core Image (mostly) uses the GPU for image processing, so you could look at how fast OpenGL ES 2.0 shaders handle image processing on existing devices. I did some work in this area recently, and found that the iPhone 4 could do 60 FPS processing using a simple shader on realtime video being fed in at 480 x 320. You could download my sample application there and attempt to customize the shader and/or video input size to determine if your particular device could handle this processing at a decent framerate. Core Image may add a little overhead, but it also has some clever optimizations for how it organizes filter chains.
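For reference, the gamma adjustment being benchmarked is a very cheap per-pixel operation: each normalized color channel is raised to a power (in GLSL, roughly pow(color, vec3(gamma))). As an illustration only, here is the same math on the CPU in Python for a single RGB pixel; the shader and Core Image versions apply this in parallel across every pixel on the GPU:

```python
# Gamma adjustment is a per-channel power function on normalized values:
#   out = in ** gamma, with channels in the range [0.0, 1.0].
# A fragment shader evaluates this once per pixel; this CPU version is
# purely illustrative of the math, not how you'd filter video in practice.

def gamma_adjust(pixel, gamma=2.2):
    """Apply a gamma curve to an (r, g, b) pixel with channels in [0.0, 1.0]."""
    return tuple(channel ** gamma for channel in pixel)

mid_gray = (0.5, 0.5, 0.5)
print(gamma_adjust(mid_gray))  # each channel becomes 0.5 ** 2.2 ≈ 0.218
```

Because the per-pixel work is this trivial, the benchmark above is really measuring each framework's overhead (texture uploads, filter-chain setup, readback) rather than the arithmetic itself.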

The slowest compatible devices out there would be the iPhone 3G S and the 3rd generation iPod touch, but they're not that much slower than the iPhone 4. The iPad 2 blows them all away with its massive fragment processing power.
