Is it possible using video as texture for GL in iOS?


Question

Is it possible to use video (pre-rendered, compressed with H.264) as a texture for GL in iOS?

If so, how is it done? And are there any playback quality/frame-rate limitations?

Answer

As of iOS 4.0, you can use AVCaptureDeviceInput to get the camera as a device input and connect it to an AVCaptureVideoDataOutput with any object you like set as the delegate. By setting a 32bpp BGRA format for the camera, the delegate object will receive each frame in a format just perfect for handing immediately to glTexImage2D (or glTexSubImage2D if the device doesn't support non-power-of-two textures; I think the MBX devices fall into this category).
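
As a rough illustration of that pipeline (this is a sketch, not Apple's sample code): it assumes an EAGLContext has already been made current on the main thread, and the class and ivar names are invented for the example. Error handling, session teardown, and the row-padding case discussed below are omitted.

    #import <AVFoundation/AVFoundation.h>
    #import <OpenGLES/ES2/gl.h>
    #import <OpenGLES/ES2/glext.h>

    // Hypothetical controller; only sketches the capture -> glTexImage2D path.
    @interface VideoTextureController : NSObject
        <AVCaptureVideoDataOutputSampleBufferDelegate>
    - (BOOL)startCapture;
    @end

    @implementation VideoTextureController {
        AVCaptureSession *_session;
        GLuint _textureName;
    }

    - (BOOL)startCapture {
        // A texture to receive the frames, with NPOT-safe parameters.
        glGenTextures(1, &_textureName);
        glBindTexture(GL_TEXTURE_2D, _textureName);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

        _session = [[AVCaptureSession alloc] init];
        AVCaptureDevice *camera =
            [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        NSError *error = nil;
        AVCaptureDeviceInput *input =
            [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
        if (!input) return NO;
        [_session addInput:input];

        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        // 32bpp BGRA, so each frame can go straight to glTexImage2D.
        output.videoSettings =
            @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
        // Main queue here only so GL calls land on the thread owning the context.
        [output setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
        [_session addOutput:output];
        [_session startRunning];
        return YES;
    }

    // Called once per captured frame.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

        glBindTexture(GL_TEXTURE_2D, _textureName);
        // GL_BGRA_EXT comes from the APPLE_texture_format_BGRA8888 extension.
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
                     (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
                     (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
                     0, GL_BGRA_EXT, GL_UNSIGNED_BYTE,
                     CVPixelBufferGetBaseAddress(pixelBuffer));

        CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
        // ...trigger a redraw of the textured quad here.
    }
    @end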

There are a bunch of frame size and frame rate options; at a guess, you'll have to tweak those depending on how much else you want to use the GPU for. I found that a completely trivial scene with just a textured quad showing the latest frame, redrawn only exactly when a new frame arrived, was able to display the iPhone 4's maximum 720p 24fps feed without any noticeable lag. I haven't performed any more thorough benchmarking than that, so hopefully someone else can advise.

In principle, per the API, frames can come back with some in-memory padding between scanlines, which would mean shuffling the contents around before posting them off to GL, so you do need to implement a code path for that. In practice, speaking purely empirically, the current version of iOS appears never to return images in that form, so it isn't really a performance issue.
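
A minimal sketch of that defensive path, written as a standalone helper (the function name and structure are mine, not from any sample). It assumes the pixel buffer is 32bpp BGRA, its base address is already locked, and the destination texture is already bound:

    #import <CoreVideo/CoreVideo.h>
    #import <OpenGLES/ES2/gl.h>
    #import <OpenGLES/ES2/glext.h>
    #include <stdlib.h>
    #include <string.h>

    // Upload a locked 32bpp BGRA pixel buffer to the currently bound texture,
    // repacking scanlines only when CoreVideo reports padding between rows.
    static void UploadBGRAPixelBuffer(CVPixelBufferRef pixelBuffer) {
        size_t width       = CVPixelBufferGetWidth(pixelBuffer);
        size_t height      = CVPixelBufferGetHeight(pixelBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
        const uint8_t *src = CVPixelBufferGetBaseAddress(pixelBuffer);

        if (bytesPerRow == width * 4) {
            // Rows are tightly packed; hand the buffer to GL as-is.
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width,
                         (GLsizei)height, 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE, src);
        } else {
            // Rows are padded; ES 2.0 has no GL_UNPACK_ROW_LENGTH, so copy
            // each scanline into a tight buffer before uploading.
            uint8_t *packed = malloc(width * 4 * height);
            for (size_t row = 0; row < height; row++) {
                memcpy(packed + row * width * 4, src + row * bytesPerRow,
                       width * 4);
            }
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width,
                         (GLsizei)height, 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE,
                         packed);
            free(packed);
        }
    }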

It's now very close to three years later. In the interim, Apple has released iOS 5, 6 and 7. With iOS 5 they introduced CVOpenGLESTexture and CVOpenGLESTextureCache, which are now the smart way to pipe video from a capture device into OpenGL. Apple supplies sample code here, in which the particularly interesting parts are in RippleViewController.m, specifically its setupAVCapture and captureOutput:didOutputSampleBuffer:fromConnection: (see lines 196–329). Sadly the terms and conditions prevent duplicating that code here without attaching the whole project, but the step-by-step setup (sketched just after this list) is:

  1. create a CVOpenGLESTextureCache (via CVOpenGLESTextureCacheCreate) and an AVCaptureSession;
  2. grab a suitable AVCaptureDevice for video;
  3. create an AVCaptureDeviceInput with that capture device;
  4. attach an AVCaptureVideoDataOutput and tell it to call you as a sample buffer delegate.
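
A sketch of those four steps, assuming a class with ivars _context (an EAGLContext), _session, and _textureCache declared elsewhere. The 720p preset and the bi-planar pixel format are illustrative choices for this general approach, not necessarily the sample's exact values:

    #import <AVFoundation/AVFoundation.h>
    #import <CoreVideo/CVOpenGLESTextureCache.h>

    // Assumed ivars, declared elsewhere on this class:
    //   EAGLContext *_context;
    //   AVCaptureSession *_session;
    //   CVOpenGLESTextureCacheRef _textureCache;
    - (void)setupAVCapture {
        // 1. A texture cache tied to the GL context, plus a capture session.
        CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL,
                                                    _context, NULL, &_textureCache);
        if (err != kCVReturnSuccess) return;
        _session = [[AVCaptureSession alloc] init];
        [_session setSessionPreset:AVCaptureSessionPreset1280x720];

        // 2. A suitable capture device for video.
        AVCaptureDevice *device =
            [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

        // 3. A device input wrapping that device.
        NSError *error = nil;
        AVCaptureDeviceInput *input =
            [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
        if (input) [_session addInput:input];

        // 4. A data output that calls us back with each sample buffer.
        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        // Bi-planar YCbCr, so the cache can expose separate Y and UV textures.
        output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                                  @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) };
        [output setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
        [_session addOutput:output];
        [_session startRunning];
    }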

Upon receiving each sample buffer (see the sketch after this list):

  1. get the CVImageBufferRef from it;
  2. use CVOpenGLESTextureCacheCreateTextureFromImage to get Y and UV CVOpenGLESTextureRefs from the CV image buffer;
  3. get texture targets and names from the CV OpenGLES texture refs in order to bind them;
  4. combine luminance and chrominance in your shader.
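
A sketch of that per-frame path, continuing the same assumed class. The GL_LUMINANCE / GL_LUMINANCE_ALPHA formats and the conversion matrix further down are my assumptions for the bi-planar video-range pixel format chosen above; verify them against whatever format you actually request. Apple's sample follows the same overall shape.

    // Steps 1-3, once per sample buffer. Error handling trimmed for brevity.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        // 1. The CV image buffer behind this sample buffer.
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        GLsizei width  = (GLsizei)CVPixelBufferGetWidth(pixelBuffer);
        GLsizei height = (GLsizei)CVPixelBufferGetHeight(pixelBuffer);

        // 2. Y and UV textures straight from the cache: plane 0 is full-size
        // luminance, plane 1 is interleaved CbCr at half resolution.
        CVOpenGLESTextureRef lumaTexture = NULL, chromaTexture = NULL;
        CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
            _textureCache, pixelBuffer, NULL, GL_TEXTURE_2D,
            GL_LUMINANCE, width, height,
            GL_LUMINANCE, GL_UNSIGNED_BYTE, 0, &lumaTexture);
        CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
            _textureCache, pixelBuffer, NULL, GL_TEXTURE_2D,
            GL_LUMINANCE_ALPHA, width / 2, height / 2,
            GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, 1, &chromaTexture);

        // 3. Bind using the target and name the cache reports.
        glActiveTexture(GL_TEXTURE0);
        glBindTexture(CVOpenGLESTextureGetTarget(lumaTexture),
                      CVOpenGLESTextureGetName(lumaTexture));
        glActiveTexture(GL_TEXTURE1);
        glBindTexture(CVOpenGLESTextureGetTarget(chromaTexture),
                      CVOpenGLESTextureGetName(chromaTexture));
        // (Set GL_LINEAR / GL_CLAMP_TO_EDGE parameters on both; they are
        // non-power-of-two textures.)

        // ...draw the quad, then release the refs and flush for the next frame.
        CFRelease(lumaTexture);
        CFRelease(chromaTexture);
        CVOpenGLESTextureCacheFlush(_textureCache, 0);
    }

And step 4, the shader that combines the planes, kept here as an Objective-C string constant. The matrix assumes BT.601 video range, to match the video-range pixel format above:

    // Step 4: combine luminance and chrominance. The sampler uniforms are
    // bound to texture units 0 and 1 during program setup.
    static NSString *const kYUVFragmentShader = @""
        "varying highp vec2 vTexCoord;\n"
        "uniform sampler2D uLuma;   // unit 0: Y plane\n"
        "uniform sampler2D uChroma; // unit 1: CbCr plane\n"
        "void main() {\n"
        "    mediump vec3 yuv;\n"
        "    yuv.x  = texture2D(uLuma, vTexCoord).r - (16.0 / 255.0);\n"
        "    yuv.yz = texture2D(uChroma, vTexCoord).ra - vec2(0.5);\n"
        "    mediump vec3 rgb = mat3(1.164,  1.164, 1.164,   // column 0\n"
        "                            0.0,   -0.392, 2.017,   // column 1\n"
        "                            1.596, -0.813, 0.0)     // column 2\n"
        "                       * yuv;\n"
        "    gl_FragColor = vec4(rgb, 1.0);\n"
        "}";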
