How can I perform hardware-accelerated H.264 encoding and decoding for streaming?

Problem description

I am able to get the RGBA frame data from the camera, and I want to encode it in H.264 format. I've used FFmpeg to encode and decode H.264 video, but at a frame size of 640x480 it's too slow for my needs.

I'd like to use hardware acceleration to speed up the encoding and decoding, so how would I do that?

Also, I need to be able to stream the encoded video across the network and decode it on the other end. How can this be done?

Recommended answer

If you want to do hardware-accelerated video encoding and decoding of H.264 video on iOS, the only way to go is AVFoundation. Don't use third-party libraries for the encoding or decoding, because they all currently are CPU-bound, and are much slower than what you get from AVFoundation. The one reason to use a third-party encoder or decoder would be if you are working with a format not supported in iOS by default.

For hardware-accelerated decoding, you'll want to use an AVAssetReader instance (or one of the player classes for pure playback). With an AVAssetReader, I regularly get 2X or higher playback speeds for reading H.264-encoded video, so the iOS devices use some pretty good hardware acceleration for that.
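As a rough illustration of that decoding path, here is a minimal Swift sketch that reads decoded frames from an H.264 file through AVAssetReader. The file URL, the choice of BGRA output buffers, and the frame-handling stub are assumptions for the example, not code taken from this answer.

```swift
import AVFoundation

// Minimal sketch: pull hardware-decoded frames from an H.264 movie via AVAssetReader.
func readFrames(from url: URL) throws {
    let asset = AVAsset(url: url)
    guard let videoTrack = asset.tracks(withMediaType: .video).first else { return }

    let reader = try AVAssetReader(asset: asset)

    // Ask for BGRA pixel buffers from the decoder (assumed format for this sketch).
    let outputSettings: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ]
    let trackOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: outputSettings)
    reader.add(trackOutput)
    reader.startReading()

    // Pull sample buffers as fast as the decoder delivers them.
    while let sampleBuffer = trackOutput.copyNextSampleBuffer() {
        if let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
            // Process the decoded CVPixelBuffer here (e.g. upload it as a texture).
            _ = pixelBuffer
        }
    }
}
```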

Similarly, for accelerated encoding, you'll use an AVAssetWriter. There are some tricks to getting AVAssetWriter to encode at the best speed (feeding in BGRA frames, using a pixel buffer pool, using the iOS 5.0 texture caches if reading from OpenGL ES), which I describe in detail within this answer.
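For the encoding side, here is a minimal Swift sketch along the lines of those tips: BGRA source buffers fed through an AVAssetWriterInputPixelBufferAdaptor (whose pixel buffer pool you would reuse). The output URL, dimensions, and frame timing are assumptions for illustration only.

```swift
import AVFoundation

// Minimal sketch: set up hardware H.264 encoding with AVAssetWriter and a
// pixel buffer adaptor that accepts BGRA frames.
func makeWriter(outputURL: URL, width: Int, height: Int) throws
    -> (AVAssetWriter, AVAssetWriterInputPixelBufferAdaptor) {

    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)

    let videoSettings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: width,
        AVVideoHeightKey: height
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)
    input.expectsMediaDataInRealTime = true

    // BGRA source buffers, matching the fast path described above.
    let sourceAttributes: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
        kCVPixelBufferWidthKey as String: width,
        kCVPixelBufferHeightKey as String: height
    ]
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input,
        sourcePixelBufferAttributes: sourceAttributes)

    writer.add(input)
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)
    return (writer, adaptor)
}

// Appending a frame: ideally a buffer drawn from adaptor.pixelBufferPool,
// filled with BGRA data, then appended with its presentation time.
func append(frame pixelBuffer: CVPixelBuffer,
            at time: CMTime,
            to adaptor: AVAssetWriterInputPixelBufferAdaptor) {
    if adaptor.assetWriterInput.isReadyForMoreMediaData {
        adaptor.append(pixelBuffer, withPresentationTime: time)
    }
}
```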

If you want to see some code that uses the fastest paths I've found for accelerated encoding and decoding, you can look at my open source GPUImage framework, which, unlike the one linked by Anastasia, is totally free to use.
