How to Convert CMSampleBuffer/UIImage into ffmpeg's AVPicture?
Question
I'm trying to encode the iPhone's camera frames into an H.264 video using ffmpeg's libav* libraries. I found in this Apple article how to convert a CMSampleBuffer to a UIImage, but how can I convert it to ffmpeg's AVPicture?
Thanks.
Answer
Answering my own question:
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, 0);

// Access the raw pixel data (CVPixelBufferGetWidth/Height return size_t)
int width  = (int)CVPixelBufferGetWidth(pixelBuffer);
int height = (int)CVPixelBufferGetHeight(pixelBuffer);
unsigned char *rawPixelBase = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);

// Fill in the AVFrame while the base address is still locked.
// Note: avpicture_fill() assumes a row stride of width * 4 bytes; if
// CVPixelBufferGetBytesPerRow() reports padding, the rows must be copied.
AVFrame *pFrame = avcodec_alloc_frame();
avpicture_fill((AVPicture *)pFrame, rawPixelBase, PIX_FMT_RGB32, width, height);

// ... encode or copy the frame here, before unlocking ...

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
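For a packed format like PIX_FMT_RGB32, avpicture_fill() does essentially no conversion: it points data[0] at the existing buffer and sets linesize[0] = width * 4. A minimal stand-alone sketch of that logic (plain C, no ffmpeg dependency; the struct and function names here are illustrative, not ffmpeg's own):

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative stand-in for the packed-32-bit case of avpicture_fill(). */
typedef struct {
    uint8_t *data[4];   /* plane pointers; packed formats use only data[0] */
    int linesize[4];    /* bytes per row of each plane */
} PackedPicture;

/* Point the picture at an existing 4-bytes-per-pixel buffer.
   Returns the number of bytes the buffer must hold. */
static int packed_rgb32_fill(PackedPicture *pic, uint8_t *buf,
                             int width, int height) {
    pic->data[0] = buf;
    pic->data[1] = pic->data[2] = pic->data[3] = NULL;
    pic->linesize[0] = width * 4;  /* one packed plane, no padding */
    pic->linesize[1] = pic->linesize[2] = pic->linesize[3] = 0;
    return pic->linesize[0] * height;
}
```

This is also why the pixel formats match up: PIX_FMT_RGB32 is defined in native byte order, which on little-endian iOS hardware means B, G, R, A in memory, i.e. the same layout as kCVPixelFormatType_32BGRA.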
Now pFrame is filled with the contents of the sample buffer, which uses the pixel format kCVPixelFormatType_32BGRA.
This solved my issue. Thanks.