Is it possible to get BGR from MediaCodec?


Problem description

As far as I know, MediaCodec returns the output buffer of the image in YUV420, and you can then process it as you like... But in my case I need to convert YUV420 to BGR, which is quite an expensive conversion.

So, the question is: is it possible to get BGR from MediaCodec directly, without a conversion?

Here is my current code:

uint8_t *buf = AMediaCodec_getOutputBuffer(m_data.codec, static_cast<size_t>(status), /*bufsize*/nullptr);

// Wrap the codec buffer as a single-channel Mat: a YUV420 frame is
// height * 1.5 rows of width bytes (luma plane plus subsampled chroma).
cv::Mat YUVframe(cv::Size(m_frameSize.width, static_cast<int>(m_frameSize.height * 1.5)), CV_8UC1, buf);

cv::Mat colImg(m_frameSize, CV_8UC3);
cv::cvtColor(YUVframe, colImg, CV_YUV420sp2BGR, 3);  // the expensive per-pixel step
auto dataSize = colImg.rows * colImg.cols * colImg.channels();
imageData.assign(colImg.data, colImg.data + dataSize);

So, as you can see here

cv::Mat YUVframe(cv::Size(m_frameSize.width, static_cast<int>(m_frameSize.height * 1.5)), CV_8UC1, buf);

I am getting the YUVframe from the codec buffer, and then here

cv::cvtColor(YUVframe, colImg, CV_YUV420sp2BGR, 3);

I perform the conversion.

The conversion takes a lot of time, and performance is critical for me.
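To show where the time goes, here is a minimal scalar NV21-to-BGR conversion using integer BT.601 math (the helper name is hypothetical; `cv::cvtColor` with `CV_YUV420sp2BGR` does essentially this, but vectorized):

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Clamp an intermediate value into the 0..255 byte range.
static uint8_t clamp8(int v) { return static_cast<uint8_t>(std::min(255, std::max(0, v))); }

// Hypothetical helper: convert one NV21 frame (full-size Y plane followed by
// a 2x2-subsampled interleaved V,U plane) to packed BGR.
std::vector<uint8_t> nv21ToBGR(const uint8_t* src, int width, int height) {
    const uint8_t* yPlane  = src;
    const uint8_t* vuPlane = src + static_cast<size_t>(width) * height;
    std::vector<uint8_t> bgr(static_cast<size_t>(width) * height * 3);
    for (int row = 0; row < height; ++row) {
        for (int col = 0; col < width; ++col) {
            int c = yPlane[row * width + col] - 16;
            int vuIdx = (row / 2) * width + (col / 2) * 2;
            int e = vuPlane[vuIdx]     - 128;  // V
            int d = vuPlane[vuIdx + 1] - 128;  // U
            size_t o = (static_cast<size_t>(row) * width + col) * 3;
            bgr[o]     = clamp8((298 * c + 516 * d + 128) >> 8);            // B
            bgr[o + 1] = clamp8((298 * c - 100 * d - 208 * e + 128) >> 8);  // G
            bgr[o + 2] = clamp8((298 * c + 409 * e + 128) >> 8);            // R
        }
    }
    return bgr;
}
```

Even this simple loop does several multiply-adds plus a clamp per output channel for every pixel, which is why the CPU path hurts at video resolutions and frame rates.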

Answer

The frames come out of MediaCodec in whatever format the codec likes to work in. Most digital cameras and video codecs work with YUV format frames. The camera is required to use one of two specific formats when capturing still images, but MediaCodec is more of a free-for-all. If you want the data to be in a specific format, something has to do the conversion.
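A practical first step is to check which YUV layout the decoder actually reports, since the matching `cvtColor` code (NV12 vs NV21 vs I420) depends on it. A sketch, with the relevant `MediaCodecInfo.CodecCapabilities` constant values copied in; the NDK query itself is shown in comments, since it only runs on-device:

```cpp
#include <cstdint>
#include <string>

// Color-format constants from MediaCodecInfo.CodecCapabilities
// (values are fixed by the Android SDK).
constexpr int32_t kColorFormatYUV420Planar     = 19;          // I420 layout
constexpr int32_t kColorFormatYUV420SemiPlanar = 21;          // NV12-style layout
constexpr int32_t kColorFormatYUV420Flexible   = 0x7F420888;  // device-chosen layout

// Map the reported format to a readable name so the matching
// cv::cvtColor conversion code can be chosen.
std::string describeColorFormat(int32_t fmt) {
    switch (fmt) {
        case kColorFormatYUV420Planar:     return "YUV420Planar (I420)";
        case kColorFormatYUV420SemiPlanar: return "YUV420SemiPlanar (NV12)";
        case kColorFormatYUV420Flexible:   return "YUV420Flexible (query the actual layout)";
        default:                           return "vendor-specific format";
    }
}

// On-device (NDK) the value comes from the codec's output format:
//   AMediaFormat* ofmt = AMediaCodec_getOutputFormat(m_data.codec);
//   int32_t fmt = 0;
//   AMediaFormat_getInt32(ofmt, AMEDIAFORMAT_KEY_COLOR_FORMAT, &fmt);
//   AMediaFormat_delete(ofmt);
```

Note that many devices report a vendor-specific value here, which is exactly the "free-for-all" described above.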

On Android you can get the hardware to do it for you by involving the GPU, which is required to accept whatever format MediaCodec decides to generate on that device. You latch each frame as an "external" texture by feeding it into a SurfaceTexture, and then tell GLES to render it in RGB on a pbuffer, which you can then access in different ways. For an example that uses glReadPixels(), see ExtractMpegFramesTest.
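In that pbuffer path, the GLES side is a few fixed calls, and the leftover CPU work is small: `glReadPixels()` returns rows bottom-up (GL's origin is the bottom-left corner), so the image usually needs a vertical flip on the way out. A hedged sketch, with the GL readback shown as a comment and a runnable flip helper:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// After rendering the external texture into the pbuffer, the readback is:
//   glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels.data());
// Swap rows in place to turn GL's bottom-up image into a top-down one.
void flipRowsInPlace(std::vector<uint8_t>& pixels, int width, int height, int channels) {
    const size_t rowBytes = static_cast<size_t>(width) * channels;
    std::vector<uint8_t> tmp(rowBytes);
    for (int top = 0, bot = height - 1; top < bot; ++top, --bot) {
        uint8_t* a = pixels.data() + top * rowBytes;
        uint8_t* b = pixels.data() + bot * rowBytes;
        std::memcpy(tmp.data(), a, rowBytes);
        std::memcpy(a, b, rowBytes);
        std::memcpy(b, tmp.data(), rowBytes);
    }
}
```

The readback is RGBA, so a channel swap is still needed to reach BGR, but both the flip and the swap are trivially cheap compared with a full YUV-to-RGB conversion on the CPU.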

There are alternatives to glReadPixels() that rely on private native methods to access the pbuffer (e.g. this and this), but using non-public APIs is unwise unless you have complete control of the device.

There may be some newer approaches for accessing the pbuffer that I haven't used, e.g. using native hardware buffers seems potentially useful (or maybe not?).
