Rendering YUV format in Android?


Problem description

I'm trying to stream video in Android through FFmpeg; the output I get after decoding is in YUV format. Is it possible to render the YUV image format directly on the Android screen?




Solution

Yes and no.

The output of the camera and hardware video decoders is generally YUV. Frames from these sources are generally sent directly to the display. They may be converted by the driver, typically with a hardware scaler and format converter. This is necessary for efficiency.

There isn't an API to allow an app to pass YUV frames around the same way. The basic problem is that "YUV" covers a lot of ground. The buffer format used by the video decoder may be a proprietary internal format that the various hardware modules can process efficiently; for your app to create a surface in this format, it would have to perform a conversion, and you're right back where you were performance-wise.

You should be able to use GLES2 shaders to do the conversion for you on the way to the display, but I don't have a pointer to code that demonstrates this.
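Since the answer only points at shader-based conversion without code, here is a minimal sketch of what such a conversion could look like. The fragment shader string, the texture names (`yTex`, `uTex`, `vTex`), and the BT.601 full-range coefficients are illustrative assumptions, not taken from any particular project; a real renderer would also upload each YUV plane as a single-channel texture and draw a full-screen quad with this shader.

```java
// A CPU sketch of the YUV -> RGB conversion a GLES2 fragment shader would
// perform per pixel. Shader string and BT.601 full-range coefficients are
// illustrative assumptions, not from any particular codebase.
public class YuvToRgb {

    // Hypothetical fragment shader: samples one-channel Y, U and V textures
    // and applies the same BT.601 math as yuvToRgb() below.
    static final String FRAGMENT_SHADER =
            "precision mediump float;\n"
            + "varying vec2 vTexCoord;\n"
            + "uniform sampler2D yTex;\n"
            + "uniform sampler2D uTex;\n"
            + "uniform sampler2D vTex;\n"
            + "void main() {\n"
            + "  float y = texture2D(yTex, vTexCoord).r;\n"
            + "  float u = texture2D(uTex, vTexCoord).r - 0.5;\n"
            + "  float v = texture2D(vTex, vTexCoord).r - 0.5;\n"
            + "  gl_FragColor = vec4(y + 1.402 * v,\n"
            + "                      y - 0.344 * u - 0.714 * v,\n"
            + "                      y + 1.772 * u,\n"
            + "                      1.0);\n"
            + "}\n";

    // BT.601 full-range conversion for one pixel; inputs are 0..255.
    static int[] yuvToRgb(int y, int u, int v) {
        float uf = u - 128f;
        float vf = v - 128f;
        int r = clamp(Math.round(y + 1.402f * vf));
        int g = clamp(Math.round(y - 0.344f * uf - 0.714f * vf));
        int b = clamp(Math.round(y + 1.772f * uf));
        return new int[] { r, g, b };
    }

    static int clamp(int c) {
        return Math.max(0, Math.min(255, c));
    }
}
```

Doing this math in the shader means the YUV planes only cross to the GPU once per frame, which avoids the per-pixel CPU conversion cost mentioned above.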

Update: an answer to this question has a link to a WebRTC source file that demonstrates doing the YUV conversion in a GLES2 shader.

