android ffmpeg opengl es render movie


Question

I am trying to render video via the NDK, to add some features that just aren't supported in the SDK. I am using FFmpeg to decode the video and can compile it via the NDK, using this as a starting point. I have modified that example so that instead of using glDrawTexiOES to draw the texture, I set up some vertices and render the texture on top of them (the OpenGL ES way of rendering a quad).

Below is what I am doing to render, but the glTexImage2D call is slow. I want to know if there is any way to speed this up, or to give the appearance of speeding it up, such as setting up some textures in the background and rendering pre-set-up textures. Or is there any other way to draw the video frames to the screen more quickly on Android? Currently I can only get about 12 fps.

glClear(GL_COLOR_BUFFER_BIT);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glBindTexture(GL_TEXTURE_2D, textureConverted);

// this is slow
glTexImage2D(GL_TEXTURE_2D,      /* target */
             0,                  /* level */
             GL_RGBA,            /* internal format */
             textureWidth,       /* width */
             textureHeight,      /* height */
             0,                  /* border */
             GL_RGBA,            /* format */
             GL_UNSIGNED_BYTE,   /* type */
             pFrameConverted->data[0]);

glTexCoordPointer(2, GL_FLOAT, 0, texCoords);
glVertexPointer(3, GL_FLOAT, 0, vertices);
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_BYTE, indices);
glDisableClientState(GL_VERTEX_ARRAY);
glDisableClientState(GL_TEXTURE_COORD_ARRAY);

EDIT I changed my code to initialize the texture with glTexImage2D only once and update it with glTexSubImage2D, but it didn't make much of an improvement to the framerate.

I then modified the code to modify a native Bitmap object via the NDK. With this approach I have a background thread that processes the next frames and populates the bitmap object on the native side. I think this has potential, but I need to speed up the conversion of the AVFrame object from FFmpeg into a native bitmap. Below is what I am currently using to convert, a brute-force approach. Is there any way to speed up or optimize this conversion?

static void fill_bitmap(AndroidBitmapInfo *info, void *pixels, AVFrame *pFrame)
{
    uint8_t *frameLine;

    int yy;
    for (yy = 0; yy < info->height; yy++) {
        uint8_t *line = (uint8_t *)pixels;
        frameLine = (uint8_t *)pFrame->data[0] + (yy * pFrame->linesize[0]);

        int xx;
        for (xx = 0; xx < info->width; xx++) {
            // copy one RGB24 source pixel into the RGBA8888 bitmap, byte by byte
            int out_offset = xx * 4;
            int in_offset = xx * 3;

            line[out_offset]     = frameLine[in_offset];
            line[out_offset + 1] = frameLine[in_offset + 1];
            line[out_offset + 2] = frameLine[in_offset + 2];
            line[out_offset + 3] = 0;   // alpha is left at 0 here
        }
        // advance to the next bitmap row using the bitmap's own stride
        pixels = (char *)pixels + info->stride;
    }
}
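One possible way to avoid the per-pixel loop, sketched under some assumptions: let libswscale do the RGB-to-RGBA conversion and the copy into the locked bitmap buffer in a single sws_scale call. The srcWidth, srcHeight and srcFormat parameters below are assumptions about the decoder setup (they are not names from the question), and on older FFmpeg builds the pixel format enum is PIX_FMT_RGBA rather than AV_PIX_FMT_RGBA.

#include <android/bitmap.h>
#include <libswscale/swscale.h>
#include <libavutil/pixfmt.h>

static struct SwsContext *swsCtx = NULL;

/* Convert a decoded frame straight into the RGBA_8888 bitmap buffer. */
static void fill_bitmap_sws(AndroidBitmapInfo *info, void *pixels, AVFrame *pFrame,
                            int srcWidth, int srcHeight, enum AVPixelFormat srcFormat)
{
    /* Create (or reuse) a scaler that converts directly to RGBA. */
    swsCtx = sws_getCachedContext(swsCtx,
                                  srcWidth, srcHeight, srcFormat,
                                  info->width, info->height, AV_PIX_FMT_RGBA,
                                  SWS_FAST_BILINEAR, NULL, NULL, NULL);

    uint8_t *dstData[4] = { (uint8_t *)pixels, NULL, NULL, NULL };
    int dstLinesize[4]  = { (int)info->stride, 0, 0, 0 };

    /* One call converts and copies the whole frame, honoring the bitmap stride. */
    sws_scale(swsCtx, (const uint8_t * const *)pFrame->data, pFrame->linesize,
              0, srcHeight, dstData, dstLinesize);
}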

Answer

Yes, texture (and buffer, and shader, and framebuffer) creation is slow.

That's why you should create texture only once. After it is created, you can modify its data by calling glSubTexImage2D.

And to make uploading texture data faster, create two textures. While you use one for display, upload texture data from ffmpeg to the second one. When you display the second one, upload data to the first one, and repeat from the beginning.
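A rough sketch of that ping-pong scheme; the textures array, current index and render_frame function are illustrative names rather than anything from the question, and both textures are assumed to have been allocated once with glTexImage2D as shown above:

GLuint textures[2];
int current = 0;

void render_frame(AVFrame *pFrameConverted)
{
    int next = 1 - current;

    // Upload the new frame into the texture that is NOT currently displayed.
    glBindTexture(GL_TEXTURE_2D, textures[next]);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, textureWidth, textureHeight,
                    GL_RGBA, GL_UNSIGNED_BYTE, pFrameConverted->data[0]);

    // Draw with the texture uploaded during the previous frame
    // (vertex/texcoord setup from the question is assumed to be in place).
    glBindTexture(GL_TEXTURE_2D, textures[current]);
    glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_BYTE, indices);

    current = next;
}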

I think it will still not be very fast. You could try the jnigraphics library, which allows you to access Bitmap object pixels from the NDK. After that, you just display this Bitmap on screen on the Java side.
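A hedged sketch of the jnigraphics route, assuming the Bitmap is created as ARGB_8888 on the Java side and passed down as a jobject; render_to_bitmap is an illustrative name, and fill_bitmap is the conversion function from the question:

#include <jni.h>
#include <android/bitmap.h>

static void render_to_bitmap(JNIEnv *env, jobject bitmap, AVFrame *pFrame)
{
    AndroidBitmapInfo info;
    void *pixels;

    if (AndroidBitmap_getInfo(env, bitmap, &info) < 0)
        return;
    if (AndroidBitmap_lockPixels(env, bitmap, &pixels) < 0)
        return;

    // Write the decoded frame straight into the Bitmap's pixel buffer.
    fill_bitmap(&info, pixels, pFrame);

    AndroidBitmap_unlockPixels(env, bitmap);
    // On the Java side, invalidate the View / draw the Bitmap to show the frame.
}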
