SurfaceTexture's onFrameAvailable() method always called too late


Problem description

I'm trying to get the following MediaExtractor example to work:

http://bigflake.com/mediacodec/ - ExtractMpegFramesTest.java (requires 4.1, API 16)

The problem I have is that outputSurface.awaitNewImage(); seems to always throw RuntimeException("frame wait timed out"), which is thrown whenever the mFrameSyncObject.wait(TIMEOUT_MS) call times out. No matter what I set TIMEOUT_MS to be, onFrameAvailable() always gets called right after the timeout occurs. I tried with 50ms and with 30000ms and it's the same.

It seems like the onFrameAvailable() call can't be processed while the thread is busy, and only once the timeout ends the thread's waiting does it get around to handling the onFrameAvailable() call.

Has anyone managed to get this example to work, or knows how MediaExtractor is supposed to work with GL textures?

Tried this on devices with Android 4.4 and 4.1.1, and the same happens on both.

Edit 2:

Got it working on 4.4 thanks to fadden. The issue was that the ExtractMpegFramesWrapper.runTest() method called th.join(); which blocked the main thread and prevented the onFrameAvailable() call from being processed. Once I commented th.join(); it works on 4.4. I guess maybe the ExtractMpegFramesWrapper.runTest() itself was supposed to run on yet another thread so the main thread didn't get blocked.
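A minimal sketch of that fix (illustrative only; testExtractMpegFrames() is the test's public entry point, everything else here is assumed): run the whole test from a worker thread, so the th.join() inside it blocks the worker instead of the main Looper:

```java
// Hypothetical sketch: kick off the extraction off the main thread, so
// the main Looper stays free to deliver onFrameAvailable() callbacks.
new Thread(new Runnable() {
    @Override public void run() {
        try {
            new ExtractMpegFramesTest().testExtractMpegFrames();
        } catch (Throwable t) {
            t.printStackTrace();
        }
    }
}, "ExtractMpegFramesRunner").start();
// Note: no th.join() on the main thread -- that join is exactly what
// blocked the Looper and starved onFrameAvailable() in the first place.
```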

There was also a small issue on 4.1.2 when calling codec.configure(); it gave this error:

A/ACodec(2566): frameworks/av/media/libstagefright/ACodec.cpp:1041 CHECK(def.nBufferSize >= size) failed.
A/libc(2566): Fatal signal 11 (SIGSEGV) at 0xdeadbaad (code=1), thread 2625 (CodecLooper)

Which I solved by adding the following before the call:

format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 0);
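For context, in the bigflake test this workaround would sit where the track format is obtained, before configure() (a hedged sketch; extractor, trackIndex, and outputSurface are assumed from ExtractMpegFramesTest.java):

```java
MediaFormat format = extractor.getTrackFormat(trackIndex);

// Workaround for the 4.1.2 CHECK(def.nBufferSize >= size) abort:
// a value of 0 lets the codec pick its own input buffer size.
format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 0);

String mime = format.getString(MediaFormat.KEY_MIME);
MediaCodec decoder = MediaCodec.createDecoderByType(mime);
decoder.configure(format, outputSurface.getSurface(), null, 0);
decoder.start();
```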

However the problem I have now on both 4.1.1 (Galaxy S2 GT-I9100) and 4.1.2 (Samsung Galaxy Tab GT-P3110) is that they both always set info.size to 0 for all frames. Here is the log output:

loop
input buffer not available
no output from decoder available
loop
input buffer not available
no output from decoder available
loop
input buffer not available
no output from decoder available
loop
input buffer not available
no output from decoder available
loop
submitted frame 0 to dec, size=20562
no output from decoder available
loop
submitted frame 1 to dec, size=7193
no output from decoder available
loop
[... skipped 18 lines ...]
submitted frame 8 to dec, size=6531
no output from decoder available
loop
submitted frame 9 to dec, size=5639
decoder output format changed: {height=240, what=1869968451, color-format=19, slice-height=240, crop-left=0, width=320, crop-bottom=239, crop-top=0, mime=video/raw, stride=320, crop-right=319}
loop
submitted frame 10 to dec, size=6272
surface decoder given buffer 0 (size=0)
loop
[... skipped 1211 lines ...]
submitted frame 409 to dec, size=456
surface decoder given buffer 1 (size=0)
loop
sent input EOS
surface decoder given buffer 0 (size=0)
loop
surface decoder given buffer 1 (size=0)
loop
surface decoder given buffer 0 (size=0)
loop
surface decoder given buffer 1 (size=0)
loop
[... skipped 27 lines all with size=0 ...]
surface decoder given buffer 1 (size=0)
loop
surface decoder given buffer 0 (size=0)
output EOS
Saving 0 frames took ? us per frame // edited to avoid division-by-zero error

So no images get saved. However, the same code and video work on 4.3. The video I am using is an .mp4 file with "H264 - MPEG-4 AVC (avc1)" video codec and "MPEG AAC Audio (mp4a)" audio codec.

I also tried other video formats, but they seem to die even sooner on 4.1.x, while both work on 4.3.

Edit 3:

I did as you suggested, and it seems to save the frame images correctly. Thank you.

Regarding KEY_MAX_INPUT_SIZE, I tried not setting it, or setting it to 0, 20, 200, ..., 200000000, all with the same result of info.size=0.

I am now unable to get it to render to a SurfaceView or TextureView in my layout. I tried replacing this line:

mSurfaceTexture = new SurfaceTexture(mTextureRender.getTextureId());

with this, where textureView is a TextureView defined in my XML layout:

mSurfaceTexture = textureView.getSurfaceTexture();
mSurfaceTexture.attachToGLContext(mTextureRender.getTextureId());

but it throws a weird error with getMessage()==null on the second line. I couldn't find any other way to get it to draw on a View of some kind. How can I change the decoder to display the frames on a Surface/SurfaceView/TextureView instead of saving them?

Answer

The way SurfaceTexture works makes this a bit tricky to get right.

The docs say the frame-available callback "is called on an arbitrary thread". The SurfaceTexture class has a bit of code that does the following when initializing (line 318):

if (this thread has a looper) {
    handle events on this thread
} else if (there's a "main" looper) {
    handle events on the main UI thread
} else {
    no events for you
}

The frame-available events are delivered to your app through the usual Looper / Handler mechanism. That mechanism is just a message queue, which means the thread needs to be sitting in the Looper event loop waiting for them to arrive. The trouble is, if you're sleeping in awaitNewImage(), you're not watching the Looper queue. So the event arrives, but nobody sees it. Eventually awaitNewImage() times out, and the thread returns to watching the event queue, where it immediately discovers the pending "new frame" message.
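That starvation can be modeled in plain Java, with no Android classes (an illustrative sketch, not the actual SurfaceTexture internals): a single-threaded executor stands in for the Looper's message queue, and the "frame available" callback is posted to the same queue the waiter is blocking.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class LooperStarvationDemo {

    /** Returns true only if the waiter saw the frame before its timeout. */
    public static boolean awaitWithQueuedCallback() throws Exception {
        // A single-threaded executor stands in for a Looper's message queue.
        ExecutorService looper = Executors.newSingleThreadExecutor();
        final Object frameSync = new Object();
        final boolean[] frameAvailable = {false};

        // "awaitNewImage()": occupies the one queue thread while it waits.
        Future<Boolean> waiter = looper.submit(() -> {
            synchronized (frameSync) {
                if (!frameAvailable[0]) {
                    frameSync.wait(500); // TIMEOUT_MS
                }
                return frameAvailable[0]; // false => "frame wait timed out"
            }
        });

        // "onFrameAvailable()": posted to the SAME queue, so it cannot run
        // until the waiter gives the thread back by timing out.
        looper.submit(() -> {
            synchronized (frameSync) {
                frameAvailable[0] = true;
                frameSync.notifyAll();
            }
        });

        boolean sawFrame = waiter.get();
        looper.shutdown();
        return sawFrame;
    }

    public static void main(String[] args) throws Exception {
        // Always prints false: the notification is stuck behind the wait.
        System.out.println("frame seen before timeout: " + awaitWithQueuedCallback());
    }
}
```

The timeout length is irrelevant, which matches the symptom in the question: the notification is queued behind the waiter, so it is processed the instant the wait gives up, never before.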

So the trick is to make sure that frame-available events arrive on a different thread from the one sitting in awaitNewImage(). In the ExtractMpegFramesTest example, this is done by running the test in a newly-created thread (see the ExtractMpegFramesWrapper class), which does not have a Looper. (For some reason the thread that executes CTS tests has a looper.) The frame-available events arrive on the main UI thread.
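The working arrangement, again as a plain-Java model (illustrative only): the wait happens on one thread while the notification arrives from another, the way frame-available events landing on the main UI thread can wake a waiter on the test thread.

```java
public class SeparateThreadDemo {

    /** The waiter sleeps on the monitor while the "frame available"
     *  notification arrives from a different thread, so the wait can
     *  actually be woken before the timeout. */
    public static boolean awaitWithSeparateCallbackThread() throws Exception {
        final Object frameSync = new Object();
        final boolean[] frameAvailable = {false};

        // Stand-in for onFrameAvailable() arriving on the main UI thread.
        Thread callback = new Thread(() -> {
            synchronized (frameSync) {
                frameAvailable[0] = true;
                frameSync.notifyAll();
            }
        });

        synchronized (frameSync) {
            callback.start(); // can't enter the monitor until we wait()
            while (!frameAvailable[0]) {
                frameSync.wait(500); // wakes as soon as the callback runs
            }
        }
        callback.join();
        return frameAvailable[0];
    }

    public static void main(String[] args) throws Exception {
        System.out.println("frame seen before timeout: " + awaitWithSeparateCallbackThread());
    }
}
```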

Update (for "edit 3"): I'm a bit sad that ignoring the "size" field helped, but pre-4.3 it's hard to predict how devices will behave.

If you just want to display the frame, pass the Surface you get from the SurfaceView or TextureView into the MediaCodec decoder configure() call. Then you don't have to mess with SurfaceTexture at all -- the frames will be displayed as you decode them. See the two "Play video" activities in Grafika for examples.
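A hedged sketch of that approach (surfaceView, extractor, trackIndex, info, and TIMEOUT_USEC are assumed; SurfaceHolder callback timing and error handling are omitted): hand the view's Surface to the decoder at configure() time, and frames render directly as output buffers are released.

```java
// Decode straight to a SurfaceView: no SurfaceTexture, no GL code.
Surface surface = surfaceView.getHolder().getSurface(); // valid only after surfaceCreated()
MediaFormat format = extractor.getTrackFormat(trackIndex);
MediaCodec decoder = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
decoder.configure(format, surface, null, 0);
decoder.start();

// ... feed input buffers as usual; then for each output buffer:
int outIndex = decoder.dequeueOutputBuffer(info, TIMEOUT_USEC);
if (outIndex >= 0) {
    // "true" renders this buffer to the Surface passed to configure().
    decoder.releaseOutputBuffer(outIndex, true);
}
```

For a TextureView, the same idea works with a Surface built from its SurfaceTexture, e.g. new Surface(textureView.getSurfaceTexture()).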

If you really want to go through a SurfaceTexture, you need to change CodecOutputSurface to render to a window surface rather than a pbuffer. (The off-screen rendering is done so we can use glReadPixels() in a headless test.)
