SurfaceTexture's onFrameAvailable() method always called too late

Problem description

I'm trying to get the following MediaExtractor example to work:

http://bigflake.com/mediacodec/ - ExtractMpegFramesTest.java (requires 4.1, API 16)

The problem I have is that outputSurface.awaitNewImage(); seems to always throw RuntimeException("frame wait timed out"), which is thrown whenever the mFrameSyncObject.wait(TIMEOUT_MS) call times out. No matter what I set TIMEOUT_MS to be, onFrameAvailable() always gets called right after the timeout occurs. I tried with 50ms and with 30000ms and it's the same.

It seems like the onFrameAvailable() call can't be made while the thread is busy, and only once the timeout ends the thread's code execution can the onFrameAvailable() call be processed.

Has anyone managed to get this example to work, or know how MediaExtractor is supposed to work with GL textures?

EDIT: Tried this on devices running 4.4 and 4.1.1, and the same happens on both.

编辑2:

Got it working on 4.4 thanks to fadden. The issue was that the ExtractMpegFramesWrapper.runTest() method called th.join();, which blocked the main thread and prevented the onFrameAvailable() call from being processed. Once I commented out th.join(); it works on 4.4. I guess maybe ExtractMpegFramesWrapper.runTest() itself was supposed to run on yet another thread so the main thread didn't get blocked.
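For illustration, a minimal sketch of that idea, assuming the code runs inside an Activity; extractMpegFrames() and onExtractionDone() are hypothetical placeholder names for your own entry point and completion callback, not part of the original test:

private void startExtraction() {
    Thread worker = new Thread(new Runnable() {
        @Override
        public void run() {
            try {
                extractMpegFrames();   // the long-running decode loop
            } catch (Throwable t) {
                Log.e("ExtractMpeg", "extraction failed", t);
            }
            // Report completion via the UI thread instead of having the caller
            // block in join(), which would starve the main Looper.
            runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    onExtractionDone();
                }
            });
        }
    }, "ExtractMpegFrames");
    worker.start();
}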

There was also a small issue on 4.1.2 when calling codec.configure(); it gave the error:

A/ACodec(2566): frameworks/av/media/libstagefright/ACodec.cpp:1041 CHECK(def.nBufferSize >= size) failed.
A/libc(2566): Fatal signal 11 (SIGSEGV) at 0xdeadbaad (code=1), thread 2625 (CodecLooper)

Which I solved by adding the following before the call:

format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 0);
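For context, a hedged sketch of where that override sits relative to configure(); extractor, trackIndex and outputSurface stand in for the corresponding objects in the bigflake example:

MediaFormat format = extractor.getTrackFormat(trackIndex);
// On some 4.1.x devices the advertised buffer size trips a CHECK() inside
// ACodec; clearing the hint lets the codec pick its own size.
format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 0);
MediaCodec decoder = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
decoder.configure(format, outputSurface.getSurface(), null, 0);
decoder.start();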

However the problem I have now on both 4.1.1 (Galaxy S2 GT-I9100) and 4.1.2 (Samsung Galaxy Tab GT-P3110) is that they both always set info.size to 0 for all frames. Here is the log output:

loop
input buffer not available
no output from decoder available
loop
input buffer not available
no output from decoder available
loop
input buffer not available
no output from decoder available
loop
input buffer not available
no output from decoder available
loop
submitted frame 0 to dec, size=20562
no output from decoder available
loop
submitted frame 1 to dec, size=7193
no output from decoder available
loop
[... skipped 18 lines ...]
submitted frame 8 to dec, size=6531
no output from decoder available
loop
submitted frame 9 to dec, size=5639
decoder output format changed: {height=240, what=1869968451, color-format=19, slice-height=240, crop-left=0, width=320, crop-bottom=239, crop-top=0, mime=video/raw, stride=320, crop-right=319}
loop
submitted frame 10 to dec, size=6272
surface decoder given buffer 0 (size=0)
loop
[... skipped 1211 lines ...]
submitted frame 409 to dec, size=456
surface decoder given buffer 1 (size=0)
loop
sent input EOS
surface decoder given buffer 0 (size=0)
loop
surface decoder given buffer 1 (size=0)
loop
surface decoder given buffer 0 (size=0)
loop
surface decoder given buffer 1 (size=0)
loop
[... skipped 27 lines all with size=0 ...]
surface decoder given buffer 1 (size=0)
loop
surface decoder given buffer 0 (size=0)
output EOS
Saving 0 frames took ? us per frame // edited to avoid division-by-zero error

So no images get saved. However, the same code and video work on 4.3. The video I am using is an .mp4 file with "H264 - MPEG-4 AVC (avc1)" video codec and "MPEG AAC Audio (mp4a)" audio codec.

I also tried other video formats, but they seem to die even sooner on 4.1.x, while both work on 4.3.

EDIT 3:

I did as you suggested, and it seems to save the frame images correctly. Thank you.

Regarding KEY_MAX_INPUT_SIZE, I tried not setting it, or setting it to 0, 20, 200, ... 200000000, all with the same result of info.size=0.

I am now unable to get it to render to a SurfaceView or TextureView in my layout. I tried replacing this line:

mSurfaceTexture = new SurfaceTexture(mTextureRender.getTextureId());

with this, where surfaceTexture is a SurfaceTexture defined in my xml-layout:

mSurfaceTexture = textureView.getSurfaceTexture();
mSurfaceTexture.attachToGLContext(mTextureRender.getTextureId());

but it throws a weird error with getMessage()==null on the second line. I couldn't find any other way to get it to draw on a View of some kind. How can I change the decoder to display the frames on a Surface/SurfaceView/TextureView instead of saving them?

Answer

The way SurfaceTexture works makes this a bit tricky to get right.

The docs say the frame-available callback "is called on an arbitrary thread". The SurfaceTexture class has a bit of code that does the following when initializing (line 318 of https://android.googlesource.com/platform/frameworks/base/+/kitkat-release/graphics/java/android/graphics/SurfaceTexture.java):

if (this thread has a looper) {
    handle events on this thread
} else if (there's a "main" looper) {
    handle events on the main UI thread
} else {
    no events for you
}

The frame-available events are delivered to your app through the usual Looper / Handler mechanism. That mechanism is just a message queue, which means the thread needs to be sitting in the Looper event loop waiting for them to arrive. The trouble is, if you're sleeping in awaitNewImage(), you're not watching the Looper queue. So the event arrives, but nobody sees it. Eventually awaitNewImage() times out, and the thread returns to watching the event queue, where it immediately discovers the pending "new frame" message.
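Abridged, this is roughly the wait/notify pair involved (simplified from the example's CodecOutputSurface, which implements SurfaceTexture.OnFrameAvailableListener):

private final Object mFrameSyncObject = new Object();
private boolean mFrameAvailable;

public void awaitNewImage() {
    final int TIMEOUT_MS = 2500;
    synchronized (mFrameSyncObject) {
        while (!mFrameAvailable) {
            try {
                // Blocks this thread; if onFrameAvailable() is queued on this
                // same thread's Looper, the notify below can never happen.
                mFrameSyncObject.wait(TIMEOUT_MS);
                if (!mFrameAvailable) {
                    throw new RuntimeException("frame wait timed out");
                }
            } catch (InterruptedException ie) {
                throw new RuntimeException(ie);
            }
        }
        mFrameAvailable = false;
    }
}

@Override
public void onFrameAvailable(SurfaceTexture st) {
    // Runs on whichever thread's Looper SurfaceTexture picked at creation time.
    synchronized (mFrameSyncObject) {
        mFrameAvailable = true;
        mFrameSyncObject.notifyAll();
    }
}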

So the trick is to make sure that frame-available events arrive on a different thread from the one sitting in awaitNewImage(). In the ExtractMpegFramesTest example, this is done by running the test in a newly-created thread (see the ExtractMpegFramesWrapper class), which does not have a Looper. (For some reason the thread that executes CTS tests has a looper.) The frame-available events arrive on the main UI thread.

Update (for "edit 3"): I'm a bit sad that ignoring the "size" field helped, but pre-4.3 it's hard to predict how devices will behave.

If you just want to display the frame, pass the Surface you get from the SurfaceView or TextureView into the MediaCodec decoder configure() call. Then you don't have to mess with SurfaceTexture at all -- the frames will be displayed as you decode them. See the two "Play video" activities in Grafika for examples.
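A minimal sketch of that route, assuming surfaceView is a SurfaceView whose surface is already valid (wait for surfaceCreated() before running this) and extractor is positioned on the video track:

MediaFormat format = extractor.getTrackFormat(videoTrackIndex);
MediaCodec decoder = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
// Decode straight to the screen: no SurfaceTexture, no GL code needed.
decoder.configure(format, surfaceView.getHolder().getSurface(), null, 0);
decoder.start();
// ...feed input buffers as before; when releasing an output buffer, pass
// 'true' so the frame is rendered to the surface instead of being dropped:
decoder.releaseOutputBuffer(outputBufferIndex, true /* render */);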

If you really want to go through a SurfaceTexture, you need to change CodecOutputSurface to render to a window surface rather than a pbuffer. (The off-screen rendering is done so we can use glReadPixels() in a headless test.)
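Roughly, that EGL change might look like the sketch below (shown with the EGL14 API for brevity; the API 16 variant of the test uses EGL10, which has an equivalent eglCreateWindowSurface() call). mEGLDisplay and configs[0] are assumed to come from the existing eglSetup() code, surface is the android.view.Surface to render into, and the eglChooseConfig() attributes would also need EGL_WINDOW_BIT instead of EGL_PBUFFER_BIT:

// Off-screen pbuffer (current code, so glReadPixels() works headlessly):
// int[] pbAttribs = { EGL14.EGL_WIDTH, width, EGL14.EGL_HEIGHT, height, EGL14.EGL_NONE };
// mEGLSurface = EGL14.eglCreatePbufferSurface(mEGLDisplay, configs[0], pbAttribs, 0);

// On-screen window surface instead:
int[] surfaceAttribs = { EGL14.EGL_NONE };
mEGLSurface = EGL14.eglCreateWindowSurface(mEGLDisplay, configs[0], surface, surfaceAttribs, 0);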
