GMFBridge DirectShow filter SetLiveTiming effect


Problem Description


I am using the excellent GMFBridge DirectShow family of filters to great effect, allowing me to close a video recording graph and open a new one, with no data loss.

My original source graph was capturing live video from standard video and audio inputs.

There is an undocumented method on the GMFBridgeController filter named SetLiveTiming(). From the name, I figured it should be set to true if, as in my case, we are capturing from a live graph (not from a file). I set this value to true and everything worked as expected.
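For context, enabling the option looks roughly like this. This is only a minimal C++ sketch, assuming the headers generated when building the GMFBridge project (GMFBridge.h plus the CLSID/IID definitions) are available; the AddStream arguments are placeholders for whatever streams the bridge actually declares.

// Minimal sketch (not my actual code): create the GMFBridge controller,
// declare one video and one audio stream, and enable live timing.
// CLSID_GMFBridgeController / IGMFBridgeController are assumed to come from
// the headers generated from GMFBridge.idl; COM is assumed to be initialized
// on this thread, and error handling is reduced to bare HRESULT checks.
#include <windows.h>
#include <atlbase.h>
#include "GMFBridge.h"   // generated from the GMFBridge project (assumption)

HRESULT CreateLiveBridgeController(CComPtr<IGMFBridgeController>& pController)
{
    HRESULT hr = pController.CoCreateInstance(CLSID_GMFBridgeController);
    if (FAILED(hr))
        return hr;

    // One decompressed video stream and one decompressed audio stream across
    // the bridge (the exact AddStream arguments depend on the formats bridged).
    hr = pController->AddStream(TRUE, eUncompressed, FALSE);      // video
    if (SUCCEEDED(hr))
        hr = pController->AddStream(FALSE, eUncompressed, FALSE); // audio

    // The setting in question: share a common clock between the capture and
    // muxing graphs instead of expecting timestamps to restart at zero.
    if (SUCCEEDED(hr))
        hr = pController->SetLiveTiming(TRUE);

    return hr;
}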

The same capture hardware allows me to capture live TV signals (ATSC in my case), so I created a new version of the graph using the BDA architecture filters, for tuning purposes. Once the data flows out from the MPEG demuxer, the rest of the graph is virtually the same as my original graph.

However, on this occasion my muxing graph (on the other side of the bridge) was not working. Data flowed from the BridgeSource filter (video and audio) and reached an MP4 muxer filter; however, no data flowed from the muxer output to the FileWriter filter.

After several hours I traced the problem to the SetLiveTiming() setting. I turned it off, everything began working as expected, and the muxer filter began producing an output file; however, the audio was not synchronized to the video.

Can someone enlighten me on the real purpose of the SetLiveTiming() setting and, perhaps, why one graph works with the setting enabled while the other fails?

UPDATE

I managed to compile the GMFBridge project, and it seems that the filter is dropping every received sample because of a negative timestamp computation. However, I am completely baffled by the results I am seeing after enabling the filter log.

UPDATE 2: The dropped samples were introduced by the way I launched the secondary (muxer) graph. I inspected a sample using a SampleGrabber (thus inside a streaming thread) as a trigger point and used a Task.Run() .NET call to instantiate the muxer graph. This somehow messed up the clocks and I ended up with a 'reference start point' in the future - when the bridge attempted to fix the timestamp by subtracting the reference start point, it produced a negative timestamp. Once I corrected this and spawned the graph from the application thread (by posting a graph event), the problem was fixed.
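A minimal C++ sketch of that fix, under these assumptions: EC_START_MUXER and BuildAndRunMuxerGraph are hypothetical names (not DirectShow or GMFBridge APIs), and the application thread already runs a loop that services graph events.

// Sketch only: instead of building the muxer graph inside the SampleGrabber
// callback (a streaming thread), post a user event from that thread and build
// the graph on the application thread that services graph events.
#include <dshow.h>
#include <evcode.h>    // EC_USER
#include <atlbase.h>

static const long EC_START_MUXER = EC_USER + 1;   // hypothetical custom code

// Called from the streaming thread (e.g. inside ISampleGrabberCB::BufferCB):
void OnTriggerSample(IGraphBuilder* pSourceGraph)
{
    CComQIPtr<IMediaEventSink> pSink(pSourceGraph);
    if (pSink)
        pSink->Notify(EC_START_MUXER, 0, 0);      // cheap, non-blocking
}

// Application thread: the normal graph event loop (drains queued events).
void PumpGraphEvents(IMediaEvent* pEvent)
{
    long code;
    LONG_PTR p1, p2;
    while (SUCCEEDED(pEvent->GetEvent(&code, &p1, &p2, 0)))
    {
        if (code == EC_START_MUXER)
        {
            // Build and run the muxer graph here, on the app thread, so its
            // reference start time is not derived from a worker thread.
            // BuildAndRunMuxerGraph();   // hypothetical helper
        }
        pEvent->FreeEventParams(code, p1, p2);
    }
}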

Unfortunately, my multiplexed video (regardless of the SetLiveTiming() setting) is still out of sync.

I read that the GMFBridge filter can have trouble when the InfTee filter is being used; however, I think my graph shouldn't have this problem, as no instance of the InfTee filter is directly connected to the bridge sink.

Here is my current source graph:

                                                                   -->[TIF]
                                                                  |
 [NetworkProvider]-->[DigitalTuner]-->[DigitalCapture]-->[demux]--|-->[Mpeg Tables]
                                                                  |
                                                                  |-->[lavAudioDec]-->[tee]-->[audioConvert]-->[sampleGrabber]-->[NULL]
                                                                  |                        |
                                                                  |                        |
                                                                  |                         ->[aacEncoder]----------------
                                                                  |                                                       |--->[*Bridge Sink*]
                                                                   -->[VideoDecoder]-->[sampleGrabber]-->[x264Enc]--------

Here is my muxer graph:

                      video  
 ...  |bridge source|-------->[MP4 muxer]--->[fileWriter]
             |                     ^
             |        audio        |
              ---------------------

All the sample grabbers in the graph are read-only. If I mux the output file without bridging (by placing the muxer on the capture graph), the output file remains in sync (this turned out not to be true; the out-of-sync problem was introduced by a latency setting in the H264 encoder), but then I can't avoid losing a few seconds between releasing the current capture graph and running the new one (with the updated file name).

UPDATE 3:

The out-of-sync problem was inadvertently introduced by me several days ago, when I switched off a "Zero-latency" setting in the x264vfw encoder. I hadn't noticed that this setting had desynchronized my already-working graphs too, and I was blaming the bridge filter.

In summary, I screwed things up by:

  1. Launching the muxer graph from a thread other than the Application thread (the thread processing the graph's event loop).

  2. A latency switch in an upstream filter that was probably delaying things too much for the muxer to be able to keep up (sketched below).
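For reference, x264vfw's "Zero-latency" checkbox appears to map to x264's zerolatency tune, which removes the frame-level lookahead and B-frame reordering responsible for that kind of delay. The sketch below shows what that tune means at the x264 C API level; it is illustrative only (not x264vfw's code), and the preset/profile strings are arbitrary choices.

// Illustrative only: how a zero-latency tune is applied through the x264 C API.
#include <stdint.h>
#include <x264.h>

x264_t* open_low_latency_encoder(int width, int height, int fps)
{
    x264_param_t param;

    // "zerolatency" disables frame-level lookahead and B-frames, so encoded
    // frames come out without multi-frame buffering.
    if (x264_param_default_preset(&param, "veryfast", "zerolatency") < 0)
        return nullptr;

    param.i_width   = width;
    param.i_height  = height;
    param.i_fps_num = fps;
    param.i_fps_den = 1;
    param.i_bframe  = 0;               // no B-frames -> no reordering delay

    if (x264_param_apply_profile(&param, "main") < 0)
        return nullptr;

    return x264_encoder_open(&param);  // returns nullptr on failure
}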

Solution

Author's comment:

// using this option, you can share a common clock 
// and avoid any time mapping (essential if audio is in mux graph)
[id(13), helpstring("Live Timing option")]
HRESULT SetLiveTiming([in] BOOL bIsLiveTiming);

The method enables a special mode of operation which addresses live data. In this mode sample times are converted between the graphs as relative to respective clock start times. Otherwise, the default mode is to expect reset of time stamps to zero with graph changes.
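A simplified illustration of that conversion (not GMFBridge's actual code): with live timing enabled, a sample stamped in the source graph is rebased onto the render graph's timeline using the two graphs' stream-start times on the shared clock. Names and the exact arithmetic are assumptions made for illustration.

// Simplified illustration of the live-timing conversion, not GMFBridge code.
// Times are 100 ns REFERENCE_TIME units on a clock shared by both graphs
// (which is what SetLiveTiming(TRUE) makes possible).
#include <dshow.h>

REFERENCE_TIME RebaseForRenderGraph(
    REFERENCE_TIME sampleTime,       // as stamped in the source graph
    REFERENCE_TIME sourceGraphStart, // source graph's stream-start time
    REFERENCE_TIME renderGraphStart) // render graph's stream-start time
{
    // Re-express the sample relative to the render graph's start. If the
    // render graph's start time lies in the future (as in UPDATE 2 above),
    // this goes negative and the sample gets dropped.
    return sampleTime + sourceGraphStart - renderGraphStart;
}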
