GStreamer video streaming pipeline with delay


Problem Description

Is it possible to introduce some delay before the demuxed, h264-decoded output is sent to autovideosink in a GStreamer pipeline? If so, can anybody post a sample pipeline to do that? The pipeline I used is:

udpsrc port=5000 ! mpegtsdemux name=demux ! queue ! ffdec_h264 ! ffmpegcolorspace ! autovideosink demux. ! queue ! ffdec_mp3 ! audioconvert ! alsasink demux

In this case, once the stream is received on UDP port 5000 it starts playing immediately after demuxing, queuing, and decoding. Is there any possibility of adding a delay, say 60 seconds, before sending it to autovideosink where it is actually played? Is there any GStreamer plugin/element to do that?

Answer

You might want to look at queue's parameters (run gst-inspect queue):

max-size-buffers    : Max. number of buffers in the queue (0=disable)
                      flags: readable, writable
                      Unsigned Integer. Range: 0 - 4294967295 Default: 200
max-size-bytes      : Max. amount of data in the queue (bytes, 0=disable)
                      flags: readable, writable
                      Unsigned Integer. Range: 0 - 4294967295 Default: 10485760
max-size-time       : Max. amount of data in the queue (in ns, 0=disable)
                      flags: readable, writable
                      Unsigned Integer64. Range: 0 - 18446744073709551615 Default: 1000000000
min-threshold-buffers: Min. number of buffers in the queue to allow reading (0=disable)
                      flags: readable, writable
                      Unsigned Integer. Range: 0 - 4294967295 Default: 0
min-threshold-bytes : Min. amount of data in the queue to allow reading (bytes, 0=disable)
                      flags: readable, writable
                      Unsigned Integer. Range: 0 - 4294967295 Default: 0
min-threshold-time  : Min. amount of data in the queue to allow reading (in ns, 0=disable)
                      flags: readable, writable
                      Unsigned Integer64. Range: 0 - 18446744073709551615 Default: 0

By setting min-threshold-time you can delay the output by n nanoseconds.
I've just tried that out with my webcam and it worked (60-second delay):

gst-launch v4l2src ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 min-threshold-time=60000000000 ! autovideosink

Note that I've set the max-size-* parameters to 0, because if the queue fills up before the threshold is met, you won't get any data out of the queue.

And keep in mind that queueing a decoded video stream can result in huge memory usage. With your udpsrc pipeline I'd recommend delaying the encoded h264 stream instead. You might need to set the threshold in bytes rather than in nanoseconds (I don't think the queue knows enough about the encoded data to guess the bitrate).
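
As a rough sketch of that variant, applied to the pipeline from the question: the queue with a byte threshold sits on the demuxed h264 branch, before the decoder. The min-threshold-bytes value here is only an assumption (about 60 seconds at roughly 500 kbit/s); it would need tuning to your actual bitrate, and the audio branch would need a matching delay to stay in sync.

# min-threshold-bytes=3750000 assumes ~500 kbit/s h264; adjust to your stream's bitrate
gst-launch udpsrc port=5000 ! mpegtsdemux name=demux ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 min-threshold-bytes=3750000 ! ffdec_h264 ! ffmpegcolorspace ! autovideosink demux. ! queue ! ffdec_mp3 ! audioconvert ! alsasink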

