gstreamer: write both video and audio streams into a single MP4 container after concat


Question


Good day,

I have two mp4 files (a.mp4 and b.mp4), each of which contains video and audio streams, and I need to concatenate them into a single mp4 container (c.mp4) using gstreamer (this question is a follow-up to the previous one).

In other words, the following pipeline concatenates the content of a.mp4 and b.mp4 and then outputs the result into autovideosink and alsasink:

GST_DEBUG=3 gst-launch-1.0 \
  concat name=c2 ! videoconvert ! videorate ! autovideosink \
  concat name=c ! audioconvert ! audiorate ! alsasink \
  filesrc location=a.mp4 ! decodebin name=d1 ! audio/x-raw ! queue ! c. \
  filesrc location=b.mp4 ! decodebin name=d2 ! audio/x-raw ! queue ! c. \
  d1. ! video/x-raw ! queue ! c2. \
  d2. ! video/x-raw ! queue ! c2.

Works like a charm! But instead of autovideosink and alsasink, I need to re-encode, mux the concatenated video and audio streams, and write them into a single container (i.e. there should be a single "filesink location=c.mp4" in the pipeline, if I understand it correctly) - this way I would get the content of a.mp4 followed by the content of b.mp4 (a.mp4 + b.mp4 = c.mp4).

Could someone please share a pipeline which demonstrates how to do this?

Solution

Ok, at least you mentioned filesink.. but you should always post what you have (some not-yet-working pipeline). Anyway, here is the magic pipe:

gst-launch-1.0 -e \
  concat name=c2 ! videoconvert ! x264enc tune=4 ! mp4mux name=mp4 ! filesink location=out.mp4 \
  concat name=c ! audioconvert ! voaacenc ! mp4. \
  filesrc location=big.mp4 ! decodebin name=d1 ! audio/x-raw ! queue ! c. \
  filesrc location=big2.mp4 ! decodebin name=d2 ! audio/x-raw ! queue ! c. \
  d1. ! video/x-raw ! queue ! c2. \
  d2. ! video/x-raw ! queue ! c2.

btw you may want to read the documentation for gst-launch

Please note a few things:

1, there is an -e switch for gst-launch which causes EOS to be sent to the pipeline, ending the mp4 muxing process properly.. otherwise the metadata will not be written
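A minimal way to see the effect of -e - a sketch assuming a GStreamer install that provides x264enc; videotestsrc generates the frames, so no input file is needed:

```shell
# With -e, gst-launch sends EOS down the pipeline on Ctrl-C (and shuts down
# cleanly when the source reaches EOS), so mp4mux gets to write the file's
# metadata (the moov atom). Without -e, an interrupted run leaves an
# unplayable mp4.
gst-launch-1.0 -e videotestsrc num-buffers=100 ! videoconvert ! x264enc ! mp4mux ! filesink location=test.mp4
```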

2, The pipeline does not end automatically.. this is something you can tune.. maybe some attribute of concat or something.. Maybe try adding streamsynchronizer - but I was not successful with that one. I am not sure whether I should put it after concat or before.. maybe you can ask on IRC

3, How do I build such a pipe?

A, First I checked the capabilities of mp4mux, as I knew I wanted to mux mp4 (to find this you may type gst-inspect-1.0 | grep mp4 | grep mux if you are on Linux). You must remember that src is the output of an element and sink is the input (sometimes it's not very natural to think this way.. just remember a sink is the thing where the water ends up when you wash your hands :D ). So we expect that there is a sink for audio and a sink for video..
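For illustration, here is what that grep chain keeps, run on a hypothetical three-line excerpt of the plugin listing (the element names are real GStreamer elements, but the real gst-inspect-1.0 output is much longer and the formatting shown here is approximate):

```shell
# Filter a fake plugin listing the same way you would filter the real one:
#   gst-inspect-1.0 | grep mp4 | grep mux
# Only lines containing both "mp4" and "mux" survive, so x264enc is dropped.
printf '%s\n' \
  'isomp4:  mp4mux: MP4 Muxer' \
  'isomp4:  qtdemux: QuickTime demuxer' \
  'x264:  x264enc: x264 H.264 encoder' \
  | grep mp4 | grep mux
```

Note that qtdemux also survives (the word "demuxer" contains "mux"), so you still have to pick the actual muxer out of the matches yourself.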

B, There are a few possibilities for audio; I chose audio/mpeg.. then I remembered that mp4 often uses AAC.. then I searched for an AAC encoder, which is voaacenc.. I checked the src caps and indeed it's audio/mpeg (version 4)

C, For video there is video/x-h264, which I like most.. so I took good old x264enc, which I use all the time for video.. I thought that maybe I would need h264parse, but it's not needed..

4, Then bundle everything together.. just remember that you can give elements names using name=something (it's exactly like an alias), but when you refer to the element later you do not write just something - you need to put a dot after the name, so it's something. (with a trailing dot)
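A minimal sketch of the name=/dot syntax, assuming a hypothetical input file in.mp4 and a GStreamer install with the usual autodetect sinks: decodebin is named once, and both of its dynamic pads are linked later by writing d. with the trailing dot:

```shell
# decodebin gets the alias "d"; its audio and video pads are linked to the
# two sink branches afterwards by referring back to "d." (note the dot).
gst-launch-1.0 filesrc location=in.mp4 ! decodebin name=d \
  d. ! queue ! videoconvert ! autovideosink \
  d. ! queue ! audioconvert ! autoaudiosink
```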

5, Also remember that the order in which you put elements into gst-launch is about linking rather than about the order of processing.. think of it this way: all you need to do is link the elements, and only then think about the processing itself. You can put *sink elements at the beginning of the pipeline, but you must then give them a name and use that name elsewhere (for example after the mux, as I did)

6, To simplify it.. input processing, then decodebin, which spawns two branches - audio and video.. each type goes to the proper concat.. there are two concats - each has its own type of processing (for video there is videoconvert etc.).. then those two concat branches go through encoding, and after encoding they end at mp4mux.. after the mux there is just a filesink.. that's all
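The topology just described, sketched as a diagram (element and file names as in the pipeline above):

```
big.mp4  -> decodebin d1 -> audio/x-raw -> queue --+
big2.mp4 -> decodebin d2 -> audio/x-raw -> queue --+-> concat c  -> audioconvert -> voaacenc -+
            d1           -> video/x-raw -> queue --+                                          +-> mp4mux -> filesink out.mp4
            d2           -> video/x-raw -> queue --+-> concat c2 -> videoconvert -> x264enc --+
```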
