How to use ffmpeg for streaming mp4 via websocket


Question

I've written a sample in Node.js which streams some input to the client over a WebSocket connection in mp4 format. On the client side, the mp4 packages are appended to a MediaSource SourceBuffer.

This runs fine, but only if the client gets the stream from the beginning, starting with the first package. Another client can't play the current stream, because it doesn't receive the stream from the beginning.

I tried (by trial and error) saving the first package ffmpeg sends and sending it at the start of every new connection, followed by the current stream. The SourceBuffer then breaks with an encoding error.

Here is the ffmpeg command:

-i someInput -g 59 
-vcodec libx264 -profile:v baseline 
-f mp4 -movflags empty_moov+omit_tfhd_offset+frag_keyframe+default_base_moof
-reset_timestamps 1
-

The "empty_moov+omit_tfhd_offset+frag_keyframe+default_base_moof" part should make the stream packages independent by putting the moov atom at the beginning and cutting the output into fragments at keyframes, 59 frames each, so I don't get why I can't view the stream when joining after the start.

Answer

The output of that command is not a 'stream' per se. It is a series of concatenated fragments. Each fragment must be received in its entirety; if a partial fragment is received, it will confuse the parser to the point where it cannot identify the start of the next fragment. In addition, the first fragment output is called the initialization fragment. This initialization fragment must be sent to the client first; after that, any fragment can be played. Hence it must be cached by the server.
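That caching logic can be sketched server-side like this, assuming Node's `Buffer` API. The box splitter and the `FragmentCache` class are illustrative names, not a real library; the sketch relies on the fact that fragmented MP4 is a sequence of length-prefixed boxes, with `ftyp`+`moov` forming the initialization fragment and each `moof`+`mdat` pair forming one media fragment.

```javascript
// Sketch: split an incoming fMP4 byte stream into complete top-level boxes,
// cache the initialization fragment (ftyp + moov), and collect only complete
// media fragments (moof + mdat pairs) for forwarding.

function splitBoxes(buf) {
  const boxes = [];
  let offset = 0;
  while (offset + 8 <= buf.length) {
    const size = buf.readUInt32BE(offset);             // 32-bit big-endian box size
    const type = buf.toString('ascii', offset + 4, offset + 8);
    if (size < 8 || offset + size > buf.length) break; // incomplete box: wait for more data
    boxes.push({ type, bytes: buf.subarray(offset, offset + size) });
    offset += size;
  }
  return { boxes, remainder: buf.subarray(offset) };
}

class FragmentCache {
  constructor() {
    this.pending = Buffer.alloc(0); // bytes not yet forming a complete box
    this.initSegment = [];          // ftyp + moov, sent first to every new client
    this.fragments = [];            // complete moof + mdat fragments
    this.current = [];              // boxes of the fragment being assembled
  }

  feed(chunk) {
    const { boxes, remainder } = splitBoxes(Buffer.concat([this.pending, chunk]));
    this.pending = remainder;
    for (const box of boxes) {
      if (box.type === 'ftyp' || box.type === 'moov') {
        this.initSegment.push(box.bytes);
      } else {
        this.current.push(box.bytes);
        if (box.type === 'mdat') { // moof + mdat complete: emit one fragment
          this.fragments.push(Buffer.concat(this.current));
          this.current = [];
        }
      }
    }
  }
}
```

A newly connected WebSocket client would then be sent `Buffer.concat(cache.initSegment)` before any entry of `cache.fragments`, so its MediaSource parser always starts at a valid fragment boundary.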

