Best approach to real time http streaming to HTML5 video client


Problem Description


I'm really stuck trying to understand the best way to stream the real-time output of ffmpeg to an HTML5 client using node.js, as there are a number of variables at play and I don't have a lot of experience in this space, having spent many hours trying different combinations.

My use case is:

1) An IP video camera's RTSP H.264 stream is picked up by FFMPEG and remuxed into an mp4 container using the following FFMPEG settings in node, output to STDOUT. This is only run on the initial client connection, so that partial content requests don't try to spawn FFMPEG again.

liveFFMPEG = child_process.spawn("ffmpeg", [
                "-i", "rtsp://admin:12345@192.168.1.234:554",   // RTSP source (IP camera)
                "-vcodec", "copy",                              // remux only, no re-encoding
                "-f", "mp4",
                "-reset_timestamps", "1",
                "-movflags", "frag_keyframe+empty_moov",        // fragmented MP4 with an empty moov up front
                "-"                                             // output to stdout
                ],  {detached: false});

2) I use the node http server to capture the STDOUT and stream that back to the client upon a client request. When the client first connects I spawn the above FFMPEG command line then pipe the STDOUT stream to the HTTP response.

liveFFMPEG.stdout.pipe(resp);
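
Put together, steps 1 and 2 amount to something like the following sketch (the port and the single-spawn guard are illustrative additions, not the original code; note that a second client connecting later would miss the initial ftyp/moov bytes, which matters for the problems described below):

var http = require('http');
var child_process = require('child_process');

var liveFFMPEG = null;

http.createServer(function (req, resp) {
    if (!liveFFMPEG) {                                  // spawn ffmpeg only on the initial connection
        liveFFMPEG = child_process.spawn("ffmpeg", [
            "-i", "rtsp://admin:12345@192.168.1.234:554",
            "-vcodec", "copy", "-f", "mp4",
            "-reset_timestamps", "1",
            "-movflags", "frag_keyframe+empty_moov",
            "-"
        ], {detached: false});
    }
    liveFFMPEG.stdout.pipe(resp);                       // stream the remuxed MP4 to the client
}).listen(8000);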

I have also used the stream event to write the FFMPEG data to the HTTP response, but it makes no difference:

liveFFMPEG.stdout.on("data", function(data) {
        resp.write(data);
});

I use the following HTTP header (which is also used and working when streaming pre-recorded files)

var total = 999999999         // fake a large file
var partialstart = 0
var partialend = total - 1

if (range !== undefined) {
    var parts = range.replace(/bytes=/, "").split("-"); 
    partialstart = parts[0]; 
    partialend = parts[1];
} 

var start = parseInt(partialstart, 10); 
var end = partialend ? parseInt(partialend, 10) : total;   // fake a large file if no range request 

var chunksize = (end-start)+1; 

resp.writeHead(206, {
                  'Transfer-Encoding': 'chunked'
                 , 'Content-Type': 'video/mp4'
                 , 'Content-Length': chunksize // large size to fake a file
                 , 'Accept-Ranges': 'bytes ' + start + "-" + end + "/" + total
});

3) The client has to use HTML5 video tags.

I have no problems with streaming playback to the HTML5 client (using fs.createReadStream with 206 HTTP partial content) of a video file previously recorded with the above FFMPEG command line (but saved to a file instead of STDOUT), so I know the FFMPEG stream is correct, and I can even see the video streaming live in VLC when connecting to the HTTP node server.
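
For comparison, the recorded-file path that does work is the standard Node range-serving pattern. A condensed sketch (the file name and port are placeholders, not the asker's exact code):

var http = require('http');
var fs = require('fs');

http.createServer(function (req, resp) {
    var stat = fs.statSync('recorded.mp4');                    // previously recorded FFMPEG output
    var range = req.headers.range || 'bytes=0-';
    var parts = range.replace(/bytes=/, '').split('-');
    var start = parseInt(parts[0], 10);
    var end = parts[1] ? parseInt(parts[1], 10) : stat.size - 1;
    resp.writeHead(206, {
        'Content-Range': 'bytes ' + start + '-' + end + '/' + stat.size,
        'Accept-Ranges': 'bytes',
        'Content-Length': (end - start) + 1,
        'Content-Type': 'video/mp4'
    });
    fs.createReadStream('recorded.mp4', {start: start, end: end}).pipe(resp);
}).listen(8000);

Note that the served byte range is echoed back in the Content-Range response header; Accept-Ranges only advertises the literal value 'bytes'.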

However, trying to stream live from FFMPEG via node HTTP seems to be a lot harder, as the client will display one frame then stop. I suspect the problem is that I am not setting up the HTTP connection to be compatible with the HTML5 video client. I have tried a variety of things like using HTTP 206 (partial content) and 200 responses, and putting the data into a buffer then streaming, with no luck, so I need to go back to first principles to ensure I'm setting this up the right way.

Here is my understanding of how this should work, please correct me if I'm wrong:

1) FFMPEG should be set up to fragment the output and use an empty moov (FFMPEG frag_keyframe and empty_moov mov flags). This means the client does not use the moov atom, which is typically at the end of the file and isn't relevant when streaming (there is no end of file), but it also means no seeking is possible, which is fine for my use case.

2) Even though I use MP4 fragments and an empty MOOV, I still have to use HTTP partial content, as the HTML5 player will wait until the entire stream is downloaded before playing, which, with a live stream that never ends, is unworkable.

3) I don't understand why piping the STDOUT stream to the HTTP response doesn't work when streaming live yet if I save to a file I can stream this file easily to HTML5 clients using similar code. Maybe it's a timing issue as it takes a second for the FFMPEG spawn to start, connect to the IP camera and send chunks to node, and the node data events are irregular as well. However the bytestream should be exactly the same as saving to a file, and HTTP should be able to cater for delays.

4) When checking the network log from the HTTP client when streaming an MP4 file created by FFMPEG from the camera, I see there are 3 client requests: a general GET request for the video, for which the HTTP server returns about 40Kb, then a partial content request with a byte range for the last 10K of the file, then a final request for the bits in the middle not yet loaded. Maybe the HTML5 client, once it receives the first response, is asking for the last part of the file to load the MP4 MOOV atom? If this is the case, it won't work for streaming, as there is no trailing MOOV and no end of the file.

5) When checking the network log when trying to stream live, I get an aborted initial request with only about 200 bytes received, then a re-request again aborted with 200 bytes and a third request which is only 2K long. I don't understand why the HTML5 client would abort the request as the bytestream is exactly the same as I can successfully use when streaming from a recorded file. It also seems node isn't sending the rest of the FFMPEG stream to the client, yet I can see the FFMPEG data in the .on event routine so it is getting to the FFMPEG node HTTP server.

6) Although I think piping the STDOUT stream to the HTTP response buffer should work, do I have to build an intermediate buffer and stream that will allow the HTTP partial content client requests to properly work like it does when it (successfully) reads a file? I think this is the main reason for my problems however I'm not exactly sure in Node how to best set that up. And I don't know how to handle a client request for the data at the end of the file as there is no end of file.

7) Am I on the wrong track in trying to handle 206 partial content requests, and should this work with normal 200 HTTP responses? HTTP 200 responses work fine for VLC, so I suspect the HTML5 video client will only work with partial content requests? (A sketch combining this idea with the buffering idea from point 6 follows below.)
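
Here is a minimal sketch of the points 6 and 7 idea: cache the first chunks of FFMPEG output (assumed, for illustration only, to contain the ftyp and moov boxes within the first 64 KB) and replay them to each new client over a plain 200 response. liveFFMPEG is the process spawned in step 1; the port is a placeholder.

var http = require('http');
// liveFFMPEG is the ffmpeg child process spawned as in step 1

var clients = [];
var initChunks = [];     // the first chunks, assumed to contain the ftyp and moov boxes
var initBytes = 0;

liveFFMPEG.stdout.on('data', function (chunk) {
    if (initBytes < 65536) {                              // 64 KB cut-off is a guess for illustration
        initChunks.push(chunk);
        initBytes += chunk.length;
    }
    clients.forEach(function (c) { c.write(chunk); });    // fan the live data out to all clients
});

http.createServer(function (req, resp) {
    resp.writeHead(200, {'Content-Type': 'video/mp4'});   // plain 200, no byte ranges
    initChunks.forEach(function (c) { resp.write(c); });  // replay init data to the newcomer
    clients.push(resp);
    req.on('close', function () { clients.splice(clients.indexOf(resp), 1); });
}).listen(8000);

Even then, a late joiner would still need to start at a moof boundary to decode cleanly, so this only sketches the buffering idea; it is not a complete fix.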

As I'm still learning this stuff, it's difficult to work through the various layers of this problem (FFMPEG, node, streaming, HTTP, HTML5 video), so any pointers will be greatly appreciated. I have spent hours researching on this site and the net, and I have not come across anyone who has been able to do real time streaming in node, but I can't be the first, and I think this should be able to work (somehow!).

Solution

EDIT 3: As of iOS 10, HLS will support fragmented mp4 files. The answer now is to create fragmented mp4 assets, with a DASH and an HLS manifest. Pretend flash, iOS 9 and below, and IE 10 and below don't exist.

Everything below this line is out of date. Keeping it here for posterity.


EDIT 2: As people in the comments are pointing out, things change. Almost all browsers now support the AVC/AAC codecs. iOS still requires HLS, but via adaptors like hls.js you can play HLS through MSE. The new answer is HLS + hls.js if you need iOS, or just fragmented MP4 (i.e. DASH) if you don't.
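
For reference, the hls.js pattern is roughly the following (the manifest URL is a placeholder):

var video = document.getElementById('video');
if (Hls.isSupported()) {
    var hls = new Hls();
    hls.loadSource('https://example.com/live/stream.m3u8');   // HLS manifest URL (placeholder)
    hls.attachMedia(video);                                   // hls.js feeds the stream through MSE
} else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    video.src = 'https://example.com/live/stream.m3u8';       // Safari/iOS play HLS natively
}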

There are many reasons why video, and specifically live video, is very difficult. (Please note that the original question specified that HTML5 video is a requirement, but the asker stated in the comments that Flash is possible. So, immediately, this question is misleading.)

First I will restate: THERE IS NO OFFICIAL SUPPORT FOR LIVE STREAMING OVER HTML5. There are hacks, but your mileage may vary.

EDIT: Since I wrote this answer, Media Source Extensions have matured and are now very close to becoming a viable option. They are supported on most major browsers. iOS continues to be a holdout.
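
A bare-bones sketch of what MSE playback of fragmented MP4 looks like (the codec string and segment URL are illustrative assumptions, not values from this answer):

var video = document.getElementById('video');
var ms = new MediaSource();
video.src = URL.createObjectURL(ms);
ms.addEventListener('sourceopen', function () {
    // the codec string must match the actual stream; this one is just an example
    var sb = ms.addSourceBuffer('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');
    fetch('/segments/init.mp4')
        .then(function (r) { return r.arrayBuffer(); })
        .then(function (buf) { sb.appendBuffer(buf); });      // then append media segments as they arrive
});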

Next, you need to understand that video on demand (VOD) and live video are very different. Yes, they are both video, but the problems are different, hence the formats are different. For example, if the clock in your computer runs 1% faster than it should, you will not notice on a VOD. With live video, you will be trying to play video before it happens. If you want to join a live video stream in progress, you need the data necessary to initialize the decoder, so it must be repeated in the stream, or sent out of band. With VOD, you can read the beginning of the file, then seek to whatever point you wish.

Now let's dig in a bit.

Platforms:

  • iOS
  • PC
  • Mac
  • Android

Codecs:

  • vp8/9
  • h.264
  • theora (vp3)

Common Delivery methods for live video in browsers:

  • DASH (HTTP)
  • HLS (HTTP)
  • flash (RTMP)
  • flash (HDS)

Common Delivery methods for VOD in browsers:

  • DASH (HTTP Streaming)
  • HLS (HTTP Streaming)
  • flash (RTMP)
  • flash (HTTP Streaming)
  • MP4 (HTTP pseudo streaming)
  • I'm not going to talk about MKV and OGG because I do not know them very well.

html5 video tag:

  • MP4
  • webm
  • ogg

Let's look at which browsers support which formats:

Safari:

  • HLS (iOS and Mac only)
  • h.264
  • MP4

Firefox

  • DASH (via MSE but no h.264)
  • h.264 via Flash only!
  • VP9
  • MP4
  • OGG
  • Webm

IE

  • Flash
  • DASH (via MSE IE 11+ only)
  • h.264
  • MP4

Chrome

  • Flash
  • DASH (via MSE)
  • h.264
  • VP9
  • MP4
  • webm
  • ogg

MP4 cannot be used for live video (NOTE: DASH is a superset of MP4, so don't get confused by that). MP4 is broken into two pieces: moov and mdat. mdat contains the raw audio/video data. But it is not indexed, so without the moov it is useless. The moov contains an index of all the data in the mdat. But due to its format, it cannot be 'flattened' until the timestamp and size of EVERY frame are known. It may be possible to construct a moov that 'fibs' the frame sizes, but it is very wasteful bandwidth-wise.
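
To see this structure for yourself, here is a small Node sketch that walks the top-level boxes of an MP4 file (the file name is a placeholder; 64-bit and to-end-of-file box sizes are not handled):

var fs = require('fs');

var buf = fs.readFileSync('video.mp4');                       // placeholder file name
var offset = 0;
while (offset + 8 <= buf.length) {
    var size = buf.readUInt32BE(offset);                      // 32-bit box size, big-endian
    var type = buf.toString('ascii', offset + 4, offset + 8); // 4-character box type
    console.log(type, size);
    if (size < 8) break;                                      // size 0 (to EOF) or 1 (64-bit) not handled here
    offset += size;
}

A regular MP4 prints something like ftyp, moov, and one big mdat (or mdat before moov); a fragmented file (frag_keyframe+empty_moov) prints ftyp, an empty moov, then repeating moof/mdat pairs.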

So if you want to deliver everywhere, we need to find the least common denominator. You will see there is no LCD here without resorting to flash. For example:

  • iOS only supports h.264 video, and it only supports HLS for live.
  • Firefox does not support h.264 at all, unless you use flash
  • Flash does not work in iOS

The closest thing to an LCD is using HLS to get your iOS users, and flash for everyone else. My personal favorite is to encode HLS, then use flash to play HLS for everyone else. You can play HLS in flash via JW Player 6 (or write your own HLS-to-FLV player in AS3 like I did).

Soon, the most common way to do this will be HLS on iOS/Mac and DASH via MSE everywhere else (this is what Netflix will be doing soon). But we are still waiting for everyone to upgrade their browsers. You will also likely need a separate DASH/VP9 stream for Firefox (I know about open264; it sucks. It can't do video in main or high profile, so it is currently useless).
