Real Time Streaming to HTML5 (without WebRTC) just using the video tag


Question

I would like to wrap real-time encoded data into webm or ogv and send it to an HTML5 browser.

Can webm or ogv do this? MP4 cannot, due to its MDAT atom (one cannot wrap h264 and mp3 in real time and send the result to the client). Say I am feeding video from my webcam and audio from my built-in mic. Fragmented MP4 can handle this, but it is a hassle to find libraries that do it.

I need to do this because I do not want to send audio and video separately.

If I did send them separately (audio over an audio tag and video over a video tag, i.e. demuxed), could I sync them in the client browser with JavaScript? I saw some examples but am not sure yet.
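For what it's worth, syncing two separate media elements from script is possible but fragile. A minimal sketch, assuming both streams share a common timeline (the element ids below are hypothetical), would periodically compare the two playback clocks and re-seek the one that drifts:

```javascript
// Pure helper: given the two playback clocks (seconds), return the
// correction to apply to the audio element, or 0 if the drift is
// within tolerance. Kept free of DOM calls so it can run anywhere.
function computeSeekAdjustment(videoTime, audioTime, tolerance = 0.05) {
  const drift = videoTime - audioTime;
  return Math.abs(drift) > tolerance ? drift : 0;
}

// Browser wiring (hypothetical element ids "vid" and "aud"):
// const video = document.getElementById("vid");
// const audio = document.getElementById("aud");
// setInterval(() => {
//   const adj = computeSeekAdjustment(video.currentTime, audio.currentTime);
//   if (adj !== 0) audio.currentTime += adj; // re-seek the audio clock
// }, 500);
```

Note that each corrective seek can cause an audible glitch, which is one reason muxed delivery (the point of this question) is preferable.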

Answer

Evren,

Since you initially asked this question, the Media Source Extensions (https://www.w3.org/TR/media-source/) have matured enough to play very short (30 ms) ISO-BMFF video/mp4 segments with just a little buffering.
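As a rough illustration of what the MSE approach looks like in the browser, here is a sketch; the WebSocket URL and codec string are assumptions, and your muxer must emit fragmented MP4 matching that codec string. Because a SourceBuffer rejects appendBuffer() while it is still updating, incoming segments have to be queued:

```javascript
// A SourceBuffer throws if appendBuffer() is called while .updating is
// true, so arriving segments are queued and flushed on "updateend".
// The queue itself is plain JS and can be exercised outside a browser.
class SegmentQueue {
  constructor(buffer) {
    this.buffer = buffer;   // object exposing appendBuffer() and .updating
    this.pending = [];
  }
  push(segment) {
    this.pending.push(segment);
    this.flush();
  }
  flush() {
    if (!this.buffer.updating && this.pending.length > 0) {
      this.buffer.appendBuffer(this.pending.shift());
    }
  }
}

// Browser wiring (hypothetical endpoint and codec string):
// const video = document.querySelector("video");
// const ms = new MediaSource();
// video.src = URL.createObjectURL(ms);
// ms.addEventListener("sourceopen", () => {
//   const sb = ms.addSourceBuffer('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');
//   const queue = new SegmentQueue(sb);
//   sb.addEventListener("updateend", () => queue.flush());
//   const ws = new WebSocket("wss://example.com/live"); // assumed URL
//   ws.binaryType = "arraybuffer";
//   ws.onmessage = (e) => queue.push(e.data);
// });
```

Since audio and video travel muxed inside the same fragmented-MP4 segments, the browser keeps them in sync for you.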

See HTML5 live streaming.

So your statement


(one can not wrap h264 and mp3 in real time and wrap it and send it to the client)

is now out of date. Yes, you can do it with h264 + AAC.

There are several implementations out there; take a look at Unreal Media Server. From the Unreal Media Server FAQ: http://umediaserver.net/umediaserver/faq.html



How is Unreal HTML5 live streaming different from MPEG-DASH? Unlike MPEG-DASH, Unreal Media Server uses a WebSocket protocol for live streaming to HTML5 MSE element in web browsers. This is much more efficient than fetching segments via HTTP requests per MPEG-DASH. Also, Unreal Media Server sends segments of minimal duration, as low as 30 ms. That allows for low, sub-second latency streaming, while MPEG-DASH, like other HTTP chunk-based live streaming protocols, cannot provide low latency live streaming.

Their demos webpage has a live HTML5 feed from an RTSP camera: http://umediaserver.net/umediaserver/demos.html Notice that the latency in the HTML5 player is comparable to that in the Flash player.
