HTML5 Video stream from websocket via MediaSource and MediaSourceBuffer


Problem description


I'm trying to play video from a websocket:

<video id="output" width="320" height="240" autoplay></video>

<script>
    function sockets(buffer) {
        const socket = new WebSocket('wss://localhost:5002/ws')

        socket.onmessage = async function (event) {
            // event.data is a blob
            buffer.appendBuffer(new Uint8Array(event.data))
        }
    }

    let ms = new MediaSource()
    let output = document.getElementById('output')
    output.src = URL.createObjectURL(ms)
    ms.onsourceopen = () => {
        let buffer = ms.addSourceBuffer('video/webm; codecs="vorbis,vp8"')
        sockets(buffer)
    }
</script>

I receive MediaRecorder chunks here as Blobs and try to play them sequentially using the MediaSource API. There are no errors, and nothing happens. Is there something fundamentally wrong here?

I tried:

  • Using different codecs
  • Playing with the media source modes, e.g. sequence/segments
  • Different approaches that don't use the MediaSource API, but I faced other challenges, and MediaSource seems to be the best approach in my case.

UPDATE: this is how the video is produced:

let options = { mimeType: 'video/webm;codecs=vp8' }
let stream = await navigator.mediaDevices.getUserMedia({ video: true })
mediaRecorder = new MediaRecorder(stream, options)
mediaRecorder.ondataavailable = event => {
    if (event.data && event.data.size > 0) {
        send(event.data)
    }
}

Solution

The fundamental problem here is that you cannot stream the data coming out of MediaRecorder and expect the other end to play it; it is not a complete video. It will only work if the receiving end is able to receive the initialization bytes, and I doubt that will work in a real-world scenario.
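Those "initialization bytes" are the WebM initialization segment (the EBML header plus track metadata), which MediaRecorder emits only at the very start of its first chunk; a receiver that joins mid-stream never sees them. As a rough illustration (the helper name is mine, not from the original post), you can recognize a chunk that carries the initialization segment by its leading EBML header ID:

```javascript
// WebM/Matroska files begin with the 4-byte EBML header ID 0x1A45DFA3.
// A MediaRecorder chunk that starts with it contains the stream's
// initialization segment; later chunks are bare Clusters and do not.
function startsWithWebMInitSegment(bytes) {
  const EBML_HEADER_ID = [0x1a, 0x45, 0xdf, 0xa3];
  return bytes.length >= 4 &&
    EBML_HEADER_ID.every((b, i) => bytes[i] === b);
}
```

A server could cache the first chunk it sees and replay it to every newly connected client before forwarding live chunks, though gluing Clusters from mid-stream onto a cached header is still fragile.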

What you can do is create an interval that starts and stops the MediaRecorder, for example every second, to produce one-second video chunks that you can transmit over the wire (the best transport I know of and have tested is websockets).
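A minimal sketch of that start/stop loop. Here `createRecorder` is assumed to wrap `new MediaRecorder(stream, options)` in the browser, and `sendChunk` to push each Blob over the websocket; both names are illustrative, not from the original post:

```javascript
// Restarting the recorder per chunk makes every chunk a complete,
// independently playable WebM file, at the cost of a tiny gap between chunks.
function recordInChunks(createRecorder, sendChunk, chunkMs = 1000) {
  let stopped = false;
  function startOne() {
    const recorder = createRecorder();
    recorder.ondataavailable = event => {
      // each chunk now carries its own initialization segment
      if (event.data && event.data.size > 0) sendChunk(event.data);
    };
    recorder.onstop = () => {
      if (!stopped) startOne(); // chain straight into the next chunk
    };
    recorder.start();
    setTimeout(() => recorder.stop(), chunkMs);
  }
  startOne();
  return () => { stopped = true; }; // call to end the recording loop
}
```

Because each chunk is a standalone file, the receiver can play it with a plain Blob URL instead of fighting the MediaSource API.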

I strongly suggest not using MediaRecorder if you are doing real-time video streaming (which was not indicated in your post). If you are, it would be better to create a canvas, copy the stream onto it, and do some requestAnimationFrame work to capture your video stream as something you can transmit.
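The canvas approach could look roughly like this. It is a sketch under assumptions: `sendFrame` stands in for the websocket send, the JPEG-per-frame encoding is one possible choice, and the function names are mine:

```javascript
// Copy each video frame onto a hidden canvas and ship the encoded frame.
// Returns a function that stops the capture loop.
function startCanvasCapture(video, sendFrame, quality = 0.7) {
  const canvas = document.createElement('canvas');
  canvas.width = video.videoWidth || 320;
  canvas.height = video.videoHeight || 240;
  const ctx = canvas.getContext('2d');
  let running = true;
  function tick() {
    if (!running) return;
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    // toBlob encodes the current frame; JPEG keeps each frame small
    canvas.toBlob(blob => { if (blob) sendFrame(blob); }, 'image/jpeg', quality);
    requestAnimationFrame(tick);
  }
  requestAnimationFrame(tick);
  return () => { running = false; };
}
```

The trade-off is that you are sending individual images rather than a compressed video stream, so bandwidth is higher, but each frame is usable the instant it arrives.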

Take a look at this demo for reference: https://github.com/cyberquarks/quarkus-websockets-streamer/blob/master/src/main/resources/META-INF/resources/index.html

In my experience, MediaRecorder's output is delayed, which generally adds quite a bit of latency to the video, not to mention the delay the socket itself introduces.

Generally, other developers would suggest that you just take the WebRTC route; however, in my experience, WebRTC is not necessarily faster either.
