Display getUserMedia live video with Media Source Extensions (MSE)


Question

I am trying to display a MediaStream taken from a webcam using getUserMedia, and to relay it to a remote peer by whatever mechanism possible so it can be played (as an experiment). I am not using webRTC directly, as I want control over the raw data.

The issue I encounter is that my video element displays nothing, and I don't get any errors back. I am using Chrome Version 51.0.2704.103 (64-bit) on elementary OS (an Ubuntu 14.04 based Linux distribution).

As a side note, if I record all the blobs into an array, create a new Blob from them, and set the video element's src to URL.createObjectURL(blob), it displays the video correctly.

Here is the code I used to try to accomplish this (minus the relaying; I'm just trying to play it locally):

var ms = new MediaSource();
var video = document.querySelector("video"); 
video.src = window.URL.createObjectURL(ms);

ms.addEventListener("sourceopen", function() {
    var sourceBuffer = ms.addSourceBuffer('video/webm; codecs="vorbis,vp8"');

    navigator.getUserMedia({video: {width: 320, height: 240, framerate: 30}, audio: true}, function(stream) {
        var recorder = new MediaRecorder(stream);

        recorder.ondataavailable = function(event) {
            var reader = new FileReader();
            reader.addEventListener("loadend", function () {
                var uint8Chunk = new Uint8Array(reader.result);
                if (!sourceBuffer.updating) {
                    sourceBuffer.appendBuffer(uint8Chunk);
                }
                if (video.paused) video.play();
            });
            reader.readAsArrayBuffer(event.data);
        };

        recorder.start(10);
    }, function(error) {
        console.error(error);
    });
}, false);
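One problem with the snippet above: any chunk that arrives while sourceBuffer.updating is true is silently dropped by the if-guard, which corrupts the appended WebM stream. A minimal sketch of a queue that holds chunks and flushes them on the SourceBuffer's updateend event instead (ChunkQueue is a name invented for this sketch, not a browser API):

```javascript
// Buffers incoming Uint8Array chunks and appends them one at a time,
// waiting for the SourceBuffer's "updateend" event between appends,
// so no chunk is ever dropped while an append is in flight.
class ChunkQueue {
  constructor(sourceBuffer) {
    this.sourceBuffer = sourceBuffer;
    this.queue = [];
    sourceBuffer.addEventListener("updateend", () => this.flush());
  }
  push(chunk) {
    this.queue.push(chunk);
    this.flush();
  }
  flush() {
    // Append the next queued chunk only when the buffer is idle.
    if (!this.sourceBuffer.updating && this.queue.length > 0) {
      this.sourceBuffer.appendBuffer(this.queue.shift());
    }
  }
}
```

In the recorder's ondataavailable handler you would then call queue.push(uint8Chunk) instead of calling sourceBuffer.appendBuffer directly.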

Here is the info I get in chrome://media-internals:

render_id: 147
player_id: 0
pipeline_state: kPlaying
event: WEBMEDIAPLAYER_CREATED
url: blob:http%3A//localhost%3A8080/e5c51dd8-5709-4e6f-9457-49ac8c34756b
found_audio_stream: true
audio_codec_name: opus
found_video_stream: true
video_codec_name: vp8
duration: unknown
audio_dds: false
audio_decoder: OpusAudioDecoder
video_dds: false
video_decoder: FFmpegVideoDecoder

And the log:

00:00:00 00 pipeline_state  kCreated
00:00:00 00 event   WEBMEDIAPLAYER_CREATED
00:00:00 00 url blob:http%3A//localhost%3A8080/e5c51dd8-5709-4e6f-9457-49ac8c34756b
00:00:00 00 pipeline_state  kInitDemuxer
00:00:01 603    found_audio_stream  true
00:00:01 603    audio_codec_name    opus
00:00:01 603    found_video_stream  true
00:00:01 603    video_codec_name    vp8
00:00:01 604    duration    unknown
00:00:01 604    pipeline_state  kInitRenderer
00:00:01 604    audio_dds   false
00:00:01 604    audio_decoder   OpusAudioDecoder
00:00:01 604    video_dds   false
00:00:01 604    video_decoder   FFmpegVideoDecoder
00:00:01 604    pipeline_state  kPlaying

Update: I've tried sending the data to Node and saving it to a webm file with ffmpeg (via fluent-ffmpeg), and I can view the resulting file correctly in VLC.

Update 2: After streaming it back from Node, I get the following: Media segment did not contain any video coded frames, mismatching initialization segment. Therefore, MSE coded frame processing may not interoperably detect discontinuities in appended media. After doing some research, it appears that webm files must be segmented to work; however, I have not come across a way to do this for live streams (using ffmpeg or other tools). Any ideas here?
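On segmenting WebM for live streams: ffmpeg ships a webm_chunk muxer that writes an initialization header file plus a numbered chunk file per cluster, which is the layout MSE expects. A sketch along the lines of the ffmpeg documentation (the capture device, size, and file names here are illustrative, not taken from the original post):

```shell
# Capture from a V4L2 webcam, encode to VP8, and emit a WebM header file
# plus numbered live chunks suitable for MSE-style delivery.
ffmpeg -f v4l2 -i /dev/video0 \
       -c:v libvpx -s 320x240 \
       -f webm_chunk \
       -header live.hdr \
       -chunk_start_index 1 \
       live_%d.chk
```

The header would be appended to the SourceBuffer once as the initialization segment, followed by the chunks in order.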

Answer

A little late, but you can try it like this (in chrome):

<html>

<body>
    <video class="real1" autoplay controls></video>
    <video class="real2" controls></video>

    <script>
        const constraints = {video: {width: 320, height: 240, framerate: 30}, audio: true};

        const video1 = document.querySelector('.real1');
        const video2 = document.querySelector('.real2');

        var mediaSource = new MediaSource();
        video2.src = window.URL.createObjectURL(mediaSource);
        var sourceBuffer;
        mediaSource.addEventListener('sourceopen', function () {
            sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs=opus,vp8');
            console.log(sourceBuffer);
        });

        var mediaRecorder;
        function handleSuccess(stream) {
            video1.srcObject = stream;
            mediaRecorder = new MediaRecorder(stream, { mimeType: 'video/webm; codecs=opus,vp8' });
            console.log(mediaRecorder.mimeType)
            mediaRecorder.ondataavailable = function (e) {
                var reader = new FileReader();
                reader.onload = function (e) {              
                    sourceBuffer.appendBuffer(new Uint8Array(e.target.result));
                }
                reader.readAsArrayBuffer(e.data);

                if (video2.paused) {
                    video2.play();
                }
            }
            mediaRecorder.start(20);
        }

        function handleError(error) {
            console.error('Reeeejected!', error);
        }
        navigator.mediaDevices.getUserMedia(constraints).
            then(handleSuccess).catch(handleError);
    </script>
</body>

</html>

I think you missed setting the same (supported) codec on both the recorder and the sourceBuffer.
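A defensive way to guarantee that match is to probe for a mime type that both MediaRecorder and MediaSource accept before creating either. The pickMimeType helper below is illustrative, not a standard API; the support checks are injected as predicates so the selection logic itself is plain JavaScript:

```javascript
// Return the first candidate mime type accepted by both predicates,
// or null if recorder and SourceBuffer have no codec in common.
// In a browser, the predicates would be MediaRecorder.isTypeSupported
// and MediaSource.isTypeSupported.
function pickMimeType(candidates, recorderSupports, sourceSupports) {
  for (const type of candidates) {
    if (recorderSupports(type) && sourceSupports(type)) return type;
  }
  return null;
}
```

Usage in the page above would look like: pickMimeType(['video/webm; codecs=opus,vp8', 'video/webm'], t => MediaRecorder.isTypeSupported(t), t => MediaSource.isTypeSupported(t)), then pass the result to both the MediaRecorder constructor and addSourceBuffer.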
