Change playout delay in WebRTC stream


Problem description


I'm trying to cast a live MediaStream (eventually from the camera) from peerA to peerB, and I want peerB to receive the live stream in real time and then replay it with an added delay. Unfortunately, it isn't possible to simply pause the stream and resume with play, since it jumps forward to the live moment.


So I have figured out that I can use MediaRecorder + SourceBuffer to rewatch the live stream: record the stream, append the buffers to MSE (SourceBuffer), and play it 5 seconds later. This works great on the local device (stream). But when I try to use MediaRecorder on the receiver's MediaStream (from pc.onaddstream), it looks like it gets some data and is able to append the buffer to the sourceBuffer. However, it does not replay; sometimes I get just one frame.

const [pc1, pc2] = localPeerConnectionLoop()
const canvasStream = canvas.captureStream(200)

videoA.srcObject = canvasStream
videoA.play()

// Note: using two MediaRecorders at the same time seems problematic
// But this one works
// stream2mediaSorce(canvasStream, videoB)
// setTimeout(videoB.play.bind(videoB), 5000)

pc1.addTransceiver(canvasStream.getTracks()[0], {
  streams: [ canvasStream ]
})

pc2.onaddstream = (evt) => {
  videoC.srcObject = evt.stream
  videoC.play()

  // Note: using two MediaRecorders at the same time seems problematic
  // THIS DOES NOT WORK
  stream2mediaSorce(evt.stream, videoD)
  setTimeout(() => videoD.play(), 2000)
}

/**
 * Turn a MediaStream into a SourceBuffer
 * 
 * @param  {MediaStream}      stream   Live Stream to record
 * @param  {HTMLVideoElement} videoElm Video element to play the recorded video in
 * @return {undefined}
 */
function stream2mediaSorce (stream, videoElm) {
  const RECORDER_MIME_TYPE = 'video/webm;codecs=vp9'
  const recorder = new MediaRecorder(stream, { mimeType : RECORDER_MIME_TYPE })

  const mediaSource = new MediaSource()
  videoElm.src = URL.createObjectURL(mediaSource)
  mediaSource.onsourceopen = (e) => {
    const sourceBuffer = mediaSource.addSourceBuffer(RECORDER_MIME_TYPE);

    const fr = new FileReader()
    fr.onerror = console.log
    fr.onload = ({ target }) => {
      console.log(target.result)
      sourceBuffer.appendBuffer(target.result)
    }
    recorder.ondataavailable = ({ data }) => {
      console.log(data)
      fr.readAsArrayBuffer(data)
    }
    setInterval(recorder.requestData.bind(recorder), 1000)
  }

  console.log('Recorder created')
  recorder.start() 
}


Do you know why it won't play the video?


I have created a fiddle with all the necessary code to try it out; the JavaScript tab is the same code as above (the HTML is mostly irrelevant and does not need to be changed).


Some try to reduce the latency, but I actually want to increase it to ~10 seconds, to rewatch something you did wrong in a golf swing, for example, and to avoid MediaRecorder altogether if possible.


I found something called "playout-delay" in some RTC extension


that allows the sender to control the minimum and maximum latency from capture to render time

  • https://webrtc.org/experiments/rtp-hdrext/playout-delay/

How can I use it? Will it be of any help to me?
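For context, the playout-delay header extension is negotiated through an `a=extmap` attribute in the SDP. The following is only a hedged sketch, not part of the original question: `addPlayoutDelayExtmap` is a hypothetical helper name, the extension ID is arbitrary, and whether the browser actually honors the extension is up to the UA.

```javascript
// Sketch: advertise the playout-delay RTP header extension by adding an
// a=extmap line after every m= (media section) line of an SDP string.
// NOTE: addPlayoutDelayExtmap is a hypothetical helper, not a WebRTC API;
// the extension ID (12 here) must not collide with IDs already in the SDP.
const PLAYOUT_DELAY_URI =
  'http://www.webrtc.org/experiments/rtp-hdrext/playout-delay'

function addPlayoutDelayExtmap (sdp, id = 12) {
  if (sdp.includes(PLAYOUT_DELAY_URI)) return sdp // already present
  // Insert the extmap attribute right after each m= line
  return sdp.replace(/^(m=.*\r?\n)/gm, `$1a=extmap:${id} ${PLAYOUT_DELAY_URI}\r\n`)
}
```

It would be applied by munging the offer before `setLocalDescription`, e.g. `offer.sdp = addPlayoutDelayExtmap(offer.sdp)`.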

      Recommended answer


      Update: there is a new feature that will enable this, called playoutDelayHint.


      We want to provide means for javascript applications to set their preferences on how fast they want to render audio or video data. As fast as possible might be beneficial for applications which concentrate on real-time experience. For others, additional data buffering may provide a smoother experience in case of network issues.

      Refs:
      https://discourse.wicg.io/t/hint-attribute-in-webrtc-to-influence-underlying-audio-video-buffering/4038

      https://bugs.chromium.org/p/webrtc/issues/detail?id=10287


      Demo: https://jsfiddle.net/75cnfojy/ (though I was only able to set a max of 10s in my browser; it's up to the UA vendor to do the best it can with the resources available)

      const [pc1, pc2] = localPeerConnectionLoop()
      const canvasStream = canvas.captureStream(200)
      
      videoA.srcObject = canvasStream
      videoA.play()
      
      pc1.addTransceiver(canvasStream.getTracks()[0], {
        streams: [ canvasStream ]
      })
      
      pc2.onaddstream = (evt) => {
        videoC.srcObject = evt.stream
        videoC.play()
      }
      
      $dur.onchange = () => {
        pc2.getReceivers()[0].playoutDelayHint = $dur.valueAsNumber
      }

      <h3 style="border-bottom: 1px solid">Original canvas</h3>
      <canvas id="canvas" width="100" height="100"></canvas>
      <script>
      var canvas = document.getElementById("canvas");
      var ctx = canvas.getContext("2d");
      var radius = canvas.height / 2;
      ctx.translate(radius, radius);
      radius = radius * 0.90
      setInterval(drawClock, 1000);
      
      function drawClock() {
        drawFace(ctx, radius);
        drawNumbers(ctx, radius);
        drawTime(ctx, radius);
      }
      
      function drawFace(ctx, radius) {
        var grad;
        ctx.beginPath();
        ctx.arc(0, 0, radius, 0, 2*Math.PI);
        ctx.fillStyle = 'white';
        ctx.fill();
        grad = ctx.createRadialGradient(0,0,radius*0.95, 0,0,radius*1.05);
        grad.addColorStop(0, '#333');
        grad.addColorStop(0.5, 'white');
        grad.addColorStop(1, '#333');
        ctx.strokeStyle = grad;
        ctx.lineWidth = radius*0.1;
        ctx.stroke();
        ctx.beginPath();
        ctx.arc(0, 0, radius*0.1, 0, 2*Math.PI);
        ctx.fillStyle = '#333';
        ctx.fill();
      }
      
      function drawNumbers(ctx, radius) {
        var ang;
        var num;
        ctx.font = radius*0.15 + "px arial";
        ctx.textBaseline="middle";
        ctx.textAlign="center";
        for(num = 1; num < 13; num++){
          ang = num * Math.PI / 6;
          ctx.rotate(ang);
          ctx.translate(0, -radius*0.85);
          ctx.rotate(-ang);
          ctx.fillText(num.toString(), 0, 0);
          ctx.rotate(ang);
          ctx.translate(0, radius*0.85);
          ctx.rotate(-ang);
        }
      }
      
      function drawTime(ctx, radius){
          var now = new Date();
          var hour = now.getHours();
          var minute = now.getMinutes();
          var second = now.getSeconds();
          //hour
          hour=hour%12;
          hour=(hour*Math.PI/6)+
          (minute*Math.PI/(6*60))+
          (second*Math.PI/(360*60));
          drawHand(ctx, hour, radius*0.5, radius*0.07);
          //minute
          minute=(minute*Math.PI/30)+(second*Math.PI/(30*60));
          drawHand(ctx, minute, radius*0.8, radius*0.07);
          // second
          second=(second*Math.PI/30);
          drawHand(ctx, second, radius*0.9, radius*0.02);
      }
      
      function drawHand(ctx, pos, length, width) {
          ctx.beginPath();
          ctx.lineWidth = width;
          ctx.lineCap = "round";
          ctx.moveTo(0,0);
          ctx.rotate(pos);
          ctx.lineTo(0, -length);
          ctx.stroke();
          ctx.rotate(-pos);
      }
      
      function localPeerConnectionLoop(cfg = {sdpSemantics: 'unified-plan'}) {
        const setD = (d, a, b) => Promise.all([a.setLocalDescription(d), b.setRemoteDescription(d)]);
        return [0, 1].map(() => new RTCPeerConnection(cfg)).map((pc, i, pcs) => Object.assign(pc, {
          onicecandidate: e => e.candidate && pcs[i ^ 1].addIceCandidate(e.candidate),
          onnegotiationneeded: async e => {
            try {
              await setD(await pc.createOffer(), pc, pcs[i ^ 1]);
              await setD(await pcs[i ^ 1].createAnswer(), pcs[i ^ 1], pc);
            } catch (e) {
              console.log(e);
            }
          }
        }));
      }
      </script>
      <h3 style="border-bottom: 1px solid">Local peer (PC1)</h3>
      <video id="videoA" muted width="100" height="100"></video>
      
      <h3 style="border-bottom: 1px solid">Remote peer (PC2)</h3>
      <video id="videoC" muted width="100" height="100"></video>
      <label> Change playoutDelayHint
      <input type="number" value="1" id="$dur">
      </label>
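The demo sets the hint on the first receiver only. As a hedged sketch (the `applyPlayoutDelay` helper name is my own, and `playoutDelayHint` is a non-standard Chromium-only property that other browsers may simply ignore), the delay could be applied to every receiver on the connection:

```javascript
// Sketch: set playoutDelayHint (in seconds) on every RTP receiver of a
// connection. playoutDelayHint is a non-standard Chromium hint; browsers
// that don't implement it will just store the property with no effect.
function applyPlayoutDelay (receivers, seconds) {
  let updated = 0
  for (const receiver of receivers) {
    receiver.playoutDelayHint = seconds
    updated++
  }
  return updated // number of receivers the hint was set on
}
```

In the demo above, this would replace the single-receiver line: `applyPlayoutDelay(pc2.getReceivers(), $dur.valueAsNumber)`.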
