MediaStream Capture Canvas and Audio Simultaneously




Question

I'm working on a project in which I'd like to:

  1. Load a video js and display it on the canvas.
  2. Use filters to alter the appearance of the canvas (and therefore the video).
  3. Use the MediaStream captureStream() method and a MediaRecorder object to record the surface of the canvas and the audio of the original video.
  4. Play the stream of both the canvas and the audio in an HTML video element.

I've been able to display the canvas recording in a video element by tweaking this WebRTC demo code: https://webrtc.github.io/samples/src/content/capture/canvas-record/

That said, I can't figure out how to record the video's audio alongside the canvas. Is it possible to create a MediaStream containing MediaStreamTrack instances from two different sources/elements?

According to the MediaStream API's specs there should theoretically be some way to accomplish this: https://w3c.github.io/mediacapture-main/#introduction

"The two main components in the MediaStream API are the MediaStreamTrack and MediaStream interfaces. The MediaStreamTrack object represents media of a single type that originates from one media source in the User Agent, e.g. video produced by a web camera. A MediaStream is used to group several MediaStreamTrack objects into one unit that can be recorded or rendered in a media element."

Solution

Is it possible to create a MediaStream containing MediaStreamTrack instances from two different sources/elements?

Yes, you can do it using the MediaStream.addTrack() method.
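As a minimal sketch of that answer (browser API; `mixStreams`, `stream1`, and `stream2` are placeholder names of my own, e.g. a canvas capture stream and a Web Audio destination stream):

```javascript
// Minimal sketch: merge the tracks of two MediaStreams into one.
// stream1 and stream2 are placeholders, e.g. canvas.captureStream(30)
// and audioCtx.createMediaStreamDestination().stream respectively.
function mixStreams(stream1, stream2) {
  // Keep stream1's own tracks (here: the video),
  // then append every audio track from stream2.
  stream2.getAudioTracks().forEach(function (track) {
    stream1.addTrack(track);
  });
  return stream1; // now carries both video and audio tracks
}
```

The merged stream can then be handed straight to `new MediaRecorder(mixedStream)`.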

But Firefox will only use the initial stream's tracks in the recorder until this bug has been fixed.


The OP already knows how to get all of this, but here is a reminder for future readers:

  • To get a video MediaStreamTrack from the canvas, you can call the canvas.captureStream(framerate) method.

  • To get an audio MediaStreamTrack from a video element, you can use the Web Audio API and its createMediaStreamDestination method. This will return a MediaStreamAudioDestinationNode (dest) containing our audio stream. You'll then have to connect a MediaElementAudioSourceNode created from your video element to this dest. If you need to add more audio tracks to this stream, connect all of those sources to dest.
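The two bullets above can be condensed into one helper. This is only a sketch assuming a browser context; the name `buildMixedStream` is mine, not part of any API:

```javascript
// Sketch: combine a canvas's video track with a <video> element's audio.
// buildMixedStream is a hypothetical helper name, not a standard API.
function buildMixedStream(canvas, videoElement, audioCtx) {
  // 1. Video track from the canvas surface (30 fps here).
  var canvasStream = canvas.captureStream(30);

  // 2. Audio track via the Web Audio API.
  var dest = audioCtx.createMediaStreamDestination(); // exposes dest.stream
  var source = audioCtx.createMediaElementSource(videoElement);
  source.connect(dest);                 // route the video's audio into the stream
  source.connect(audioCtx.destination); // keep it audible on the speakers too

  // 3. Merge: append the audio track to the canvas stream.
  canvasStream.addTrack(dest.stream.getAudioTracks()[0]);
  return canvasStream;
}
```

Note that once createMediaElementSource is used, the element's audio only reaches the speakers through the graph, hence the extra connect to audioCtx.destination.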

Now that we've got two streams, one for the canvas video and one for the audio, we can use canvasStream.addTrack(audioStream.getAudioTracks()[0]) just before initializing our new MediaRecorder(canvasStream).

Here is a complete example that will work only in Chrome for now, and probably soon in Firefox, once they have fixed the bug:

var cStream,
  aStream,
  vid,
  recorder,
  analyser,
  dataArray,
  bufferLength,
  chunks = [];

function clickHandler() {

  this.textContent = 'stop recording';
  cStream = canvas.captureStream(30);
  cStream.addTrack(aStream.getAudioTracks()[0]);

  recorder = new MediaRecorder(cStream);
  recorder.start();

  recorder.ondataavailable = saveChunks;

  recorder.onstop = exportStream;

  this.onclick = stopRecording;

};

function exportStream(e) {

  if (chunks.length) {

    var blob = new Blob(chunks, { type: 'video/webm' });
    var vidURL = URL.createObjectURL(blob);
    var vid = document.createElement('video');
    vid.controls = true;
    vid.src = vidURL;
    vid.onended = function() {
      URL.revokeObjectURL(vidURL);
    };
    document.body.insertBefore(vid, canvas);

  } else {

    document.body.insertBefore(document.createTextNode('no data saved'), canvas);

  }
}

function saveChunks(e) {

  e.data.size && chunks.push(e.data);

}

function stopRecording() {

  vid.pause();
  this.parentNode.removeChild(this);
  recorder.stop();

}

function initAudioStream(evt) {

  var audioCtx = new AudioContext();
  // create a stream from our AudioContext
  var dest = audioCtx.createMediaStreamDestination();
  aStream = dest.stream;
  // connect our video element's output to the stream
  var sourceNode = audioCtx.createMediaElementSource(this);
  sourceNode.connect(dest);
  // start the video
  this.play();

  // just for the fancy canvas drawings
  analyser = audioCtx.createAnalyser();
  sourceNode.connect(analyser);

  analyser.fftSize = 2048;
  bufferLength = analyser.frequencyBinCount;
  dataArray = new Uint8Array(bufferLength);
  analyser.getByteTimeDomainData(dataArray);

  // output to our headphones
  sourceNode.connect(audioCtx.destination);

  startCanvasAnim();

  rec.onclick = clickHandler;
  rec.disabled = false;
};

var loadVideo = function() {

  vid = document.createElement('video');
  vid.crossOrigin = 'anonymous';
  vid.oncanplay = initAudioStream;
  vid.src = 'https://dl.dropboxusercontent.com/s/bch2j17v6ny4ako/movie720p.mp4';
}

function startCanvasAnim() {
  // from MDN https://developer.mozilla.org/en/docs/Web/API/AnalyserNode#Examples
  var canvasCtx = canvas.getContext('2d');

  canvasCtx.fillStyle = 'rgb(200, 200, 200)';
  canvasCtx.lineWidth = 2;
  canvasCtx.strokeStyle = 'rgb(0, 0, 0)';

  var draw = function() {

    var drawVisual = requestAnimationFrame(draw);

    analyser.getByteTimeDomainData(dataArray);

    canvasCtx.fillRect(0, 0, canvas.width, canvas.height);
    canvasCtx.beginPath();

    var sliceWidth = canvas.width * 1.0 / bufferLength;
    var x = 0;

    for (var i = 0; i < bufferLength; i++) {

      var v = dataArray[i] / 128.0;
      var y = v * canvas.height / 2;

      if (i === 0) {
        canvasCtx.moveTo(x, y);
      } else {
        canvasCtx.lineTo(x, y);
      }

      x += sliceWidth;
    }

    canvasCtx.lineTo(canvas.width, canvas.height / 2);
    canvasCtx.stroke();

  };

  draw();

}

loadVideo();

<canvas id="canvas" width="500" height="200"></canvas>
<button id="rec" disabled>record</button>


PS: Since the FF team seems to be taking some time to fix the bug, here is a quick fix to make it work on FF too.

You can also mix the two tracks by using new MediaStream([track1, track2]).
However, Chrome currently prefixes this constructor; but since it does support addTrack, the constructor isn't really needed there, and we can come up with something as ugly as:

var mixedStream = 'MediaStream' in window ? 
  new MediaStream([cStream.getVideoTracks()[0], aStream.getAudioTracks()[0]]) :
  cStream;
recorder = new MediaRecorder(mixedStream);

Working fiddle for both FF and Chrome.
