MediaStream Capture Canvas and Audio Simultaneously


Question

I'm working on a project in which I'd like to:

  1. Load a video and display it on the canvas.
  2. Use filters to alter the appearance of the canvas (and therefore the video).
  3. Use the MediaStream captureStream() method and a MediaRecorder object to record the surface of the canvas and the audio of the original video.
  4. Play the stream of both the canvas and the audio in an HTML video element.

I've been able to display the canvas recording in a video element by tweaking this WebRTC demo code: https://webrtc.github.io/samples/src/content/capture/canvas-record/

That said, I can't figure out how to record the video's audio alongside the canvas. Is it possible to create a MediaStream containing MediaStreamTrack instances from two different sources/elements?

According to the MediaStream API's specs there should theoretically be some way to accomplish this: https://w3c.github.io/mediacapture-main/#introduction

"The two main components in the MediaStream API are the MediaStreamTrack and MediaStream interfaces. The MediaStreamTrack object represents media of a single type that originates from one media source in the User Agent, e.g. video produced by a web camera. A MediaStream is used to group several MediaStreamTrack objects into one unit that can be recorded or rendered in a media element."

Solution

Is it possible to create a MediaStream containing MediaStreamTrack instances from two different sources/elements?

Yes, you can do it using the MediaStream.addTrack() method.

But Firefox will only record the initial stream's tracks until this bug has been fixed.


The OP already knows how to get all of this, but here is a reminder for future readers:

  • To get a video stream track from the canvas, you can call the canvas.captureStream(framerate) method (see the sketch after this list).

  • To get an audio stream track from a video element, you can use the Web Audio API and its createMediaStreamDestination method. This returns a MediaStreamAudioDestinationNode (dest) whose stream property contains our audio stream. You'll then have to connect a MediaElementAudioSourceNode, created from your video element, to this dest. If you need to add more audio tracks to this stream, you should connect all of those sources to dest.
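
A minimal sketch of these two capture steps, assuming a canvas element and a video element already exist in the page (the canvas and video variables below stand for them):

var canvasStream = canvas.captureStream(30); // video track rendered from the canvas, at 30 fps

var audioCtx = new AudioContext();
// dest.stream is a MediaStream whose single track is our audio
var dest = audioCtx.createMediaStreamDestination();
var sourceNode = audioCtx.createMediaElementSource(video);
sourceNode.connect(dest); // route the element's audio into dest.stream
sourceNode.connect(audioCtx.destination); // optional: keep the audio audible
var audioStream = dest.stream;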

Now that we've got two streams, one for the canvas video and one for the audio, we can use canvasStream.addTrack(audioStream.getAudioTracks()[0]) just before initializing our new MediaRecorder(canvasStream).

Here is a complete example; it currently works only in Chrome, and will probably work in Firefox soon, once they have fixed the bug:

var cStream,
  aStream,
  vid,
  recorder,
  analyser,
  dataArray,
  bufferLength,
  chunks = [];

function clickHandler() {

  this.textContent = 'stop recording';
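
  // capture the canvas at 30 fps and add the video's audio track to the same stream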
  cStream = canvas.captureStream(30);
  cStream.addTrack(aStream.getAudioTracks()[0]);

  recorder = new MediaRecorder(cStream);
  recorder.start();

  recorder.ondataavailable = saveChunks;

  recorder.onstop = exportStream;

  this.onclick = stopRecording;

};

function exportStream(e) {

  if (chunks.length) {
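    // concatenate the recorded chunks into one Blob and play it back in a new video element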

    var blob = new Blob(chunks);
    var vidURL = URL.createObjectURL(blob);
    var vid = document.createElement('video');
    vid.controls = true;
    vid.src = vidURL;
    vid.onended = function() {
      URL.revokeObjectURL(vidURL);
    };
    document.body.insertBefore(vid, canvas);

  } else {

    document.body.insertBefore(document.createTextNode('no data saved'), canvas);

  }
}

function saveChunks(e) {
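  // MediaRecorder's ondataavailable handler: keep only non-empty chunks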

  e.data.size && chunks.push(e.data);

}

function stopRecording() {
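  // pause the source video, remove the record/stop button, and finalize the recording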

  vid.pause();
  this.parentNode.removeChild(this);
  recorder.stop();

}

function initAudioStream(evt) {

  var audioCtx = new AudioContext();
  // create a stream from our AudioContext
  var dest = audioCtx.createMediaStreamDestination();
  aStream = dest.stream;
  // connect our video element's output to the stream
  var sourceNode = audioCtx.createMediaElementSource(this);
  sourceNode.connect(dest);
  // start the video
  this.play();

  // just for the fancy canvas drawings
  analyser = audioCtx.createAnalyser();
  sourceNode.connect(analyser);

  analyser.fftSize = 2048;
  bufferLength = analyser.frequencyBinCount;
  dataArray = new Uint8Array(bufferLength);
  analyser.getByteTimeDomainData(dataArray);

  // output to our headphones
  sourceNode.connect(audioCtx.destination);

  startCanvasAnim();

  rec.onclick = clickHandler;
  rec.disabled = false;
};

var loadVideo = function() {

  vid = document.createElement('video');
  vid.crossOrigin = 'anonymous';
  vid.oncanplay = initAudioStream;
  vid.src = 'https://dl.dropboxusercontent.com/s/bch2j17v6ny4ako/movie720p.mp4';
}

function startCanvasAnim() {
  // from MDN https://developer.mozilla.org/en/docs/Web/API/AnalyserNode#Examples
  var canvasCtx = canvas.getContext('2d');

  canvasCtx.fillStyle = 'rgb(200, 200, 200)';
  canvasCtx.lineWidth = 2;
  canvasCtx.strokeStyle = 'rgb(0, 0, 0)';

  var draw = function() {

    var drawVisual = requestAnimationFrame(draw);

    analyser.getByteTimeDomainData(dataArray);

    canvasCtx.fillRect(0, 0, canvas.width, canvas.height);
    canvasCtx.beginPath();

    var sliceWidth = canvas.width * 1.0 / bufferLength;
    var x = 0;

    for (var i = 0; i < bufferLength; i++) {

      var v = dataArray[i] / 128.0;
      var y = v * canvas.height / 2;

      if (i === 0) {
        canvasCtx.moveTo(x, y);
      } else {
        canvasCtx.lineTo(x, y);
      }

      x += sliceWidth;
    }

    canvasCtx.lineTo(canvas.width, canvas.height / 2);
    canvasCtx.stroke();

  };

  draw();

}

loadVideo();

<canvas id="canvas" width="500" height="200"></canvas>
<button id="rec" disabled>record</button>


P.S.: Since the FF team seems to be taking some time to fix the bug, here is a quick fix to make it work on FF too.

You can also mix the two tracks by using new MediaStream([track1, track2]).
However, Chrome currently prefixes this constructor; since it does support addTrack, the constructor isn't really needed there, and we can come up with something as ugly as:

var mixedStream = 'MediaStream' in window ? 
  new MediaStream([cStream.getVideoTracks()[0], aStream.getAudioTracks()[0]]) :
  cStream;
recorder = new MediaRecorder(mixedStream);

Working fiddle for both FF and Chrome.
