Is it possible to add a stream as source to an HTML canvas element, as to an HTML video element?

Question

According to MDN:

The HTMLMediaElement interface adds to HTMLElement the properties and methods needed to support basic media-related capabilities that are common to audio and video.

There is HTMLMediaElement.captureStream(), which can be used with both <video> and <canvas> elements to capture their stream.
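
For reference, that capture direction looks roughly like this (a minimal sketch; the element selectors and the canvas frame rate are illustrative, not part of the question):

// Capturing a MediaStream FROM existing elements.
const videoStream = document.querySelector("video").captureStream();     // HTMLMediaElement.captureStream()
const canvasStream = document.querySelector("canvas").captureStream(30); // HTMLCanvasElement.captureStream(frameRate)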

Conversely, one can add a video stream as srcObject to a <video> element, and it then displays it. Is this possible for a <canvas> element too?
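
The srcObject approach described above is, roughly (a minimal sketch assuming a camera stream from getUserMedia; the code is illustrative, not part of the original question):

// Attaching a MediaStream TO a <video> element: the element decodes and displays it.
navigator.mediaDevices.getUserMedia({ video: true }).then((stream) => {
  const video = document.querySelector("video");
  video.srcObject = stream;
  video.play();
});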

Is it possible to add a stream as source to an HTML <canvas> element?

Answer

No, there is nothing in any of the Canvas APIs able to consume a MediaStream.

The canvas APIs work only with raw pixels and contain no decoder of any sort. You must use either JavaScript objects that are able to do this decoding (e.g. ImageBitmap) or HTMLElements.

So in the case of a MediaStream, currently the only object able to decode its video content is an HTMLVideoElement, which you'll then be able to draw on your canvas easily.
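
For completeness, the video-element route can be sketched like this (assuming a MediaStream already stored in a variable called stream; the variable names are illustrative and not from the original answer):

// Route the MediaStream through an off-screen <video> element,
// then copy each decoded frame onto the canvas with drawImage().
const video = document.createElement("video");
video.srcObject = stream;   // `stream` is assumed to be an existing MediaStream
video.muted = true;         // required for autoplay in most browsers
video.addEventListener("loadedmetadata", () => {
  const canvas = document.querySelector("canvas");
  const ctx = canvas.getContext("2d");
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  video.play();
  (function draw() {
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    requestAnimationFrame(draw);
  })();
});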

The WebCodecs API has made great progress recently and is now mature enough to be worth mentioning as a solution.

This API offers a new interface called VideoFrame, which will soon be part of the CanvasImageSource type, meaning we can use it directly with drawImage, texImage2D, and anywhere else such a CanvasImageSource can be used.
The MediaCapture Transform W3C group has developed a MediaStreamTrackProcessor, which returns such VideoFrames from a video MediaStreamTrack.

So we now have a more direct way to render a MediaStream to a canvas, which currently only works in Chrome with the #enable-experimental-web-platform-features flag enabled...

if( window.MediaStreamTrackProcessor ) {
  const canvas = document.querySelector("canvas");
  const ctx = canvas.getContext("2d");
  const track = getCanvasTrack(); // MediaStream.getVideoTracks()[0]
  const processor = new MediaStreamTrackProcessor( { track } ); // the constructor takes an init dictionary holding the track
  const reader = processor.readable.getReader();
  readChunk();
  function readChunk() {
    reader.read().then( ({ done, value }) => {
      // when the stream ends, `value` is undefined, so guard before using it
      if( value ) {
        // the MediaStream video can have dynamic size
        if( canvas.width !== value.displayWidth || canvas.height !== value.displayHeight ) {
          canvas.width = value.displayWidth;
          canvas.height = value.displayHeight;
        }
        ctx.clearRect( 0, 0, canvas.width, canvas.height );
        // value is a VideoFrame
        ctx.drawImage( value, 0, 0 );
        value.close(); // close the VideoFrame when we're done with it
      }
      if( !done ) {
        readChunk();
      }
    });
  }
}
else {
  console.error("Your browser doesn't support this API yet");
}

// We can't use getUserMedia in StackSnippets
// So here we use a simple canvas as source
// for our MediaStream.
function getCanvasTrack() {
  // just some noise...
  const canvas = document.createElement("canvas");
  const ctx = canvas.getContext("2d");
  const img = new ImageData(300, 150);
  const data = new Uint32Array(img.data.buffer);
  const track = canvas.captureStream().getVideoTracks()[0];

  anim();
  
  return track;
  
  function anim() {
    for( let i=0; i<data.length;i++ ) {
      data[i] = Math.random() * 0xFFFFFF + 0xFF000000;
    }
    ctx.putImageData(img, 0, 0);
    if( track.readyState === "live" ) {
      requestAnimationFrame(anim);
    }
  }
  
}

<canvas></canvas>

The same approach is also available as a glitch project (source) using the camera as the source.
