Stream local mjpg video to html canvas


Question

I am trying to write a live stream of an mjpg video to an html canvas.

The following site: http://camelive.info/ has a list of public webcams with mjpeg videos, but they seem to be writing <frameset> tags with frame elements, and I can't work out how it's working in a fiddle.

The ideal solution has any live mjpg (ideally a link?) streaming on an html canvas in a fiddle.

Any helpful resources are appreciated. I would like to do this without including external libraries (jQuery allowed).

Related: How to make a snapshot from an MJPEG stream in HTML

I too have a local mjpg to draw from, like the example. The solution can use a local stream.

Answer

According to the specs about the CanvasRenderingContext2D drawImage method:

Specifically, when a CanvasImageSource object represents an animated image in an HTMLImageElement, the user agent must use the default image of the animation (the one that the format defines is to be used when animation is not supported or is disabled), or, if there is no such image, the first frame of the animation, when rendering the image for CanvasRenderingContext2D APIs.

This applies to .gif, SMIL animated .svg, and .mjpeg media. So once you have fetched the data, only one frame should be drawn onto the canvas.

Note that Chrome has a bug and only respects this for .gif images, but they may fix it someday.

One solution, as you noticed yourself, is to fetch another fresh frame with the clear-cache hack ('your.url/?' + new Date().getTime();), but you will lose any advantage of the mjpeg format (partial frame content) and can't be sure when the refresh will happen.
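To illustrate that cache-busting hack, here is a minimal sketch; the `cacheBustUrl` helper name is my own, not from the original answer, and it simply appends a timestamp query parameter so each request bypasses the cache:

```javascript
// Hypothetical helper: build a cache-busting URL by appending a
// timestamp, as in the 'your.url/?' + new Date().getTime() trick.
function cacheBustUrl(url, now) {
  // Append with '&' if the URL already has a query string, '?' otherwise.
  var sep = url.indexOf('?') === -1 ? '?' : '&';
  // Allow injecting a fixed timestamp (useful for testing); default to now.
  return url + sep + (now !== undefined ? now : Date.now());
}

// Usage sketch (browser only): refresh an <img> every 500 ms.
// setInterval(function() {
//   img.src = cacheBustUrl('http://example.com/cam.mjpg');
// }, 500);
```

Note that each tick triggers a full new HTTP request for a complete frame, which is exactly the network cost the answer warns about.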

So a better solution, if applicable, would be to use a video format. Each frame of a video can be drawn to the canvas.

Edit 2018

A third solution came to my little mind two years later:

UAs are not tied to keep in memory the same default image for all 2DContexts in the document.
While for other formats we are still kinda stuck, for MJPEG streams, which don't have a well-defined default image, we actually fall back to the first frame of the animation.

So by drawing the <img> containing our MJPEG stream on two different canvases, at different times, we can theoretically have two different frames of our same MJPEG stream drawn on the canvases.

Here is a proof of concept, only tested on Firefox 62.

// the original snippet relied on element ids becoming globals;
// look them up explicitly so the code is self-contained
var stream = document.getElementById('stream');
var direct = document.getElementById('direct');
var img = document.getElementById('img');

var ctx_stream = stream.getContext('2d');
var ctx_direct = direct.getContext('2d');

img.onload = function() {
  stream.width = direct.width = this.naturalWidth;
  stream.height = direct.height = this.naturalHeight;
  // onload should fire multiple times,
  // but it seems it's not at every frame,
  // so we disable it and use an interval instead
  this.onload = null;
  setInterval(draw, 500);
};

function draw() {
  // create a *new* canvas and 2DContext for this frame
  var ctx_off = stream.cloneNode().getContext('2d');
  ctx_off.drawImage(img, 0, 0);
  // and draw it back to our visible one
  ctx_stream.drawImage(ctx_off.canvas, 0, 0);

  // draw the img directly on 'direct'
  ctx_direct.drawImage(img, 0, 0);
}

img.src = "http://webcam.st-malo.com/axis-cgi/mjpg/video.cgi?resolution=704x576&dummy=1491717369754";

canvas,img{
  max-height: 75vh;
}

Using a new offcreen canvas every frame: <br><canvas id="stream"></canvas><br>
The original image: <br><img id="img"><br>
Drawing directly the &lt;img> (if this works your browser doesn't follow the specs): <br><canvas id="direct"></canvas><br>

So while this solution obviously comes with a performance impact (we are creating a whole new canvas element and its 2DContext every frame), it's still probably better than flooding the network. And all of this should be garbage-collected quite easily anyway.
