ffmpeg: Render webm from stdin using NodeJS


Question

I'm having an issue trying to dump some jpeg frames created on the fly to ffmpeg and NodeJS in order to create a webm video.

The script tries to do the following:


  • Fork a new ffmpeg process on initialization
  • Render a canvas
  • Once the data in the canvas is updated, grab JPEG data from it.
  • Pipe the JPEG data into the ffmpeg stdin.
  • ffmpeg takes care of appending it to a webm video file.
  • This goes on forever, and ffmpeg should never stop.

It should be an ever-growing video, broadcast live to all connected clients, but the result I get is just a single-frame webm.

This is the ffmpeg fork:

// child_process is needed for spawn
var spawn = require('child_process').spawn;

var args = '-f image2pipe -r 15 -vcodec mjpeg -s 160x144 -i - -f webm -r 15 test.webm'.split(' ');
var encoder = spawn('ffmpeg', args);
encoder.stderr.pipe(process.stdout);

And this is the canvas update and pipe:

theCanvas.on('draw', function () {
    var readStream = self.canvas.jpegStream();
    readStream.pipe(self.encoder.stdin);
});

The ffmpeg output:

ffmpeg version 1.2.6-7:1.2.6-1~trusty1 Copyright (c) 2000-2014 the FFmpeg developers
  built on Apr 26 2014 18:52:58 with gcc 4.8 (Ubuntu 4.8.2-19ubuntu1)
  configuration: --arch=amd64 --disable-stripping --enable-avresample --enable-pthreads --enable-runtime-cpudetect --extra-version='7:1.2.6-1~trusty1' --libdir=/usr/lib/x86_64-linux-gnu --prefix=/usr --enable-bzlib --enable-libdc1394 --enable-libfreetype --enable-frei0r --enable-gnutls --enable-libgsm --enable-libmp3lame --enable-librtmp --enable-libopencv --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-vaapi --enable-vdpau --enable-libvorbis --enable-libvpx --enable-zlib --enable-gpl --enable-postproc --enable-libcdio --enable-x11grab --enable-libx264 --shlibdir=/usr/lib/x86_64-linux-gnu --enable-shared --disable-static
  libavutil      52. 18.100 / 52. 18.100
  libavcodec     54. 92.100 / 54. 92.100
  libavformat    54. 63.104 / 54. 63.104
  libavdevice    53.  5.103 / 53.  5.103
  libavfilter     3. 42.103 /  3. 42.103
  libswscale      2.  2.100 /  2.  2.100
  libswresample   0. 17.102 /  0. 17.102
  libpostproc    52.  2.100 / 52.  2.100
[image2pipe @ 0xee0740] Estimating duration from bitrate, this may be inaccurate
Input #0, image2pipe, from 'pipe:':
  Duration: N/A, bitrate: N/A
    Stream #0:0: Video: mjpeg, yuvj420p, 160x144 [SAR 1:1 DAR 10:9], 15 tbr, 15 tbn, 15 tbc
[libvpx @ 0xec5d00] v1.3.0
Output #0, webm, to 'test.webm':
  Metadata:
    encoder         : Lavf54.63.104
    Stream #0:0: Video: vp8, yuv420p, 160x144 [SAR 1:1 DAR 10:9], q=-1--1, 200 kb/s, 1k tbn, 15 tbc
Stream mapping:
  Stream #0:0 -> #0:0 (mjpeg -> libvpx)
pipe:: Input/output error
frame=    1 fps=0.0 q=0.0 Lsize=      12kB time=00:00:00.06 bitrate=1441.1kbits/s    
video:11kB audio:0kB subtitle:0 global headers:0kB muxing overhead 4.195804%

What should I do?

Thanks,
Vinicius

Answer

The pipe is being closed after the first frame of data is sent. I have been having a similar problem and this got me part way to fixing it. Hope this helps and that I'm not too late.

theCanvas.on('draw', function () {
    var readStream = self.canvas.jpegStream();
    // {end: false} keeps encoder.stdin open after this frame finishes
    readStream.pipe(self.encoder.stdin, {end: false});
});
