Is it possible to send FFmpeg images by using a pipe?


Question

I want to send images as input to FFmpeg, and I want FFmpeg to output video to a stream (WebRTC format).

I found some information suggesting that this is possible. I believe FFmpeg can receive images from a pipe; does anyone know how this can be done?

Answer

"I want to send images as input to FFmpeg... I believe that FFmpeg could receive images from a pipe, does anyone know how this can be done?"

Yes, it is possible to send FFmpeg images by using a pipe. Use the standardInput to send frames. The frame data must be uncompressed pixel values (e.g. 24-bit RGB format) in a byte array that holds enough bytes (width × height × 3) to write a full frame.
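To make the "enough bytes" rule concrete, here is a minimal Python sketch of the per-frame byte counts (the 800x600 size comes from the answer's example command; the red-frame buffer is an illustrative assumption):

```python
width, height = 800, 600
bytes_per_pixel_rgb24 = 3   # 24-bit RGB: one byte each for R, G, B
bytes_per_pixel_argb = 4    # 32-bit ARGB: alpha byte plus R, G, B

frame_size_rgb24 = width * height * bytes_per_pixel_rgb24  # bytes per rgb24 frame
frame_size_argb = width * height * bytes_per_pixel_argb    # bytes per argb frame

# A solid-red rgb24 frame as a raw byte buffer
red_frame = bytes([255, 0, 0]) * (width * height)
assert len(red_frame) == frame_size_rgb24
```

Every write to the pipe must supply exactly this many bytes, or FFmpeg's frame boundaries will drift out of alignment.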

Normally (in a Command or Terminal window) you set input and output as:

ffmpeg -i inputvid.mp4 outputvid.mp4

But for pipes you must first specify the incoming input's width/height, frame rate, etc. Then also add the incoming input's filename as -i - (where the blank - means FFmpeg watches the standardInput connection for incoming raw pixel data).

You must put your frame data into some Bitmap object and send the bitmap's pixel values as a byte array. Each send will be encoded as a new video frame. Example pseudo-code:

public function makeVideoFrame ( frame_BMP:Bitmap ) : void
{
    //# Encodes the byte array of a Bitmap object as an FFmpeg video frame
    if ( myProcess.running == true )
    {
        Frame_Bytes = frame_BMP.getBytes(); //# read pixel values into a byte array
        myProcess.standardInput.writeBytes(Frame_Bytes); //# send data to FFmpeg to encode a new frame
        Frame_Bytes.clear(); //# empty the byte array for re-use with the next frame
    }
}
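A rough Python equivalent of the pseudo-code above, feeding raw frames to FFmpeg's standard input via subprocess (rgb24 is used here instead of argb so the frame size matches the width × height × 3 rule stated earlier; the filename and 800x600/25fps settings mirror the answer's example, and the grayscale frames are an assumption for illustration):

```python
import shutil
import subprocess

WIDTH, HEIGHT, FPS = 800, 600, 25

# Same shape of argument list as the answer: raw frames arrive on stdin ("-i -").
cmd = [
    "ffmpeg", "-y",
    "-f", "rawvideo", "-pix_fmt", "rgb24",
    "-s", f"{WIDTH}x{HEIGHT}", "-r", str(FPS),
    "-i", "-",                      # "-" = read frame data from standard input
    "-c:v", "libx264", "-profile:v", "baseline",
    "-an", "out_vid.h264",
]

def make_frame(shade: int) -> bytes:
    """One uncompressed grayscale rgb24 frame: WIDTH * HEIGHT * 3 bytes."""
    return bytes([shade, shade, shade]) * (WIDTH * HEIGHT)

def stream_one_second() -> None:
    """Pipe 25 frames (one second of video) into FFmpeg and wait for it to finish."""
    proc = subprocess.Popen(cmd, stdin=subprocess.PIPE)
    for i in range(FPS):
        proc.stdin.write(make_frame(shade=i * 10))  # each write is one new frame
    proc.stdin.close()
    proc.wait()

if __name__ == "__main__" and shutil.which("ffmpeg"):
    stream_one_second()
```

Each complete write of WIDTH × HEIGHT × 3 bytes becomes one encoded frame, matching the "each send is a new frame" behavior described above.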

Any time you update your bitmap with new pixel information, you can write it out as a new frame by passing the bitmap to the above function, e.g. makeVideoFrame(my_new_frame_BMP);.

Your pipe's Process must start with these arguments:

-y -f rawvideo -pix_fmt argb -s 800x600 -r 25 -i - ....etc

Where...

  • -f rawvideo -pix_fmt argb means accept uncompressed 32-bit ARGB pixel data (if you send 24-bit RGB frames of width × height × 3 bytes, use -pix_fmt rgb24 instead).

  • -s 800x600 and -r 25 are an example input width & height; -r sets the frame rate, meaning FFmpeg must encode this many images per second of output video.

A full setup looks like this:

-y -f rawvideo -pix_fmt argb -s 800x600 -r 25 -i - -c:v libx264 -profile:v baseline -level:v 3 -b:v 2500 -an out_vid.h264

If you get blocky video output, try setting two output files...

-y -f rawvideo -pix_fmt argb -s 800x600 -r 25 -i - -c:v libx264 -profile:v baseline -level:v 3 -b:v 2500 -an out_tempData.h264 out_vid.h264

This will output a test H.264 video file which you can later put inside an MP4 container.
The audio track (-i someTrack.mp3) is optional.

-i myH264vid.h264 -i someTrack.mp3 outputVid.mp4
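The muxing step above can likewise be scripted; a sketch under the answer's filenames (the -c copy flag is an assumption added here to wrap the streams without re-encoding, and the file-existence checks are guards so the sketch only runs when the inputs are actually present):

```python
import shutil
import subprocess
from pathlib import Path

# Wrap the raw H.264 stream in an MP4 container, adding the optional audio track.
mux_cmd = [
    "ffmpeg", "-y",
    "-i", "myH264vid.h264",
    "-i", "someTrack.mp3",
    "-c", "copy",        # assumption: copy both streams instead of re-encoding
    "outputVid.mp4",
]

if (shutil.which("ffmpeg")
        and Path("myH264vid.h264").exists()
        and Path("someTrack.mp3").exists()):
    subprocess.run(mux_cmd, check=True)
```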
