Rtmp streaming via gstreamer-1.0 appsrc to rtmpsink

Problem description

I am trying to stream my webcam via RTMP. I tried to stream data via the following pipeline:

gst-launch-1.0 -v v4l2src ! 'video/x-raw, width=640, height=480, framerate=30/1' ! queue ! videoconvert ! omxh264enc ! h264parse ! flvmux ! rtmpsink location='rtmp://{MY_IP}/rtmp/live'

and it works like a charm. I can see the video on my website.

Then I wanted to capture the frames first and do some processing. I streamed the processed data by pushing it into appsrc and sending it through the same pipeline as before, but some problems occurred.

I cannot see any streaming on my website. Neither the server side nor the client raises any error or warning. Nevertheless, I can still retrieve the stream with:

gst-launch-1.0 rtmpsrc location='rtmp://{MY_IP}/rtmp/live' ! filesink location='rtmpsrca.flv'

Does anyone have any idea about this?

Here are the snippets of my website and my GStreamer pipeline.

GStreamer pipeline:

void threadgst () {

    App *app = &s_app;
    GstCaps *srccap;
    GstCaps *filtercap;
    GstBus *bus;
    GstElement *pipeline;

    gst_init (NULL, NULL);

    loop = g_main_loop_new (NULL, TRUE);

    // create the pipeline:
    pipeline = gst_pipeline_new ("gstreamer-encoder");
    if (!pipeline) {
        g_print ("Error creating Pipeline, exiting...");
    }

    // create the appsrc element:
    app->videosrc = gst_element_factory_make ("appsrc", "videosrc");
    if (!app->videosrc) {
        g_print ("Error creating source element, exiting...");
    }

    // create the queue element:
    app->queue = gst_element_factory_make ("queue", "queue");
    if (!app->queue) {
        g_print ("Error creating queue element, exiting...");
    }

    // create the videoconvert element:
    app->videocoverter = gst_element_factory_make ("videoconvert", "videocoverter");
    if (!app->videocoverter) {
        g_print ("Error creating videocoverter, exiting...");
    }

    // create the capsfilter element:
    app->filter = gst_element_factory_make ("capsfilter", "filter");
    if (!app->filter) {
        g_print ("Error creating filter, exiting...");
    }

    app->h264enc = gst_element_factory_make ("omxh264enc", "h264enc");
    if (!app->h264enc) {
        g_print ("Error creating omxh264enc, exiting...");
    }

    app->h264parse = gst_element_factory_make ("h264parse", "h264parse");
    if (!app->h264parse) {
        g_print ("Error creating h264parse, exiting...");
    }

    app->flvmux = gst_element_factory_make ("flvmux", "flvmux");
    if (!app->flvmux) {
        g_print ("Error creating flvmux, exiting...");
    }

    app->rtmpsink = gst_element_factory_make ("rtmpsink", "rtmpsink");
    if (!app->rtmpsink) {
        g_print ("Error creating rtmpsink, exiting...");
    }

    g_print ("Elements are created\n");

    g_object_set (G_OBJECT (app->rtmpsink), "location", "rtmp://192.168.3.107/rtmp/live live=1", NULL);

    g_print ("end of settings\n");

    // caps for the buffers pushed into appsrc: raw RGB frames
    srccap = gst_caps_new_simple ("video/x-raw",
            "format", G_TYPE_STRING, "RGB",
            "width", G_TYPE_INT, 640,
            "height", G_TYPE_INT, 480,
            "framerate", GST_TYPE_FRACTION, 30, 1,
            NULL);

    // caps after videoconvert: I420, as expected by the encoder
    filtercap = gst_caps_new_simple ("video/x-raw",
            "format", G_TYPE_STRING, "I420",
            "width", G_TYPE_INT, 640,
            "height", G_TYPE_INT, 480,
            "framerate", GST_TYPE_FRACTION, 30, 1,
            NULL);

    gst_app_src_set_caps (GST_APP_SRC (app->videosrc), srccap);
    g_object_set (G_OBJECT (app->filter), "caps", filtercap, NULL);
    gst_caps_unref (srccap);
    gst_caps_unref (filtercap);

    bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
    g_assert (bus);
    gst_bus_add_watch (bus, (GstBusFunc) bus_call, app);

    gst_bin_add_many (GST_BIN (pipeline), app->videosrc, app->queue, app->videocoverter, app->filter, app->h264enc, app->h264parse, app->flvmux, app->rtmpsink, NULL);

    g_print ("Added all the Elements into the pipeline\n");

    gboolean ok = gst_element_link_many (app->videosrc, app->queue, app->videocoverter, app->filter, app->h264enc, app->h264parse, app->flvmux, app->rtmpsink, NULL);

    if (ok) g_print ("Linked all the Elements together\n");
    else g_print ("*** Linking error ***\n");

    g_assert (app->videosrc);
    g_assert (GST_IS_APP_SRC (app->videosrc));

    // feed control: appsrc signals when to start/stop pushing buffers
    g_signal_connect (app->videosrc, "need-data", G_CALLBACK (start_feed), app);
    g_signal_connect (app->videosrc, "enough-data", G_CALLBACK (stop_feed), app);

    g_print ("Playing the video\n");
    gst_element_set_state (pipeline, GST_STATE_PLAYING);

    g_print ("Running...\n");
    g_main_loop_run (loop);

    g_print ("Returned, stopping playback\n");
    gst_element_set_state (pipeline, GST_STATE_NULL);
    gst_object_unref (bus);
    g_main_loop_unref (loop);
    g_print ("Deleting pipeline\n");

}
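
The start_feed / stop_feed callbacks connected above (as well as the App struct, s_app, and bus_call) are not shown in the question. For context, a minimal sketch of what such a need-data handler usually looks like is below; get_rgb_frame() is a hypothetical helper standing in for the frame-processing code, and note that it sets no timestamps on the pushed buffers, which is exactly the problem the answer below diagnoses.

// Hypothetical need-data handler: pushes one processed RGB frame per call.
// get_rgb_frame() is an assumed helper returning a g_malloc'ed 640*480*3 buffer.
static void start_feed (GstElement *source, guint size, App *app) {
    gsize len = 640 * 480 * 3;                               // one 640x480 RGB frame
    guint8 *data = get_rgb_frame ();                         // assumed frame source
    GstBuffer *buffer = gst_buffer_new_wrapped (data, len);  // takes ownership of data
    // Note: no GST_BUFFER_PTS / GST_BUFFER_DURATION set here -- see the answer below.
    gst_app_src_push_buffer (GST_APP_SRC (app->videosrc), buffer);  // takes ownership of buffer
}

// Hypothetical enough-data handler: stop pushing until the next need-data signal.
static void stop_feed (GstElement *source, App *app) {
    // e.g. clear a "feeding" flag checked by the producer thread
}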

Source of my webpage:

<!DOCTYPE html>
<html>
<head>
<meta content="text/html;charset=utf-8" http-equiv="Content-Type">
<meta content="utf-8" http-equiv="encoding">
<title>Live Streaming</title>

<!-- strobe -->
<script type="text/javascript" src="strobe/lib/swfobject.js"></script>
<script type="text/javascript">
  var parameters = {  
     src: "rtmp://192.168.3.107/rtmp/live",  
     autoPlay: true,  
     controlBarAutoHide: false,  
     playButtonOverlay: true,  
     showVideoInfoOverlayOnStartUp: true,  
     optimizeBuffering : false,  
     initialBufferTime : 0.1,  
     expandedBufferTime : 0.1,  
     minContinuousPlayback : 0.1,  
     //poster: "images/poster.png"  
  };  
  swfobject.embedSWF(
    "strobe/StrobeMediaPlayback.swf"
    , "StrobeMediaPlayback"
    , 1024
    , 768
    , "10.1.0"
    , "strobe/expressInstall.swf"
    , parameters
    , {
      allowFullScreen: "true"
    }
    , {
      name: "StrobeMediaPlayback"
    }
  );
</script>

</head>
<body>
<div id="StrobeMediaPlayback"></div>
</body>
</html>

Solution

When using appsrc and appsink, people usually do some work with the buffers: they get data, process it in some way, and then create a new buffer, but forget to timestamp it properly.

What is timestamping? It is attaching time information to the audio/video buffers. Why? It is the synchronisation mechanism every application (VLC, a web player, ...) uses to display (present) video/audio to the user at a certain time and at a certain rate (this is the PTS).

This is tied to the framerate (in video) or the sample rate (in audio - though timestamps work differently there; they are not set per audio sample, which is usually only a few bytes). For example, at 30 fps the n-th video frame gets PTS = n × (1/30) s, roughly n × 33.3 ms.

So what probably happened on your web side is that it received buffers without this timestamp information. The app did not know how or when to display the video, so it silently failed and displayed nothing.

The GStreamer pipeline worked because GStreamer apparently has some heuristics for guessing the framerate etc.

As I said, you have two options.

1. Calculate your PTS and duration yourself:

guint64 calculated_pts = some_cool_algorithm ();
GstBuffer *buffer = gst_buffer_new_wrapped (data, size);  // wrap your processed data (takes ownership)
GST_BUFFER_PTS (buffer) = calculated_pts;                 // in nanoseconds
GST_BUFFER_DURATION (buffer) = 1234567890;                // in nanoseconds
gst_app_src_push_buffer (GST_APP_SRC (app->videosrc), buffer);  // push the buffer to appsrc
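
For a constant 30 fps feed like the one in the question, one simple PTS scheme (a sketch; frame_count is an assumed running counter you increment once per pushed frame) would be:

guint64 calculated_pts = gst_util_uint64_scale (frame_count, GST_SECOND, 30);  // n * (1/30) s
GST_BUFFER_PTS (buffer) = calculated_pts;
GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale (1, GST_SECOND, 30);      // ~33.3 ms per frame
frame_count++;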

2. Or turn do-timestamp on for appsrc, which will generate timestamps automatically. I am not sure exactly how it does this - it either picks the framerate from the caps or generates the PTS according to when you push frames into it.
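
For option 2, the setup against the question's code would look something like this (do-timestamp is inherited from GstBaseSrc, and is-live and format are appsrc properties; setting format to GST_FORMAT_TIME is the usual companion so appsrc operates in running time):

g_object_set (G_OBJECT (app->videosrc),
        "do-timestamp", TRUE,       // stamp each buffer with the clock running time when pushed
        "is-live", TRUE,            // behave like a live source
        "format", GST_FORMAT_TIME,  // operate in time format rather than bytes
        NULL);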
