WebRTC to GStreamer Bridge


Question

I'm trying to stream audio from a browser to a GStreamer pipeline on a server.

I'm currently using Kurento, and modifying the Hello World example to try to connect an RtpEndpoint to the pipeline -- but am having trouble.

I know the media is getting there, because when I swap in a RecorderEndpoint I get a valid recording.

The Kurento Node.js code is:

pipeline.create("RtpEndpoint", {}, function(error, rtpEndpoint) {
        if (error) {
            console.log("RtpEndpoint problem");
            return sendError(res, 500, error);
        }

        console.log("Creating WebRtcEndpoint");
        pipeline.create('WebRtcEndpoint', function(error, webRtcEndpoint) {
            if (error) {
                return sendError(res, 500, error);
            }

            console.log("Processing sdpOffer at server and generating sdpAnswer");
            webRtcEndpoint.processOffer(sdpOffer, function(error, sdpAnswer) {
                if (error) {
                    webRtcEndpoint.release();
                    return sendError(res, 500, error);
                }

                console.log("Connecting loopback");
                webRtcEndpoint.connect(webRtcEndpoint, function(error) {
                    if(error){
                        webRtcEndpoint.release();
                        return sendError(res, 500, error);
                    }
                    console.log("Sending sdpAnswer to client");
                    console.log(sdpAnswer);

                    webRtcEndpoint.connect(rtpEndpoint, function(error) {
                        if(error) {
                            webRtcEndpoint.release();
                            return sendError(res, 500, error);
                        }
                        rtpEndpoint.generateOffer(function(error, offer) {
                            if (error) {
                                return sendError(res, 500, error);
                            }
                            // Persist the offer so gst-launch can read it as an SDP file.
                            fs.writeFile('/tmp/test.sdp', offer, function(error) {
                                if (error) {
                                    return console.error(error);
                                }
                                console.log("RTP OFFER GENERATED.");
                            });
                        });
                    });

                    res.type('application/sdp');
                    res.send(sdpAnswer);
                });     
            });
        });
    });

and my GStreamer pipeline is:

gst-launch-1.0 -vvvv filesrc location=/tmp/test.sdp ! sdpdemux ! decodebin ! autovideosink

which returns:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Got context from element 'autovideosink0-actual-sink-glimage': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayX11\)\ gldisplayx11-0";
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstSDPDemux:sdpdemux0/GstUDPSrc:udpsrc0: timeout = 10000000000
/GstPipeline:pipeline0/GstSDPDemux:sdpdemux0/GstUDPSrc:udpsrc2: timeout = 10000000000
/GstPipeline:pipeline0/GstSDPDemux:sdpdemux0/GstRtpBin:rtpbin0/GstRtpSession:rtpsession0.GstPad:send_rtcp_src: caps = application/x-rtcp
/GstPipeline:pipeline0/GstSDPDemux:sdpdemux0/GstRtpBin:rtpbin0.GstGhostPad:send_rtcp_src_0: caps = application/x-rtcp
/GstPipeline:pipeline0/GstSDPDemux:sdpdemux0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtcp
/GstPipeline:pipeline0/GstSDPDemux:sdpdemux0/GstRtpBin:rtpbin0.GstGhostPad:send_rtcp_src_0.GstProxyPad:proxypad4: caps = application/x-rtcp
/GstPipeline:pipeline0/GstSDPDemux:sdpdemux0/GstRtpBin:rtpbin0/GstRtpSession:rtpsession1.GstPad:send_rtcp_src: caps = application/x-rtcp
/GstPipeline:pipeline0/GstSDPDemux:sdpdemux0/GstRtpBin:rtpbin0.GstGhostPad:send_rtcp_src_1: caps = application/x-rtcp
/GstPipeline:pipeline0/GstSDPDemux:sdpdemux0/GstUDPSink:udpsink1.GstPad:sink: caps = application/x-rtcp
/GstPipeline:pipeline0/GstSDPDemux:sdpdemux0/GstRtpBin:rtpbin0.GstGhostPad:send_rtcp_src_1.GstProxyPad:proxypad7: caps = application/x-rtcp
ERROR: from element /GstPipeline:pipeline0/GstSDPDemux:sdpdemux0: Could not read from resource.
Additional debug info:
gstsdpdemux.c(1213): gst_sdp_demux_handle_message (): /GstPipeline:pipeline0/GstSDPDemux:sdpdemux0:
Could not receive any UDP packets for 10.0000 seconds, maybe your firewall is blocking it.
Execution ended after 0:00:10.062018001
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

It doesn't work with FFmpeg, VLC, etc. -- results are similar to "Attempt 5.3" here: https://altanaitelecom.wordpress.com/2015/02/26/continue-streaming-broadcasting-live-video-call-to-non-webrtc-supported-browsers-and-media-players/

I don't think there's a firewall issue, as the pipeline and the Kurento instance are on the same virtual machine (which has no firewall) -- and the recording endpoint works. Is it being linked badly? Is there an easier way?

Answer

Using RtpEndpoint is tricky because you need to complete the SDP negotiation. This means that somewhere after the

rtpEndpoint.generateOffer(...

you should call

rtpEndpoint.processAnswer(sdpAnswer, ...)
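Sketched in Node.js, the full negotiation flow would look roughly like this. The `signalOffer` callback is hypothetical -- you would implement it against whatever channel reaches your GStreamer-side program:

```javascript
// Minimal sketch of the complete RtpEndpoint negotiation:
// generateOffer -> deliver offer to the external app -> processAnswer.
// `signalOffer(offer, cb)` is a hypothetical async function that sends the
// offer to the GStreamer program and calls back with its SDP answer.
function completeRtpNegotiation(rtpEndpoint, signalOffer, done) {
    rtpEndpoint.generateOffer(function(error, offer) {
        if (error) return done(error);
        signalOffer(offer, function(error, sdpAnswer) {
            if (error) return done(error);
            // Without this step the endpoint never learns where to send RTP.
            rtpEndpoint.processAnswer(sdpAnswer, done);
        });
    });
}
```

This keeps the negotiation in one place, so the endpoint is only considered connected once `processAnswer` has succeeded.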

The tricky part is that you need to obtain the sdpAnswer from your GStreamer pipeline, and this is not trivial if you want to do it using just gst-launch. Probably your best option is to write a small program that creates the pipeline and generates the sdpAnswer, so that you can give it back to the RtpEndpoint through your signaling mechanism.
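If you do stay with the static .sdp file and gst-launch for prototyping, you at least need to know the RTP port and payload type the offer advertises. A minimal sketch of pulling those out of the SDP text (the sample offer in the test below is fabricated for illustration, and the regex only handles the standard `m=audio <port> RTP/AVP[F] <pt> ...` media line):

```javascript
// Extract the audio port and first payload type from an SDP blob.
// Returns null when no audio media line is present.
function parseAudioMediaLine(sdp) {
    var match = sdp.match(/^m=audio (\d+) RTP\/AVPF? ([\d ]+)/m);
    if (!match) return null;
    return {
        port: parseInt(match[1], 10),
        payloadType: parseInt(match[2].trim().split(' ')[0], 10)
    };
}
```

With the port and payload type in hand you can point a matching udpsrc/depayloader at the stream, instead of relying on sdpdemux's full negotiation.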
