Send multicast audio over RTSP using libstreaming for upstreaming from an Android device


Problem description

The code only streams to one user at a time. Can anyone help me play the stream on more than one system at the same time (convert it to multicast or broadcast)? Thanks in advance.

The library source is here: https://github.com/fyhertz/libstreaming

My current code is:

    mSurfaceView = (net.majorkernelpanic.streaming.gl.SurfaceView) findViewById(R.id.surface);

    // Sets the port of the RTSP server to 5060 (the audio port)
    Editor editor = PreferenceManager.getDefaultSharedPreferences(this).edit();
    editor.putString(RtspServer.KEY_PORT, String.valueOf(5060));
    editor.commit();

    // Configures the SessionBuilder
    SessionBuilder.getInstance()
            .setSurfaceView(mSurfaceView)
            .setPreviewOrientation(90)
            .setContext(getApplicationContext())
            .setAudioEncoder(SessionBuilder.AUDIO_AAC)
            .setVideoEncoder(SessionBuilder.VIDEO_NONE);

    // Starts the RTSP server
    MainActivity.this.startService(new Intent(MainActivity.this, RtspServer.class));

Solution

I looked at the code on GitHub, and it seems you only need to specify the multicast address to the SessionBuilder class; the underlying RTSP server and RTP transport should then handle everything (at least the RTSP responses appear to have code for producing the correct transport descriptions). So I guess adding a setDestination call to your SessionBuilder configuration should be enough (replace 232.0.1.2 with the address you need):

// Configures the SessionBuilder
SessionBuilder.getInstance()
        .setSurfaceView(mSurfaceView)
        .setPreviewOrientation(90)
        .setContext(getApplicationContext())
        .setAudioEncoder(SessionBuilder.AUDIO_AAC)
        .setVideoEncoder(SessionBuilder.VIDEO_NONE)
        .setDestination("232.0.1.2");

The clients will still connect to the RTSP server through its address, but the actual RTP stream should be a single stream shared among all the clients.
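On the receiving side, nothing special should be needed beyond opening the server's RTSP URL: the SDP returned by the DESCRIBE response points the client at the multicast group. As a rough sketch (not from the original answer; 192.168.1.10 is only a placeholder for the streaming device's LAN address, and 5060 matches the KEY_PORT value set above), an Android client could try playing the stream with MediaPlayer:

    // Hypothetical client-side sketch: 192.168.1.10 is a placeholder for the LAN
    // address of the device running RtspServer; 5060 matches KEY_PORT above.
    // Requires android.media.MediaPlayer and java.io.IOException.
    // MediaPlayer performs the RTSP DESCRIBE/SETUP/PLAY handshake, while the RTP
    // audio itself is delivered to the multicast group (e.g. 232.0.1.2).
    MediaPlayer player = new MediaPlayer();
    try {
        player.setDataSource("rtsp://192.168.1.10:5060");
        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mp.start();
            }
        });
        player.prepareAsync();
    } catch (IOException e) {
        e.printStackTrace();
    }

Whether the stock MediaPlayer accepts a multicast transport can vary by device; VLC or another RTSP-capable player pointed at the same URL is a safer test client, as long as all receivers sit on a network segment that forwards multicast traffic.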
