WebRTC iOS Audio Chat


Question

I am creating a voice-only (no video) chat application. I have created my own node.js/socket.io-based server for signaling.

For WebRTC, I am using the following pod: https://cocoapods.org/pods/WebRTC

I have been successful in creating the peer connection, adding the local stream, setting the local/remote SDP, and sending/receiving ICE candidates. The `didAddStream` delegate method is also called successfully and carries audio tracks, but I am stuck here. I don't know what I should do with the audio track. What should the next step be? How do I send/receive audio on both sides?

Also, if I integrate CallKit, what changes do I need to make?

Answer

I got stuck on this one too. You have to retain the RTCMediaStream object in order for the audio to play. You don't need to do anything with the RTCAudioTrack; it will play automatically. I simply assign it to a property so it gets retained. See my example here: https://github.com/redfearnk/WebRTCVideoChat/blob/master/WebRTCVideoChat/WebRTCClient.swift#L143
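The pattern above can be sketched in Swift roughly as follows. This is a minimal illustration, not the linked example verbatim: it assumes the `RTCPeerConnectionDelegate` callback shape from the WebRTC pod, and the class and property names are made up for the sketch. Other required delegate methods are omitted.

```swift
import WebRTC

final class AudioCallClient: NSObject, RTCPeerConnectionDelegate {

    // Holding a strong reference is what keeps the remote audio playing:
    // if the stream is deallocated, its RTCAudioTrack goes silent.
    private var remoteStream: RTCMediaStream?

    func peerConnection(_ peerConnection: RTCPeerConnection,
                        didAdd stream: RTCMediaStream) {
        // Unlike video, audio needs no renderer or explicit playback call.
        // Retaining the stream is enough; the audio track plays automatically.
        remoteStream = stream
    }

    // ... remaining RTCPeerConnectionDelegate requirements omitted for brevity.
}
```

The point of the sketch is only the retention: assigning `stream` to a stored property prevents ARC from releasing it at the end of the delegate callback, which is why the audio keeps playing without any further work on the track.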

