Video Streaming Over Websockets


Problem Description


I am trying to build a mobile app that can stream video in both directions (i.e., something like video calling).

I looked into WebRTC, but it isn't yet ready for native mobile apps as such; in any case, what WebRTC does is let the browser capture the camera and audio directly, without requiring plugins. In a native mobile app, capturing the camera and audio isn't an issue; what is needed is basically a very low-latency, two-way transport layer. In many articles and places I read about using WebRTC over WebSockets.

So I thought I could stream the video using WebSockets. Is that correct, or am I missing something?

I understand that there is one more difference: WebRTC is directly client-to-client, whereas WebSockets would be client-server-client. Is there any way to avoid that, and what would it mean in terms of latency?

Solution

You are missing a few things:

  • WebRTC works very well on mobile. There are sample/white-label clients in the reference code (appRTCDEMO) at webrtc.org for both iOS and Android, and multiple apps out there. The last one to have been announced was appear.in.

  • Getting the video and audio streams from the device is part of the media APIs (getUserMedia), not the WebRTC API per se.

  • WebRTC is really the p2p connection (RTCPeerConnection: transport and firewall traversal) plus the media engine (encoding, packetizing, encrypting) side of the equation, which is exactly what you're looking for.

  • WebSockets are just a transport mechanism. They do not handle firewall/NAT traversal, media processing, or packetizing/chunking; you would have to implement all of that at the application level (see the contrast sketch after this list).

  • As far as signaling is concerned, WebRTC does not specify or impose any protocol; the usual practice is to set up a signaling server. The app connects to that server using WebSockets, XHR, or something else to perform the original handshake/call setup, and many apps abstract this with libraries like socket.io. A minimal sketch of this flow follows the list.
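
To make the division of labor concrete, the following is a minimal caller-side sketch in TypeScript using the browser APIs: getUserMedia handles capture, RTCPeerConnection carries the media peer-to-peer, and a WebSocket is used only for signaling. The endpoint wss://example.com/signal and the JSON message shapes are illustrative assumptions, not part of any standard, and the callee side (answer creation) is omitted.

```typescript
// Caller-side sketch: WebRTC capture + peer connection, with a
// WebSocket used only for signaling. Assumes a hypothetical relay
// server at wss://example.com/signal that forwards JSON messages
// ({ type, sdp | candidate }) to the other peer verbatim.
const signaling = new WebSocket("wss://example.com/signal");
const pc = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
});

// Trickle ICE: forward each candidate to the peer as it is found.
pc.onicecandidate = (event) => {
  if (event.candidate) {
    signaling.send(JSON.stringify({ type: "candidate", candidate: event.candidate }));
  }
};

// Remote media arrives on the peer connection, not on the WebSocket.
pc.ontrack = (event) => {
  const video = document.querySelector<HTMLVideoElement>("#remote")!;
  video.srcObject = event.streams[0];
};

async function startCall(): Promise<void> {
  // Capture belongs to the media APIs (getUserMedia), not WebRTC itself.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));

  // The offer/answer handshake travels over the signaling channel.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  signaling.send(JSON.stringify({ type: "offer", sdp: offer.sdp }));
}

signaling.onmessage = async (msg) => {
  const data = JSON.parse(msg.data);
  if (data.type === "answer") {
    await pc.setRemoteDescription({ type: "answer", sdp: data.sdp });
  } else if (data.type === "candidate") {
    await pc.addIceCandidate(data.candidate);
  }
};

signaling.onopen = () => void startCall();
```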
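
The signaling server itself can be tiny, since it only relays opaque messages between the two peers. Below is an illustrative sketch using the npm "ws" package; the naive "forward to every other client" pairing is an assumption for demonstration, and a real server would need rooms, authentication, and cleanup.

```typescript
// Minimal relay for the sketch above: forwards every message from
// one connected client to all others. Pairing logic is deliberately
// naive and only for illustration.
import { WebSocketServer, WebSocket } from "ws";

const wss = new WebSocketServer({ port: 8080 });
const clients = new Set<WebSocket>();

wss.on("connection", (socket) => {
  clients.add(socket);
  socket.on("message", (data, isBinary) => {
    // Relay to every other client; signaling messages are JSON text.
    for (const other of clients) {
      if (other !== socket && other.readyState === WebSocket.OPEN) {
        other.send(data, { binary: isBinary });
      }
    }
  });
  socket.on("close", () => clients.delete(socket));
});
```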
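
For contrast, here is roughly what "streaming video over WebSockets" would mean in practice: the application does its own encoding and chunking (sketched here with MediaRecorder) and pushes chunks through a relay. The wss://example.com/relay endpoint is hypothetical, and everything WebRTC provides for free (NAT traversal to the peer, bandwidth adaptation, loss recovery) is absent and would have to be rebuilt by hand.

```typescript
// Contrast sketch: pushing encoded video over a plain WebSocket.
// Assumes a hypothetical relay at wss://example.com/relay that
// forwards binary frames to the other client.
const relay = new WebSocket("wss://example.com/relay");

async function streamOverWebSocket(): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });

  // MediaRecorder does the encoding; the app must do its own chunking.
  const recorder = new MediaRecorder(stream, { mimeType: "video/webm;codecs=vp8,opus" });
  recorder.ondataavailable = (event) => {
    if (event.data.size > 0 && relay.readyState === WebSocket.OPEN) {
      relay.send(event.data); // one WebM chunk per message
    }
  };
  recorder.start(100); // emit a chunk roughly every 100 ms
}

relay.onopen = () => void streamOverWebSocket();

// Receiving side (omitted): feed chunks into a MediaSource buffer for
// playback, plus resync after loss and keyframe handling, all of
// which the application would have to implement itself.
```

This is why the answer characterizes WebSockets as just a transport: even this toy version already needs the relay server, and the latency now includes the client-server-client path the question asks about.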

