How do I stream video from iPhone acting as a server?


Problem Description


I'm working on an app for iOS, where one iPhone has to live stream its camera recordings to another iPhone (to keep things simple, both are in the same Wi-Fi network).
The streaming should work without a physical interconnect (e.g. a server used for routing the stream to clients). In fact, the recording iPhone should be the server which serves the other iPhone (or other iOS devices in the network) with the live stream.

So, what I need to do is:


  1. Get the live pictures from the camera
  2. Process this data if needed
  3. Send frame by frame to the connected clients (TCP?)
  4. Receive the frames on the client and display them in real time

What I have and what I'm stuck with:


  1. I have already solved #1: I use an AVCaptureSession which constantly returns CMSampleBufferRefs (found here).

I'm not so sure yet what I need to do with the CMSampleBufferRef. I do know how to transform it into a CGImage or a UIImage (thanks to Benjamin Loulier's great blog post), but I have no idea of what specifically I need to send and whether I need to encode the frames somehow.
As mentioned by @jab in the answer linked above (this), it is possible to write those samples to a file with one or more AVAssetWriters. But then again he says those 5-second video snippets are to be uploaded to a server which makes a streamable movie file out of them (and that movie can then be streamed to an iOS device by HTTP Live Streaming, I suppose).
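
For illustration, here is a minimal sketch of that capture side in modern Swift (the original question predates Swift, but the AVFoundation calls map one-to-one to Objective-C; the class name and callback are made up, and error handling is omitted). The delegate method is where each CMSampleBufferRef arrives, and a CIContext is one way to turn it into a UIImage:

    import AVFoundation
    import CoreImage
    import UIKit

    // Each captured frame arrives as a CMSampleBuffer in the delegate callback.
    final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        let session = AVCaptureSession()
        private let ciContext = CIContext()
        var onFrame: ((UIImage) -> Void)?            // hand frames to the network layer

        func start() throws {
            session.sessionPreset = .medium          // lower preset = smaller frames
            guard let camera = AVCaptureDevice.default(for: .video) else { return }
            session.addInput(try AVCaptureDeviceInput(device: camera))

            let output = AVCaptureVideoDataOutput()
            output.alwaysDiscardsLateVideoFrames = true  // drop frames rather than queue them
            output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera"))
            session.addOutput(output)
            session.startRunning()
        }

        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            // CMSampleBuffer -> CVPixelBuffer -> CIImage -> CGImage -> UIImage
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
            guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return }
            onFrame?(UIImage(cgImage: cgImage))
        }
    }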

As I already indicated, my app (i.e. the video capturing "server" device) has one or multiple clients connected to it and needs to send the video frames to them in real time.
One idea which came to my mind is to use a simple TCP connection where the server sends every single frame in a serialised format to the connected clients in a loop. More specifically: when one buffered frame has been sent successfully to the client, the server takes the most recent frame as the next one to be sent.
Now: is this the right idea of how it should work? Or is there another protocol which is much better suited to this kind of task?
Remember: I want to keep it simple (simple for me, i.e., so that I don't need to study too many new programming aspects) and fast. I already know some things about TCP; I wrote servers and clients with it in C at school, so I'd prefer to apply the knowledge I have to this project...
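
A sketch of that send loop under stated assumptions: frames travel as length-prefixed JPEGs, and only the most recent frame is kept, so a slow client skips frames rather than falling behind. Network.framework is used purely for brevity (an assumption; it is iOS 12+, and the same framing works over the BSD sockets familiar from C). Note that the answer below argues for UDP instead of TCP:

    import Network
    import UIKit

    final class FrameServer {
        private let listener: NWListener
        private var connections: [NWConnection] = []
        private var latestFrame: Data?
        private var sending = false
        private let queue = DispatchQueue(label: "frame-server")

        init(port: UInt16) throws {
            listener = try NWListener(using: .tcp, on: NWEndpoint.Port(rawValue: port)!)
            listener.newConnectionHandler = { [weak self] connection in
                guard let self = self else { return }
                self.connections.append(connection)  // handler runs on `queue`, so this is safe
                connection.start(queue: self.queue)
            }
            listener.start(queue: queue)
        }

        // Called from the capture callback with each new frame.
        func enqueue(_ frame: UIImage) {
            guard let jpeg = frame.jpegData(compressionQuality: 0.5) else { return }
            queue.async {
                self.latestFrame = jpeg              // overwrite: stale frames are dropped
                self.sendIfIdle()
            }
        }

        private func sendIfIdle() {
            guard !sending, let jpeg = latestFrame else { return }
            latestFrame = nil
            sending = true
            var length = UInt32(jpeg.count).bigEndian  // 4-byte header marks frame boundaries
            var packet = Data(bytes: &length, count: 4)
            packet.append(jpeg)
            let group = DispatchGroup()
            for connection in connections {          // dead connections are ignored for brevity
                group.enter()
                connection.send(content: packet, completion: .contentProcessed { _ in group.leave() })
            }
            group.notify(queue: queue) {
                self.sending = false
                self.sendIfIdle()                    // a newer frame may have arrived meanwhile
            }
        }
    }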

Last but not least, the receiving client:
I imagine, if I'm really going to use a TCP connection, that on the client side I receive frame after frame from the server, cast the read byte package into the used format (CMSampleBuffer, CGImage, UIImage) and just display it on a CALayer or UIImageView, right? The effect of a movie would be achieved by just constantly keeping that image view updated.
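
A matching client-side sketch, under the same assumptions as the server sketch above (4-byte big-endian length header followed by the JPEG payload; all names are illustrative):

    import Network
    import UIKit

    // Receive loop: read the 4-byte length header, then that many JPEG bytes.
    final class FrameClient {
        private let connection: NWConnection
        private let imageView: UIImageView

        init(host: String, port: UInt16, imageView: UIImageView) {
            self.imageView = imageView
            connection = NWConnection(host: NWEndpoint.Host(host),
                                      port: NWEndpoint.Port(rawValue: port)!,
                                      using: .tcp)
            connection.start(queue: DispatchQueue(label: "frame-client"))
            readNextFrame()
        }

        private func readNextFrame() {
            connection.receive(minimumIncompleteLength: 4, maximumLength: 4) { header, _, _, _ in
                guard let header = header, header.count == 4 else { return }
                let length = header.reduce(0) { ($0 << 8) | Int($1) }  // big-endian UInt32
                self.connection.receive(minimumIncompleteLength: length,
                                        maximumLength: length) { body, _, _, _ in
                    if let body = body, let frame = UIImage(data: body) {
                        // UIKit must be touched on the main thread.
                        DispatchQueue.main.async { self.imageView.image = frame }
                    }
                    self.readNextFrame()                               // loop for the next frame
                }
            }
        }
    }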

Please give me some ideas on how to reach this goal. It is very important, because it's part of my school graduation project... Any sample code is also appreciated ;-) Or just refer me to another site, tutorial, Stack Overflow answer, etc.

If you have any questions about this, just leave a comment and I'll update the post.

Recommended Answer


  1. Sounds OK?

Video frames are really big. You're going to have bandwidth problems streaming video from one device to another. You can compress the frames as JPEGs using UIImageJPEGRepresentation from a UIImage, but that's computationally expensive on the "server", and still may not make them small enough to stream well. You can also reduce your frame rate and/or resolution by dropping frames, downsampling the UIImages, and fiddling with the settings in your AVCaptureSession. Alternately, you can send small (5-second) videos, which are hardware-compressed on the server and much easier to handle in bandwidth, but will of course give you a 5-second lag in your stream.
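
As a hedged sketch of those two knobs, quality and resolution (UIImageJPEGRepresentation is the Objective-C-era name; in current Swift the same call is spelled UIImage.jpegData(compressionQuality:), and the numbers here are illustrative, not tuned):

    import UIKit

    // Downsample, then JPEG-encode; e.g. shrink(frame, to: 480, quality: 0.4).
    func shrink(_ image: UIImage, to targetWidth: CGFloat, quality: CGFloat) -> Data? {
        let scale = targetWidth / image.size.width
        let targetSize = CGSize(width: targetWidth, height: image.size.height * scale)

        UIGraphicsBeginImageContextWithOptions(targetSize, true, 1.0)  // opaque, 1x scale
        image.draw(in: CGRect(origin: .zero, size: targetSize))
        let small = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()

        // jpegData(compressionQuality:) is the modern spelling of UIImageJPEGRepresentation.
        return small?.jpegData(compressionQuality: quality)
    }

Downsampling usually buys more than lowering JPEG quality, since halving the width quarters the pixel count.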

If you can require iOS 7, I'd suggest trying MultipeerConnectivity.framework. It's not terribly difficult to set up, and I believe it supports multiple clients. Definitely use UDP rather than TCP if you're going to roll your own networking - this is a textbook application for UDP, and it has lower overhead.
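
A sketch of that MultipeerConnectivity route (the service-type string and class name are invented for illustration, and the client-side MCNearbyServiceBrowser is omitted). The .unreliable send mode is what gives the UDP-like, drop-late-frames behaviour recommended here:

    import MultipeerConnectivity
    import UIKit

    final class PeerStreamer: NSObject, MCSessionDelegate, MCNearbyServiceAdvertiserDelegate {
        private let peerID = MCPeerID(displayName: UIDevice.current.name)
        private lazy var session = MCSession(peer: peerID)
        private lazy var advertiser = MCNearbyServiceAdvertiser(peer: peerID,
                                                                discoveryInfo: nil,
                                                                serviceType: "cam-stream")

        override init() {
            super.init()
            session.delegate = self
            advertiser.delegate = self
            advertiser.startAdvertisingPeer()
        }

        // Send one JPEG-encoded frame to every connected peer.
        func send(frame jpeg: Data) {
            guard !session.connectedPeers.isEmpty else { return }
            // .unreliable: late frames are dropped rather than retransmitted.
            try? session.send(jpeg, toPeers: session.connectedPeers, with: .unreliable)
        }

        // Accept every invitation for this sketch.
        func advertiser(_ advertiser: MCNearbyServiceAdvertiser,
                        didReceiveInvitationFromPeer peerID: MCPeerID,
                        withContext context: Data?,
                        invitationHandler: @escaping (Bool, MCSession?) -> Void) {
            invitationHandler(true, session)
        }

        // Clients receive each frame here and can decode it with UIImage(data:).
        func session(_ session: MCSession, didReceive data: Data, fromPeer peerID: MCPeerID) {}

        // Remaining MCSessionDelegate requirements (unused in this sketch).
        func session(_ session: MCSession, peer peerID: MCPeerID, didChange state: MCSessionState) {}
        func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {}
        func session(_ session: MCSession, didStartReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, with progress: Progress) {}
        func session(_ session: MCSession, didFinishReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, at localURL: URL?, with error: Error?) {}
    }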

Frame by frame, just turn the JPEGs into UIImages and use UIImageView. There's significant computation involved, but I believe you'll still be limited by bandwidth rather than CPU. If you're sending little videos, you can use MPMoviePlayerController. There will probably be little glitches between each video as it "prepares" them for playback, which will also result in requiring 5.5 seconds or so to play each 5-second video. I wouldn't recommend using HTTP Live Streaming unless you can get a real server into the mix somewhere. Or you could use an ffmpeg pipeline -- feed videos in and pop individual frames out -- if you can/want to compile ffmpeg for iOS.
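
For the small-videos variant, playing one received clip might look like the following (MPMoviePlayerController is the class named above; it has since been deprecated in favour of AVPlayer, and the URL and container view are assumed to exist):

    import MediaPlayer
    import UIKit

    // Play one received 5-second clip; keep the returned player alive while it plays.
    func play(clipAt url: URL, in container: UIView) -> MPMoviePlayerController {
        let player = MPMoviePlayerController(contentURL: url)
        player.view.frame = container.bounds
        container.addSubview(player.view)
        player.prepareToPlay()  // this "prepare" step is where the inter-clip glitch comes from
        player.play()
        return player
    }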

Let me know if you need clarification on any of these points. It's a lot of work but relatively straightforward.
