How do I stream video from iPhone acting as a server?

Question

I'm working on an app for iOS, where one iPhone has to live stream its camera recordings to another iPhone (to keep things simple, both are in the same Wi-Fi network).
The streaming should work without a physical interconnect (e.g. a server used for routing the stream to clients). In fact, the recording iPhone should be the server which serves the other iPhone (or more other iOS devices in the network) with the live stream.

So, what I need to do is:

  1. Get the live images from the camera
  2. Process this data, if needed
  3. Send them frame by frame to the connected clients (TCP?)
  4. Receive the frames on the client and display them in real time

What I have and what I'm stuck with:

  1. I have already solved problem 1. I use an AVCaptureSession which constantly returns CMSampleBufferRefs (found here).
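For reference, a minimal Swift sketch of such a capture pipeline; the class name and queue label are illustrative, and error handling is omitted (code from the question's era would have been Objective-C, but the API shape is the same):

```swift
import AVFoundation
import UIKit

// Step 1: an AVCaptureSession that delivers live camera frames as
// CMSampleBuffers through AVCaptureVideoDataOutput.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let sampleQueue = DispatchQueue(label: "camera.frames")

    func start() throws {
        session.sessionPreset = .medium  // lower presets mean smaller frames to stream

        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        let output = AVCaptureVideoDataOutput()
        output.alwaysDiscardsLateVideoFrames = true  // drop frames instead of queueing them
        output.setSampleBufferDelegate(self, queue: sampleQueue)
        if session.canAddOutput(output) { session.addOutput(output) }

        session.startRunning()
    }

    // Called once per captured frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // sampleBuffer holds the raw pixels; hand it to the encoder/sender here.
    }
}
```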

I'm not so sure yet what I need to do with the CMSampleBufferRef. I do know how to transform it into a CGImage or a UIImage (thanks to Benjamin Loulier's great blog post), but I have no idea what specifically I need to send, or whether I need to encode the frames somehow.
As mentioned by @jab in the answer linked above (this), it is possible to write those samples to a file with one or more AVAssetWriters. But then again, he says those 5-second video snippets are to be uploaded to a server which makes a streamable movie file out of them (and that movie can then be streamed to an iOS device by HTTP Live Streaming, I suppose).
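One possible answer to "what do I send": turn each CMSampleBuffer into a UIImage (as in the linked blog post) and encode it as JPEG, which yields a plain Data blob that can be pushed through a socket. A sketch, using the modern Swift API (jpegData is the Swift equivalent of the UIImageJPEGRepresentation the answer below mentions):

```swift
import AVFoundation
import CoreImage
import UIKit

// CMSampleBuffer -> UIImage -> JPEG bytes. In production the CIContext
// should be created once and reused; it is expensive to build per frame.
func jpegData(from sampleBuffer: CMSampleBuffer, quality: CGFloat = 0.5) -> Data? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }
    // Lower quality means smaller frames, at the cost of visible artifacts.
    return UIImage(cgImage: cgImage).jpegData(compressionQuality: quality)
}
```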

As I already indicated, my app (i.e. the video capturing "server" device) has one or multiple clients connected to it and needs to send the video frames in real time to them.
One idea which came to my mind is to use a simple TCP connection where the server sends every single frame in a serialised format to the connected clients in a loop. More specifically: when one buffered frame is successfully sent to the client, the server takes the most recent frame as the next one to be sent.
Now: is this the right idea of how it should work? Or is there another protocol that is much better suited to this kind of task?
Remember: I want to keep it simple (simple for me, i.e., so that I don't need to study too many new programming aspects) and fast. I already know some things about TCP; I wrote servers and clients with it in C at school, so I'd prefer to apply the knowledge I already have to this project...
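To make the "latest frame wins" idea concrete, here is a sketch of the server side using Network.framework (iOS 12+); names like FrameServer are invented, and error handling plus dead-client cleanup are omitted. Each JPEG goes out as a 4-byte big-endian length prefix followed by the payload, so the client can find frame boundaries in the TCP byte stream:

```swift
import Foundation
import Network

final class FrameServer {
    private var listener: NWListener?
    private var clients: [NWConnection] = []
    private var latestFrame: Data?      // only the newest frame is kept
    private var sendInFlight = false
    private let queue = DispatchQueue(label: "frame.server")

    func start(port: UInt16) throws {
        listener = try NWListener(using: .tcp, on: NWEndpoint.Port(rawValue: port)!)
        listener?.newConnectionHandler = { [weak self] conn in
            guard let self = self else { return }
            conn.start(queue: self.queue)
            self.clients.append(conn)
        }
        listener?.start(queue: queue)
    }

    // Call this from the capture callback with each encoded frame.
    func enqueue(_ jpeg: Data) {
        queue.async {
            self.latestFrame = jpeg     // overwrite: stale frames are simply dropped
            self.pumpIfIdle()
        }
    }

    private func pumpIfIdle() {
        guard !sendInFlight, let frame = latestFrame else { return }
        latestFrame = nil
        sendInFlight = true
        let header = withUnsafeBytes(of: UInt32(frame.count).bigEndian) { Data($0) }
        let packet = header + frame
        let group = DispatchGroup()
        for conn in clients {
            group.enter()
            conn.send(content: packet, completion: .contentProcessed { _ in group.leave() })
        }
        group.notify(queue: queue) {
            self.sendInFlight = false
            self.pumpIfIdle()           // pick up the newest frame captured meanwhile
        }
    }
}
```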

Last but not least, the receiving client:
I imagine, if I'm really going to use a TCP connection, that on the client side I receive frame after frame from the server, decode the received bytes into the format used (CMSampleBuffer, CGImage, UIImage) and just display it on a CALayer or UIImageView, right? The movie effect comes from simply updating that image view over and over.
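And the matching client side, again as a sketch on Network.framework: read exactly 4 length bytes, then exactly that many payload bytes, hand the JPEG to the UI, and loop:

```swift
import Foundation
import Network

final class FrameClient {
    private let connection: NWConnection
    var onFrame: ((Data) -> Void)?      // called with each complete JPEG

    init(host: String, port: UInt16) {
        connection = NWConnection(host: NWEndpoint.Host(host),
                                  port: NWEndpoint.Port(rawValue: port)!,
                                  using: .tcp)
    }

    func start() {
        connection.start(queue: .global())
        readNextFrame()
    }

    private func readNextFrame() {
        // First the 4-byte big-endian length prefix...
        connection.receive(minimumIncompleteLength: 4, maximumLength: 4) { [weak self] data, _, _, _ in
            guard let self = self, let data = data, data.count == 4 else { return }
            let length = data.reduce(0) { ($0 << 8) | Int($1) }
            // ...then exactly `length` bytes of JPEG payload.
            self.connection.receive(minimumIncompleteLength: length, maximumLength: length) { payload, _, _, _ in
                if let payload = payload { self.onFrame?(payload) }
                self.readNextFrame()
            }
        }
    }
}
```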

Please give me some ideas on how to reach this goal. It is very important, because it's part of my school graduation project... Any sample code is also appreciated ;-) Or just refer me to another site, tutorial, Stack Overflow answer, etc.

If you have any questions about this, just leave a comment and I'll update the post.

Answer

  1. Sounds OK?

Video frames are really big. You're going to have bandwidth problems streaming video from one device to another. You can compress the frames as JPEGs using UIImageJPEGRepresentation from a UIImage, but that's computationally expensive on the "server", and still may not make them small enough to stream well. You can also reduce your frame rate and/or resolution by dropping frames, downsampling the UIImages, and fiddling with the settings in your AVCaptureSession. Alternately, you can send small (5-second) videos, which are hardware-compressed on the server and much easier to handle in bandwidth, but will of course give you a 5-second lag in your stream.
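One of the bandwidth levers mentioned above, sketched out: capping the capture frame rate through the AVCaptureDevice configuration. The fps value you pass is up to you and must be supported by the device's active format:

```swift
import AVFoundation

// Limit how many frames per second the capture session produces.
func capFrameRate(of device: AVCaptureDevice, to fps: Int32) throws {
    try device.lockForConfiguration()
    device.activeVideoMinFrameDuration = CMTime(value: 1, timescale: fps)
    device.activeVideoMaxFrameDuration = CMTime(value: 1, timescale: fps)
    device.unlockForConfiguration()
}

// Usage (15 fps is an illustrative value): try? capFrameRate(of: camera, to: 15)
```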

If you can require iOS 7, I'd suggest trying MultipeerConnectivity.framework. It's not terribly difficult to set up, and I believe it supports multiple clients. Definitely use UDP rather than TCP if you're going to roll your own networking - this is a textbook application for UDP, and it has lower overhead.
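A minimal sketch of the MultipeerConnectivity route; the service type string and class name are invented, every invitation is accepted without authentication, and the .unreliable send mode gives the UDP-like "drop late data" behaviour recommended above:

```swift
import MultipeerConnectivity
import UIKit

final class PeerStreamer: NSObject, MCSessionDelegate, MCNearbyServiceAdvertiserDelegate {
    private let peerID = MCPeerID(displayName: UIDevice.current.name)
    private lazy var session = MCSession(peer: peerID)
    private lazy var advertiser = MCNearbyServiceAdvertiser(peer: peerID,
                                                            discoveryInfo: nil,
                                                            serviceType: "cam-stream")

    func start() {
        session.delegate = self
        advertiser.delegate = self
        advertiser.startAdvertisingPeer()
    }

    // Push one encoded frame to all connected peers.
    func send(frame: Data) {
        guard !session.connectedPeers.isEmpty else { return }
        try? session.send(frame, toPeers: session.connectedPeers, with: .unreliable)
    }

    // Accept every incoming invitation (no authentication in this sketch).
    func advertiser(_ advertiser: MCNearbyServiceAdvertiser,
                    didReceiveInvitationFromPeer peerID: MCPeerID,
                    withContext context: Data?,
                    invitationHandler: @escaping (Bool, MCSession?) -> Void) {
        invitationHandler(true, session)
    }

    // Required MCSessionDelegate stubs; clients get frames in didReceive.
    func session(_ session: MCSession, peer peerID: MCPeerID, didChange state: MCSessionState) {}
    func session(_ session: MCSession, didReceive data: Data, fromPeer peerID: MCPeerID) {}
    func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {}
    func session(_ session: MCSession, didStartReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, with progress: Progress) {}
    func session(_ session: MCSession, didFinishReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, at localURL: URL?, withError error: Error?) {}
}
```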

Frame by frame, just turn the JPEGs into UIImages and use UIImageView. There's significant computation involved, but I believe you'll still be limited by bandwidth rather than CPU. If you're sending little videos, you can use MPMoviePlayerController. There will probably be little glitches between each video as it "prepares" them for playback, which will also result in requiring 5.5 seconds or so to play each 5-second video. I wouldn't recommend using HTTP Live Streaming unless you can get a real server into the mix somewhere. Or you could use an ffmpeg pipeline -- feed videos in and pop individual frames out -- if you can/want to compile ffmpeg for iOS.
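For the frame-by-frame display path, a short sketch: decode each received JPEG off the main thread, then swap it into a plain UIImageView. Constantly replacing .image is what produces the movie effect:

```swift
import UIKit

final class FrameDisplay {
    let imageView = UIImageView()

    func show(jpeg: Data) {
        DispatchQueue.global(qos: .userInteractive).async {
            guard let image = UIImage(data: jpeg) else { return }  // skip corrupt frames
            DispatchQueue.main.async {
                self.imageView.image = image
            }
        }
    }
}
```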

Let me know if you need clarification on any of these points. It's a lot of work but relatively straightforward.
