iPhone: HTTP live streaming without any server side processing

Problem Description

I want to be able to (live) stream the frames/video FROM the iPhone camera to the internet. I've seen in a thread (streaming video FROM an iPhone) that it's possible using AVCaptureSession's beginConfiguration and commitConfiguration. But I don't know how to start designing this task. There are already a lot of tutorials about how to stream video TO the iPhone, which is not actually what I am searching for.

Could you guys give me any ideas that could help me further?

Recommended Answer

That's a tricky one. You should be able to do it, but it won't be easy.

One way that wouldn't be live (not answering your need, but worth mentioning) is to capture from the camera and save it to a video file. See the AV Foundation Guide on how to do that. Once saved, you can then use the HTTP Live Streaming segmenter to generate the proper segments. Apple has applications for Mac OS X, but there's an open source version as well that you could adapt for iOS. On top of that, you'd also have to run an HTTP server to serve those segments. There are lots of HTTP servers out there you could adapt.
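
To make that non-live route concrete, here is a minimal Swift sketch using today's AVFoundation API (the class name and file location are my own, and error handling and canAddInput/canAddOutput checks are elided): it records the camera to a local .mov file, which you could then feed to an HLS segmenter such as Apple's mediafilesegmenter tool on Mac OS X.

    import AVFoundation

    // Minimal sketch: record the camera to a temporary .mov file.
    // The resulting file is what you'd hand to an HLS segmenter afterwards.
    final class CaptureRecorder: NSObject, AVCaptureFileOutputRecordingDelegate {
        let session = AVCaptureSession()
        let fileOutput = AVCaptureMovieFileOutput()

        func start() throws {
            // Assumes camera permission has already been granted.
            guard let camera = AVCaptureDevice.default(for: .video) else { return }
            session.addInput(try AVCaptureDeviceInput(device: camera))
            session.addOutput(fileOutput)
            session.startRunning()

            let url = FileManager.default.temporaryDirectory
                .appendingPathComponent("capture.mov")
            fileOutput.startRecording(to: url, recordingDelegate: self)
        }

        // Called when recording stops; the saved file is ready for segmenting.
        func fileOutput(_ output: AVCaptureFileOutput,
                        didFinishRecordingTo outputFileURL: URL,
                        from connections: [AVCaptureConnection],
                        error: Error?) {
            print("Saved:", outputFileURL.path)
        }
    }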

But to do it live, first, as you have already found, you need to collect frames from the camera. Once you have those, you want to convert them to H.264. For that you want ffmpeg. Basically you shove the images into ffmpeg's AVPicture, making a stream. Then you'd need to manage that stream so that the live streaming segmenter recognizes it as a live streaming H.264 device. I'm not sure how to do that, and it sounds like some serious work. Once you've done that, you need to have an HTTP server serving that stream.
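
As a hedged sketch of just that first step (frame collection) in Swift, with the class name and queue label being my own and the actual H.264 hand-off left as a stub: each CMSampleBuffer delivered to the delegate below is the raw frame you would convert and push into your encoder.

    import AVFoundation

    // Sketch of frame collection only; encoding and streaming are not shown.
    final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        let session = AVCaptureSession()
        private let videoOutput = AVCaptureVideoDataOutput()

        func start() throws {
            guard let camera = AVCaptureDevice.default(for: .video) else { return }
            session.addInput(try AVCaptureDeviceInput(device: camera))
            // Frames are delivered on this background queue.
            videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
            session.addOutput(videoOutput)
            session.startRunning()
        }

        // Called once per captured frame.
        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            // Convert pixelBuffer to the encoder's input format and push it here.
            _ = pixelBuffer
        }
    }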

What might actually be easier would be to use an RTP/RTSP-based stream instead. That approach is covered by open source RTP implementations, and ffmpeg supports it fully. It's not HTTP Live Streaming, but it will work well enough.
