How can live video be streamed from an iOS device to a server?
Question
I want to be able to stream live video from an iOS device to a server. I tried using an AVCaptureOutput that captures each frame as a CMSampleBuffer and appends it to a file with an AVAssetWriter, but I don't know when or how to take the input from the file and send it to the server. How should it be formatted? How do I know when to send it?
Answer
Though I am not sharing any code with you, I will share the logic I used in one of my apps.
First way (the easy one): there are many low-cost third-party libraries available for this.
Second way (the hard one): create small chunks of video (for example, 2 seconds or less), keep them in a queue, and upload them to the server. Don't use AFNetworking or plain HTTP requests, which will slow the process down; use a persistent connection instead, for example a Node.js server. Keep a text file or database entry that tracks each chunk file and its sequence number. Once the first chunk is uploaded, you can use ffmpeg on the server to build a video from it, and append each new chunk to the main video file as it arrives. If you play that video on a device, you don't have to do any further modification; it will automatically fetch the new parts as the file changes on the server.
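The server-side stitching step described above can be sketched with ffmpeg's concat demuxer. The manifest format and the chunk file names below are hypothetical placeholders, not something the answer specifies; the ffmpeg command itself is left commented out, since running it assumes real chunk files and an ffmpeg install on the server.

```shell
# Hypothetical sequence manifest: one chunk file name per line, in upload order.
printf 'chunk_0001.mp4\nchunk_0002.mp4\nchunk_0003.mp4\n' > manifest.txt

# ffmpeg's concat demuxer expects a list file with lines of the form: file 'name'
: > chunks.txt
while IFS= read -r name; do
  printf "file '%s'\n" "$name" >> chunks.txt
done < manifest.txt

cat chunks.txt

# On the server, where ffmpeg is installed, the chunks would then be stitched
# into the main video without re-encoding:
# ffmpeg -f concat -safe 0 -i chunks.txt -c copy main_video.mp4
```

As new chunks arrive, you append them to the manifest, regenerate the list file, and re-run the concat step; `-c copy` stream-copies rather than re-encodes, so stitching stays fast.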
Thank you. Hope this helps.