Streaming video FROM an iPhone

Question
I can get individual frames from the iPhone's camera just fine. What I need is a way to package them up with sound for streaming to a server. Sending the files once I have them isn't much of an issue; it's generating the files for streaming that I'm having problems with. I've been trying to get FFMpeg to work, without much luck.
Does anyone have any ideas on how I can pull this off? I'd like a known working API, or instructions on getting FFMpeg to compile properly in an iPhone app.
Answer
You could divide your recording into separate files of, say, 10 seconds each, then send them separately. If you use AVCaptureSession's beginConfiguration and commitConfiguration methods to batch your output changes, you shouldn't drop any frames between files. This has several advantages over frame-by-frame upload:
- The files can be used directly for HTTP Live Streaming, without any server-side processing.
- If the connection is fast enough, the gaps between data transfers let the radio sleep in between, saving battery life.
- Conversely, if the connection is slow enough that uploading lags behind recording, managing delayed uploads of a set of files is much easier than managing a byte stream.
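As a rough sketch of the segmented-recording idea, the snippet below uses AVFoundation's AVCaptureMovieFileOutput and caps each file at 10 seconds with maxRecordedDuration, rolling to a new file in the delegate callback; this is a simplification of the answer's begin/commitConfiguration approach, and the class name, segment naming scheme, and uploadInBackground helper are hypothetical placeholders, not part of the original answer.

```swift
import AVFoundation

// Hypothetical sketch: record 10-second segments, uploading each as it finishes.
final class SegmentedRecorder: NSObject, AVCaptureFileOutputRecordingDelegate {
    let session = AVCaptureSession()
    let movieOutput = AVCaptureMovieFileOutput()
    private var segmentIndex = 0

    func start() throws {
        // Batch the input/output changes so the session reconfigures atomically.
        session.beginConfiguration()
        if let camera = AVCaptureDevice.default(for: .video) {
            session.addInput(try AVCaptureDeviceInput(device: camera))
        }
        if let mic = AVCaptureDevice.default(for: .audio) {
            session.addInput(try AVCaptureDeviceInput(device: mic))
        }
        session.addOutput(movieOutput)
        session.commitConfiguration()
        session.startRunning()

        // Cap each segment at 10 s; the delegate callback starts the next one.
        movieOutput.maxRecordedDuration = CMTime(seconds: 10, preferredTimescale: 600)
        movieOutput.startRecording(to: nextSegmentURL(), recordingDelegate: self)
    }

    private func nextSegmentURL() -> URL {
        segmentIndex += 1
        return FileManager.default.temporaryDirectory
            .appendingPathComponent("segment-\(segmentIndex).mov")
    }

    // Called when a segment hits maxRecordedDuration (or recording stops).
    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        // Hand the finished segment off for upload, then roll to the next file.
        uploadInBackground(outputFileURL)
        movieOutput.startRecording(to: nextSegmentURL(), recordingDelegate: self)
    }

    private func uploadInBackground(_ url: URL) {
        // Placeholder: queue the file for upload (e.g. a URLSession upload task).
    }
}
```

Capping the duration on the output, rather than using a timer, keeps the file rotation frame-accurate; the trade-off is a brief gap between segments, which the begin/commitConfiguration batching in the answer is meant to minimize.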