Upload live streaming video from iPhone like Ustream or Qik


Question

How do you live stream video from an iPhone to a server, like Ustream or Qik do? I know there's something called HTTP Live Streaming from Apple, but most resources I found only talk about streaming video from the server to the iPhone.

Is Apple's HTTP Live Streaming something I should use? Or something else? Thanks.

Answer

There isn't a built-in way to do this, as far as I know. As you say, HTTP Live Streaming is for downloads to the iPhone.

The way I'm doing it is to implement an AVCaptureSession, which has a delegate with a callback that's run on every frame. That callback sends each frame over the network to the server, which has a custom setup to receive it.

Here's the flow: https://developer.apple.com/library/content/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW2

And here's some code:

// make input device
NSError *deviceError = nil;
AVCaptureDevice *cameraDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *inputDevice = [AVCaptureDeviceInput deviceInputWithDevice:cameraDevice error:&deviceError];

// make output device; self must conform to AVCaptureVideoDataOutputSampleBufferDelegate
AVCaptureVideoDataOutput *outputDevice = [[AVCaptureVideoDataOutput alloc] init];
[outputDevice setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

// initialize capture session (autorelease implies manual reference counting; omit it under ARC)
AVCaptureSession *captureSession = [[[AVCaptureSession alloc] init] autorelease];
[captureSession addInput:inputDevice];
[captureSession addOutput:outputDevice];

// make preview layer and add it so that the camera's view is displayed on screen
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
previewLayer.frame = view.bounds;
[view.layer addSublayer:previewLayer];

// go!
[captureSession startRunning];

Then the output device's delegate (here, self) has to implement the callback:

-(void) captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer( sampleBuffer );
    CGSize imageSize = CVImageBufferGetEncodedSize( imageBuffer );
    // also in the 'mediaSpecific' dict of the sampleBuffer

    NSLog( @"frame captured at %.fx%.f", imageSize.width, imageSize.height );
}
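The answer doesn't include the network code itself, since the server side is a custom setup. As a rough sketch of one way to push a single frame over HTTP (the JPEG compression, the `sendFrame:` method name, and the endpoint URL are all illustrative assumptions, not part of the original answer):

```objective-c
// Hypothetical sketch: render one captured frame to JPEG and POST it.
// The endpoint URL is an assumption; a real server would define its own protocol.
- (void)sendFrame:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Render the pixel buffer into a CGImage so it can be JPEG-compressed
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:ciImage.extent];
    NSData *jpeg = UIImageJPEGRepresentation([UIImage imageWithCGImage:cgImage], 0.5);
    CGImageRelease(cgImage);

    // POST the frame; http://example.com/ingest is a placeholder endpoint
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:
        [NSURL URLWithString:@"http://example.com/ingest"]];
    request.HTTPMethod = @"POST";
    request.HTTPBody = jpeg;
    [[[NSURLSession sharedSession] dataTaskWithRequest:request] resume];
}
```

Sending raw JPEGs per frame is bandwidth-hungry; it's only workable for low frame rates, which is one reason for the chunked approach described below.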

EDIT/UPDATE

Several people have asked how to do this without sending the frames to the server one by one. The answer is complex...

Basically, in the didOutputSampleBuffer function above, you add the samples into an AVAssetWriter. I actually had three asset writers active at a time -- past, present, and future -- managed on different threads.
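Feeding the callback's samples into a writer might look like the following sketch (the `currentWriterInput` property is an assumed AVAssetWriterInput attached to the "present" writer; the original answer doesn't show this code):

```objective-c
// Hedged sketch: append each captured frame to the current chunk's writer.
// `self.currentWriterInput` is an assumed AVAssetWriterInput property, set up
// with AVVideoCodecH264 output settings when its AVAssetWriter was created.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Drop the frame if the writer can't keep up, rather than blocking capture
    if (self.currentWriterInput.isReadyForMoreMediaData) {
        [self.currentWriterInput appendSampleBuffer:sampleBuffer];
    }
}
```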

The past writer is in the process of closing its movie file and uploading it. The current writer is receiving the sample buffers from the camera. The future writer is in the process of opening a new movie file and preparing it for data. Every 5 seconds, I set past=current; current=future and restart the sequence.
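The rotation itself might be sketched like this (the property and method names -- `rotateWriters`, `makeWriterForNextChunk`, `uploadChunkAtURL:` -- are illustrative assumptions, not from the original answer):

```objective-c
// Illustrative sketch of the past/present/future rotation, invoked every
// 5 seconds (e.g. from an NSTimer or a dispatch timer).
- (void)rotateWriters
{
    AVAssetWriter *finished = self.currentWriter;       // becomes the "past" writer
    self.currentWriter = self.futureWriter;             // now receives sample buffers
    self.futureWriter  = [self makeWriterForNextChunk]; // pre-opens the next movie file

    // Close the finished 5-second chunk asynchronously and upload it,
    // keeping file I/O off the capture path.
    [finished finishWritingWithCompletionHandler:^{
        [self uploadChunkAtURL:finished.outputURL];
    }];
}
```

Pre-opening the future writer matters because starting an AVAssetWriter isn't instantaneous; doing it ahead of time means no frames are dropped at the chunk boundary.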

This then uploads video in 5-second chunks to the server. You can stitch the videos together with ffmpeg if you want, or transcode them into MPEG-2 transport streams for HTTP Live Streaming. The video data itself is H.264-encoded by the asset writer, so this step is really just a remux: the container changes, but the encoded video is not re-compressed.
