Upload live streaming video from iPhone like Ustream or Qik


Question


How can I live stream video from the iPhone to a server, the way Ustream or Qik do? I know there's something called HTTP Live Streaming from Apple, but most of the resources I found only talk about streaming video from the server to the iPhone.


Is Apple's HTTP Live Streaming something I should use here, or something else? Thanks.

Answer


There isn't a built-in way to do this, as far as I know. As you say, HTTP Live Streaming is for downloads to the iPhone.


The way I'm doing it is to implement an AVCaptureSession, which has a delegate with a callback that's run on every frame. That callback sends each frame over the network to the server, which has a custom setup to receive it.

Here's the outline: https://developer.apple.com/library/content/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW2

And here's some code:

// make input device
NSError *deviceError = nil;
AVCaptureDevice *cameraDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *inputDevice = [AVCaptureDeviceInput deviceInputWithDevice:cameraDevice error:&deviceError];

// make output device
AVCaptureVideoDataOutput *outputDevice = [[AVCaptureVideoDataOutput alloc] init];
[outputDevice setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

// initialize capture session (keep a strong reference to it, e.g. in an ivar, so it isn't released while capturing)
AVCaptureSession *captureSession = [[[AVCaptureSession alloc] init] autorelease];
[captureSession addInput:inputDevice];
[captureSession addOutput:outputDevice];

// make preview layer and add so that camera's view is displayed on screen
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
previewLayer.frame = view.bounds;
[view.layer addSublayer:previewLayer];

// go!
[captureSession startRunning];


Then the output device's delegate (here, self) has to implement the callback:

-(void) captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer( sampleBuffer );
    CGSize imageSize = CVImageBufferGetEncodedSize( imageBuffer );
    // also in the 'mediaSpecific' dict of the sampleBuffer

    NSLog( @"frame captured at %.fx%.f", imageSize.width, imageSize.height );
}
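
The answer above doesn't show the "send it over the network" part. As a rough sketch of one way that could look, called from the captureOutput: callback above: render the pixel buffer to JPEG and POST it. The endpoint URL, the JPEG-per-frame approach, and the method name are assumptions made for illustration, not part of the original answer, and doing this work on the main queue for every frame would be expensive in practice.

#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>
#import <UIKit/UIKit.h>

// Hypothetical helper: compress one captured frame to JPEG and upload it.
-(void) sendFrameToServer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer( sampleBuffer );

    // Render the pixel buffer to a CGImage, then to JPEG data.
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIContext *ciContext = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [ciContext createCGImage:ciImage fromRect:[ciImage extent]];
    NSData *jpegData = UIImageJPEGRepresentation([UIImage imageWithCGImage:cgImage], 0.5);
    CGImageRelease(cgImage);

    // POST the frame to a made-up endpoint; the server-side receiver is up to you.
    NSMutableURLRequest *request =
        [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"https://example.com/upload-frame"]];
    request.HTTPMethod = @"POST";
    [request setValue:@"image/jpeg" forHTTPHeaderField:@"Content-Type"];

    NSURLSessionUploadTask *task =
        [[NSURLSession sharedSession] uploadTaskWithRequest:request
                                                   fromData:jpegData
                                          completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
                                              if (error) NSLog(@"frame upload failed: %@", error);
                                          }];
    [task resume];
}

Sending a JPEG per frame is wasteful compared with the chunked approach described in the edit below, but it's the simplest way to get pixels off the device while prototyping.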




EDIT/UPDATE

Several people have asked how to do this without sending the frames to the server one by one. The answer is complex...


Basically, in the didOutputSampleBuffer function above, you add the samples into an AVAssetWriter. I actually had three asset writers active at a time -- past, present, and future -- managed on different threads.
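
Not the code from the answer itself, but here is a minimal sketch of that idea; the chunk file URL, the 640x480 output size, and the currentWriter property are assumptions invented for illustration:

#import <AVFoundation/AVFoundation.h>

// Create one asset writer per chunk (file URL and dimensions are assumptions).
-(AVAssetWriter*) makeWriterForURL:(NSURL*)chunkURL
{
    NSError *error = nil;
    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:chunkURL
                                                     fileType:AVFileTypeQuickTimeMovie
                                                        error:&error];

    NSDictionary *videoSettings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                     AVVideoWidthKey  : @640,
                                     AVVideoHeightKey : @480 };
    AVAssetWriterInput *input =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                           outputSettings:videoSettings];
    input.expectsMediaDataInRealTime = YES;   // required for live capture
    [writer addInput:input];
    return writer;
}

// In the capture callback, append each sample buffer to whichever writer is "current".
-(void) captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection
{
    AVAssetWriter *writer = self.currentWriter;             // assumed property
    AVAssetWriterInput *input = [writer.inputs firstObject];

    if (writer.status == AVAssetWriterStatusUnknown) {
        [writer startWriting];
        [writer startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }
    if (writer.status == AVAssetWriterStatusWriting && input.isReadyForMoreMediaData) {
        [input appendSampleBuffer:sampleBuffer];
    }
}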


The past writer is in the process of closing the movie file and uploading it. The current writer is receiving the sample buffers from the camera. The future writer is in the process of opening a new movie file and preparing it for data. Every 5 seconds, I set past=current; current=future and restart the sequence.
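
As a sketch of that rotation (again my own illustration, reusing the assumed currentWriter/futureWriter properties and -makeWriterForURL: from above, plus a hypothetical -uploadChunkAtURL: helper), a method like this would run every 5 seconds, for example from a repeating NSTimer:

// Rotate past <- current <- future, finish the completed chunk, and upload it.
-(void) rotateWriters
{
    AVAssetWriter *past = self.currentWriter;

    // Promote: the already-prepared "future" writer starts taking frames immediately,
    // and a fresh writer is opened to become the next "future".
    self.currentWriter = self.futureWriter;
    self.futureWriter  = [self makeWriterForURL:[self nextChunkURL]];   // nextChunkURL is assumed

    // Close out the previous 5-second chunk, then hand the finished file to the uploader.
    [[past.inputs firstObject] markAsFinished];
    [past finishWritingWithCompletionHandler:^{
        [self uploadChunkAtURL:past.outputURL];   // hypothetical upload helper
    }];
}

Preparing the "future" writer ahead of time is what keeps frames from being dropped: when the rotation happens, the next sample buffer lands in a file that is already open, while the old file closes and uploads in the background.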


This then uploads the video to the server in 5-second chunks. You can stitch the chunks together with ffmpeg if you want, or repackage them into MPEG-2 transport streams for HTTP Live Streaming. The video data itself is already H.264-encoded by the asset writer, so this step only changes the container around it; the frames are not re-encoded.
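
The answer doesn't include the server-side commands, but assuming the uploaded chunks arrive as chunk0.mov, chunk1.mov, ... and contain only H.264 video, something along these lines would repackage them without re-encoding:

# Remux one chunk into an MPEG-2 transport stream (stream copy, no re-encode)
ffmpeg -i chunk0.mov -c copy -bsf:v h264_mp4toannexb -f mpegts chunk0.ts

# Or stitch several remuxed chunks back into a single file
ffmpeg -i "concat:chunk0.ts|chunk1.ts|chunk2.ts" -c copy stitched.ts

Listing the resulting .ts segments in an .m3u8 playlist is then enough to serve them as an HTTP Live Stream.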
