Quickblox video chat saving


Question

I am using the QuickBlox iOS SDK for video chatting in my app. It works fine. Now I want to record the chat video and save it to the camera roll. How can I do that? I have gone through their documentation and implemented this:

-(IBAction)record:(id)sender {
    // Create video chat
    videoChat = [[QBChat instance] createAndRegisterVideoChatInstance];
    [videoChat setIsUseCustomVideoChatCaptureSession:YES];

    // Create capture session
    captureSession = [[AVCaptureSession alloc] init];

    // ... setup capture session here

    // Create a serial queue to handle the processing of our frames
    dispatch_queue_t callbackQueue = dispatch_queue_create("cameraQueue", NULL);
    [videoCaptureOutput setSampleBufferDelegate:self queue:callbackQueue];

    // Start the capture
    [captureSession startRunning];
}

-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Do something with samples
    // ...

    // Forward video samples to the SDK
    [videoChat processVideoChatCaptureVideoSample:sampleBuffer];
}

But I am not sure what to do from here. How should I get the video data?

Answer

From the quickblox docs:

To set up a custom video capture session you simply follow these steps:

1. Create an instance of AVCaptureSession
2. Set up the input and output
3. Implement the frames callback and forward all frames to the QuickBlox iOS SDK
4. Tell the QuickBlox SDK that you will use your own capture session

To set up a custom video capture session, set up the input and output:

-(void)setupVideoCapture {
    self.captureSession = [[AVCaptureSession alloc] init];

    __block NSError *error = nil;

    // Set preset
    [self.captureSession setSessionPreset:AVCaptureSessionPresetLow];

    // Setup the video input
    AVCaptureDevice *videoDevice = [self frontFacingCamera];
    AVCaptureDeviceInput *captureVideoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if (error) {
        QBDLogEx(@"deviceInputWithDevice Video error: %@", error);
    } else {
        if ([self.captureSession canAddInput:captureVideoInput]) {
            [self.captureSession addInput:captureVideoInput];
        } else {
            QBDLogEx(@"cantAddInput Video");
        }
    }

    // Setup the video output
    AVCaptureVideoDataOutput *videoCaptureOutput = [[AVCaptureVideoDataOutput alloc] init];
    videoCaptureOutput.alwaysDiscardsLateVideoFrames = YES;

    // Set the video output to store frames in BGRA (it is supposed to be faster)
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [videoCaptureOutput setVideoSettings:videoSettings];

    // Add the output to the capture session
    if ([self.captureSession canAddOutput:videoCaptureOutput]) {
        [self.captureSession addOutput:videoCaptureOutput];
    } else {
        QBDLogEx(@"cantAddOutput");
    }
    [videoCaptureOutput release];

    // Set FPS
    int framesPerSecond = 3;
    AVCaptureConnection *conn = [videoCaptureOutput connectionWithMediaType:AVMediaTypeVideo];
    if (conn.isVideoMinFrameDurationSupported) {
        conn.videoMinFrameDuration = CMTimeMake(1, framesPerSecond);
    }
    if (conn.isVideoMaxFrameDurationSupported) {
        conn.videoMaxFrameDuration = CMTimeMake(1, framesPerSecond);
    }

    // Create a serial queue to handle the processing of our frames
    dispatch_queue_t callbackQueue = dispatch_queue_create("cameraQueue", NULL);
    [videoCaptureOutput setSampleBufferDelegate:self queue:callbackQueue];
    dispatch_release(callbackQueue);

    // Add a preview layer
    AVCaptureVideoPreviewLayer *prewLayer = [[[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession] autorelease];
    [prewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    CGRect layerRect = [[myVideoView layer] bounds];
    [prewLayer setBounds:layerRect];
    [prewLayer setPosition:CGPointMake(CGRectGetMidX(layerRect), CGRectGetMidY(layerRect))];
    myVideoView.hidden = NO;
    [myVideoView.layer addSublayer:prewLayer];

    // Start the capture
    [self.captureSession startRunning];
}

- (AVCaptureDevice *) cameraWithPosition:(AVCaptureDevicePosition) position{
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if ([device position] == position) {
            return device;
        }
    }
    return nil;
}


- (AVCaptureDevice *) backFacingCamera{
    return [self cameraWithPosition:AVCaptureDevicePositionBack];
}

- (AVCaptureDevice *) frontFacingCamera{
    return [self cameraWithPosition:AVCaptureDevicePositionFront];
}

Implement the frames callback:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Usually we just forward camera frames to the QuickBlox SDK,
    // but we can also do something with them first, for example apply some video filters
    [self.videoChat processVideoChatCaptureVideoSample:sampleBuffer];
}
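The callback above only forwards frames to QuickBlox; it does not record them. Because you own the capture session, you can additionally feed the same sample buffers into an AVAssetWriter to produce a movie file. Below is a minimal sketch of that idea; the ivars `assetWriter`, `writerInput`, and `isRecording`, the helper method names, and the 480x320 output size are all hypothetical choices, not part of the original answer, and the snippet follows the pre-ARC memory management style used elsewhere in this post:

    // Hypothetical ivars, assumed declared on the class:
    //   AVAssetWriter *assetWriter;
    //   AVAssetWriterInput *writerInput;
    //   BOOL isRecording;

    - (void)startRecordingToURL:(NSURL *)outputURL {
        NSError *error = nil;
        assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL
                                                fileType:AVFileTypeQuickTimeMovie
                                                   error:&error];
        NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                                  AVVideoCodecH264, AVVideoCodecKey,
                                  [NSNumber numberWithInt:480], AVVideoWidthKey,
                                  [NSNumber numberWithInt:320], AVVideoHeightKey,
                                  nil];
        writerInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
                                                     outputSettings:settings];
        writerInput.expectsMediaDataInRealTime = YES;
        [assetWriter addInput:writerInput];
        isRecording = YES;
    }

    // Call this from captureOutput:didOutputSampleBuffer:fromConnection:,
    // next to the processVideoChatCaptureVideoSample: call
    - (void)appendRecordedSample:(CMSampleBufferRef)sampleBuffer {
        if (!isRecording) return;
        if (assetWriter.status == AVAssetWriterStatusUnknown) {
            // Start the writer session at the first frame's timestamp
            [assetWriter startWriting];
            [assetWriter startSessionAtSourceTime:
                CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
        }
        if (writerInput.isReadyForMoreMediaData) {
            [writerInput appendSampleBuffer:sampleBuffer];
        }
    }

    - (void)stopRecording {
        isRecording = NO;
        [writerInput markAsFinished];
        [assetWriter finishWriting]; // the movie file is now complete at outputURL
        [writerInput release];
        [assetWriter release];
    }

Note that the capture session delivers frames in BGRA at a low preset and 3 fps here, so the recorded movie will have the same modest quality as the chat stream itself.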

Tell the QuickBlox iOS SDK that we use our own video capture session:

self.videoChat = [[QBChat instance] createAndRegisterVideoChatInstance];
self.videoChat.viewToRenderOpponentVideoStream = opponentVideoView;
//
// we use own video capture session
self.videoChat.isUseCustomVideoChatCaptureSession = YES;
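The original question also asked how to save the result to the camera roll. Once the recorded frames have been written to a movie file (for example via an AVAssetWriter), UIKit can copy that file into the saved photos album. A sketch of that final step; `outputURL` and the method names are hypothetical, and the completion selector signature is the one UIKit requires:

    // Assumes the recorded frames were already written to a movie file
    // at outputURL (a hypothetical variable)
    - (void)saveRecordingAtURL:(NSURL *)outputURL {
        NSString *path = [outputURL path];
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(path)) {
            UISaveVideoAtPathToSavedPhotosAlbum(path, self,
                @selector(video:didFinishSavingWithError:contextInfo:), NULL);
        }
    }

    // UIKit calls this selector when the save completes
    - (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error
                                                     contextInfo:(void *)contextInfo {
        if (error) {
            NSLog(@"Saving to camera roll failed: %@", error);
        } else {
            NSLog(@"Video saved to camera roll");
        }
    }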
