Swift: take photo from AVFoundation


Question

I want to translate this function from Objective-C to Swift, but I can't translate part of the code. Could someone explain how to take a photo with AVFoundation, or help me translate this function?

- (void)capImage { // method to capture an image from the AVCaptureSession video feed
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                   completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        if (imageSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            [self processImage:[UIImage imageWithData:imageData]];
        }
    }];
}

What I tried, but it isn't working:

func takePhoto(sender: UIButton) {

    var videoConnection: AVCaptureConnection
    var connection: AVCaptureConnection
    var port: AVCaptureInputPort

    for connection in stillImageOutput?.connections {

        for (port in connection.inputPorts as AVCaptureInputPort) {

            if port = AVMediaTypeVideo {
                videoConnection = connection
                break
            }
        }

        if videoConnection {
            break
        }
    }

    stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (imageSampleBuffer, error) in
        if (imageSampleBuffer != nil) {
            var imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageSampleBuffer as CMSampleBuffer)
            var image: UIImage = UIImage(data: imageData)
        }
    })
}

Can anyone help me?

Solution

OK, I found a solution:

func takePhoto() {

    if let stillOutput = self.stillImageOutput {
        // do this on another thread so that we don't hang the UI
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
            // find the video connection
            var videoConnection: AVCaptureConnection?
            for connection in stillOutput.connections {
                // find a matching input port
                for port in connection.inputPorts! {
                    if port.mediaType == AVMediaTypeVideo {
                        videoConnection = connection as? AVCaptureConnection
                        break // for port
                    }
                }

                if videoConnection != nil {
                    break // for connections
                }
            }

            if videoConnection != nil {
                stillOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) {
                    (imageSampleBuffer: CMSampleBuffer!, _) in

                    let imageDataJpeg = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageSampleBuffer)
                    let pickedImage = UIImage(data: imageDataJpeg)
                    // the completion handler runs on a background queue:
                    // hand pickedImage back to the main queue before touching any UI
                }
                self.captureSession.stopRunning()
            }
        }
    }
}
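For this to work, the class also needs a running AVCaptureSession with an AVCaptureStillImageOutput attached. Below is a minimal setup sketch, assuming the same iOS 8-era Swift APIs as the answer above; the property names captureSession and stillImageOutput are the ones takePhoto() refers to, while the CameraViewController class name and the photo preset are just illustrative:

import UIKit
import AVFoundation

class CameraViewController: UIViewController {

    let captureSession = AVCaptureSession()
    var stillImageOutput: AVCaptureStillImageOutput?

    override func viewDidLoad() {
        super.viewDidLoad()

        captureSession.sessionPreset = AVCaptureSessionPresetPhoto

        // back camera as the video input
        let device = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
        var error: NSError?
        let input = AVCaptureDeviceInput(device: device, error: &error)
        if error == nil && captureSession.canAddInput(input) {
            captureSession.addInput(input)
        }

        // JPEG still image output that takePhoto() captures from
        let output = AVCaptureStillImageOutput()
        output.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
        if captureSession.canAddOutput(output) {
            captureSession.addOutput(output)
            stillImageOutput = output
        }

        captureSession.startRunning()
    }
}

With that in place, takePhoto() can be wired to a button action. Since the capture completion handler runs on a background queue, dispatch back to the main queue before assigning pickedImage to an image view or otherwise updating the UI.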
