Capturing still image with AVFoundation


Question

I'm currently creating a simple application which uses AVFoundation to stream video into a UIImageView.

To achieve this, I created an instance of AVCaptureSession() and an AVCaptureSessionPreset():

let input = try AVCaptureDeviceInput(device: device)
print(input)
if captureSession.canAddInput(input) {
    captureSession.addInput(input)

    if captureSession.canAddOutput(sessionOutput) {
        captureSession.addOutput(sessionOutput)
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
        previewLayer.connection.videoOrientation = AVCaptureVideoOrientation.portrait

        cameraView.layer.addSublayer(previewLayer)
        captureSession.startRunning()
    }
}
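For reference, the snippet above assumes roughly these surrounding declarations (the property names are taken from my code; the exact types reflect my setup and are not the only option):

```swift
import UIKit
import AVFoundation

class ViewController: UIViewController {
    @IBOutlet weak var cameraView: UIImageView!

    let captureSession = AVCaptureSession()
    let sessionOutput = AVCapturePhotoOutput()
    var previewLayer = AVCaptureVideoPreviewLayer()
    // device is an AVCaptureDevice found earlier (e.g. the back camera)
}
```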

cameraView refers to a UIImageView outlet.

I now want to implement a way of capturing a still image from the AVCaptureSession.

Correct me if there's a more efficient way, but I plan to have an additional UIImageView to hold the still image, placed on top of the UIImageView which holds the video.
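A minimal sketch of that overlay idea (captureImageView is a hypothetical second outlet pinned to the same edges as cameraView, hidden until a photo is taken):

```swift
// Hypothetical second UIImageView stacked above cameraView in Interface Builder
@IBOutlet weak var captureImageView: UIImageView!

override func viewDidLoad() {
    super.viewDidLoad()
    // Match the preview layer's aspect-fill so the still lines up with the video
    captureImageView.contentMode = .scaleAspectFill
    captureImageView.isHidden = true
}
```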

I've created a button with an action:

@IBAction func takePhoto(_ sender: Any) {
    // functionality to obtain still image
}

My issue is, I'm unsure how to actually obtain a still image from the capture session and populate the new UIImageView with it.

After looking at information/questions posted on Stack, most of the solutions use:

captureStillImageAsynchronouslyFromConnection

I'm unsure if it's just Swift 3.0, but Xcode isn't recognising this function.

Could someone please advise me on how to actually achieve the result of obtaining and displaying a still image upon button click.

Here is a link to my full code for better understanding of my program.

Thank you all in advance for taking the time to read my question, and please feel free to tell me in case I've missed out some relevant data.

Answer

If you are targeting iOS 10 or above, captureStillImageAsynchronously(from:completionHandler:) is deprecated along with AVCaptureStillImageOutput.

As per the documentation:

The AVCaptureStillImageOutput class is deprecated in iOS 10.0 and does not support newer camera capture features such as RAW image output, Live Photos, or wide-gamut color. In iOS 10.0 and later, use the AVCapturePhotoOutput class instead. (The AVCaptureStillImageOutput class remains supported in macOS 10.12.)

As per your code, you are already using AVCapturePhotoOutput, so just follow the steps below to take a photo from the session. The same steps can be found in Apple's documentation.

  1. Create an AVCapturePhotoOutput object. Use its properties to determine supported capture settings and to enable certain features (for example, whether to capture Live Photos).
  2. Create and configure an AVCapturePhotoSettings object to choose features and settings for a specific capture (for example, whether to enable image stabilization or flash).
  3. Capture an image by passing your photo settings object, along with a delegate object implementing the AVCapturePhotoCaptureDelegate protocol, to the capturePhoto(with:delegate:) method. The photo capture output then calls your delegate to notify you of significant events during the capture process.
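Steps 1 and 2 might be sketched like this (the property names match the question's code; the JPEG format dictionary and flash mode are assumptions, not required settings):

```swift
// Step 1: create the photo output (added to the session with
// captureSession.addOutput(sessionOutput), as in the question's code)
let sessionOutput = AVCapturePhotoOutput()

// Step 2: configure the settings for a capture. Note that Apple recommends
// creating a fresh AVCapturePhotoSettings object for each capture.
let sessionOutputSetting = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecJPEG])
sessionOutputSetting.flashMode = .auto
```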

You are already doing steps 1 and 2, so add this line in your code:

@IBAction func takePhoto(_ sender: Any) {
    print("Taking Photo")
    sessionOutput.capturePhoto(with: sessionOutputSetting, delegate: self as! AVCapturePhotoCaptureDelegate)
}

and implement the AVCapturePhotoCaptureDelegate function:

optional public func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?)

Note that this delegate gives you a lot of control over taking photos. Check out the documentation for more functions. You also need to process the image data, which means you have to convert the sample buffer to a UIImage.

if let sampleBuffer = photoSampleBuffer,
    let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewPhotoSampleBuffer),
    let dataProvider = CGDataProvider(data: imageData as CFData),
    let cgImageRef = CGImage(jpegDataProviderSource: dataProvider, decode: nil, shouldInterpolate: true, intent: .defaultIntent) {
    let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: .right)
    // ...
    // Add the image to captureImageView here...
}

Note that the image you get is rotated left, so you have to manually rotate it right to get a preview-like image.
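Putting the delegate method and the conversion together, one possible shape for the handler is below (captureImageView is a hypothetical image view for the still; using UIImage(data:) is a shortcut that respects the orientation embedded in the JPEG data):

```swift
func capture(_ captureOutput: AVCapturePhotoOutput,
             didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?,
             previewPhotoSampleBuffer: CMSampleBuffer?,
             resolvedSettings: AVCaptureResolvedPhotoSettings,
             bracketSettings: AVCaptureBracketedStillImageSettings?,
             error: Error?) {
    if let error = error {
        print("Photo capture failed: \(error)")
        return
    }
    guard let sampleBuffer = photoSampleBuffer,
        let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(
            forJPEGSampleBuffer: sampleBuffer,
            previewPhotoSampleBuffer: previewPhotoSampleBuffer) else { return }
    // Display the captured still in the overlay image view
    captureImageView.image = UIImage(data: imageData)
}
```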

More info can be found in my previous SO answer

