Unable to use AVCapturePhotoOutput to capture photo swift + xcode
Problem Description
I am working on a custom camera app. The tutorial I followed uses AVCaptureStillImageOutput, which is deprecated in iOS 10. I have set up the camera and am now stuck on how to take the photo.
Here is my full view where I have the camera:
import UIKit
import AVFoundation

var cameraPos = "back"

class View3: UIViewController, UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    @IBOutlet weak var clickButton: UIButton!
    @IBOutlet var cameraView: UIView!

    var session: AVCaptureSession?
    var stillImageOutput: AVCapturePhotoOutput?
    var videoPreviewLayer: AVCaptureVideoPreviewLayer?

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        clickButton.center.x = cameraView.bounds.width / 2
        loadCamera()
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
    }

    @IBAction func clickCapture(_ sender: UIButton) {
        if let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo) {
            // This is where I need help
        }
    }

    @IBAction func changeDevice(_ sender: UIButton) {
        if cameraPos == "back" {
            cameraPos = "front"
        } else {
            cameraPos = "back"
        }
        loadCamera()
    }

    func loadCamera() {
        session?.stopRunning()
        videoPreviewLayer?.removeFromSuperlayer()

        session = AVCaptureSession()
        session!.sessionPreset = AVCaptureSessionPresetPhoto

        var backCamera = AVCaptureDevice.defaultDevice(withDeviceType: .builtInWideAngleCamera, mediaType: AVMediaTypeVideo, position: .front)
        if cameraPos == "back" {
            backCamera = AVCaptureDevice.defaultDevice(withDeviceType: .builtInWideAngleCamera, mediaType: AVMediaTypeVideo, position: .back)
        }

        var error: NSError?
        var input: AVCaptureDeviceInput!
        do {
            input = try AVCaptureDeviceInput(device: backCamera)
        } catch let error1 as NSError {
            error = error1
            input = nil
            print(error!.localizedDescription)
        }

        if error == nil && session!.canAddInput(input) {
            session!.addInput(input)
            stillImageOutput = AVCapturePhotoOutput()
            if session!.canAddOutput(stillImageOutput) {
                session!.addOutput(stillImageOutput)
                videoPreviewLayer = AVCaptureVideoPreviewLayer(session: session)
                videoPreviewLayer?.frame = cameraView.bounds
                videoPreviewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
                videoPreviewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.portrait
                cameraView.layer.addSublayer(videoPreviewLayer!)
                session!.startRunning()
            }
        }
    }
}
This is where I need help:
@IBAction func clickCapture(_ sender: UIButton) {
    if let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo) {
        // This is where I need help
    }
}
I have gone through the answer here (How to use AVCapturePhotoOutput), but I do not understand how to incorporate that code into this code, as it involves declaring a new class.
Recommended Answer
You are almost there.

Check out the AVCapturePhotoOutput documentation (https://developer.apple.com/reference/avfoundation/avcapturephotooutput) for more help.
These are the steps to capture a photo:
1. Create an AVCapturePhotoOutput object. Use its properties to determine supported capture settings and to enable certain features (for example, whether to capture Live Photos).
2. Create and configure an AVCapturePhotoSettings object to choose features and settings for a specific capture (for example, whether to enable image stabilization or flash).
3. Capture an image by passing your photo settings object to the capturePhoto(with:delegate:) method along with a delegate object implementing the AVCapturePhotoCaptureDelegate protocol. The photo capture output then calls your delegate to notify you of significant events during the capture process.
Put the code below in your clickCapture method, and don't forget to conform to and implement the delegate protocol in your class.
let settings = AVCapturePhotoSettings()
let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
let previewFormat = [
    kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
    kCVPixelBufferWidthKey as String: 160,
    kCVPixelBufferHeightKey as String: 160
]
settings.previewPhotoFormat = previewFormat
self.stillImageOutput?.capturePhoto(with: settings, delegate: self)
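For completeness, here is a sketch of what the delegate conformance might look like, using the Swift 3 / iOS 10 AVCapturePhotoCaptureDelegate callback. The class name View3 and the JPEG settings come from the question above; the specific error handling and the comment about where to use the image are assumptions, not part of the original answer.

```swift
// Sketch: conforming View3 to AVCapturePhotoCaptureDelegate (Swift 3 API names).
extension View3: AVCapturePhotoCaptureDelegate {

    func capture(_ captureOutput: AVCapturePhotoOutput,
                 didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?,
                 previewPhotoSampleBuffer: CMSampleBuffer?,
                 resolvedSettings: AVCaptureResolvedPhotoSettings,
                 bracketSettings: AVCaptureBracketedStillImageSettings?,
                 error: Error?) {
        guard error == nil,
            let sampleBuffer = photoSampleBuffer,
            // Convert the captured sample buffer into JPEG data.
            let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(
                forJPEGSampleBuffer: sampleBuffer,
                previewPhotoSampleBuffer: previewPhotoSampleBuffer),
            let image = UIImage(data: imageData) else {
                print("Error capturing photo: \(String(describing: error))")
                return
        }
        // Use `image` here, e.g. show it in an image view or save it.
    }
}
```

Note that in Swift 4 / iOS 11 this callback was replaced by photoOutput(_:didFinishProcessingPhoto:error:), so this signature applies to the iOS 10-era API the question targets.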
For output as AVCaptureStillImageOutput (the API deprecated in iOS 10), if you intend to snap a photo from the video connection, you can follow the steps below.
Step 1: Get the connection
if let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo) {
    // ...
    // Code for photo capture goes here...
}
Step 2: Capture the photo
1. Call the captureStillImageAsynchronously(from:completionHandler:) function on stillImageOutput.
2. The sampleBuffer passed to the completion handler represents the data that is captured.
stillImageOutput?.captureStillImageAsynchronously(from: videoConnection, completionHandler: { (sampleBuffer, error) -> Void in
    // ...
    // Process the image data (sampleBuffer) here to get an image file we can put in our captureImageView
})
Step 3: Process the image data

We need to process the image data in sampleBuffer so that we end up with a UIImage, which we can insert into captureImageView and easily use elsewhere in the app.
if let sampleBuffer = sampleBuffer,
    let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer),
    let dataProvider = CGDataProvider(data: imageData as CFData),
    let cgImageRef = CGImage(jpegDataProviderSource: dataProvider, decode: nil, shouldInterpolate: true, intent: .defaultIntent) {
    let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: .right)
    // ...
    // Add the image to captureImageView here...
}
Step 4: Save the image
Based on your need, either save the image to the photos gallery or show it in an image view.
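As a minimal sketch of the save path, assuming `image` is the UIImage produced in Step 3 and that your Info.plist contains the NSPhotoLibraryUsageDescription key (the helper name here is hypothetical):

```swift
import UIKit

// Hypothetical helper: write the captured image to the user's photo library.
func saveToGallery(_ image: UIImage) {
    // Passing nil for the target, selector, and context skips the
    // completion callback; use the callback variant if you need to
    // know whether the save succeeded.
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
}
```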
For more details, check out the Create custom camera view guide under Snap a Photo.
That concludes this article on being unable to use AVCapturePhotoOutput to capture a photo in Swift + Xcode. We hope the recommended answer helps, and thank you for supporting IT屋!