Custom Camera and Crop Image in Swift


Problem Description

I have created a custom camera and implemented the code below to crop the captured image. I show guides in the preview layer, so I want to crop the part of the image that appears inside that area.

// Declared in an extension on UIImage.
func imageByCropToRect(rect: CGRect, scale: Bool) -> UIImage {
    var rect = rect
    var scaleFactor: CGFloat = 1.0
    if scale {
        // Convert the rect from points to pixels using the image's scale factor.
        scaleFactor = self.scale
        rect.origin.x *= scaleFactor
        rect.origin.y *= scaleFactor
        rect.size.width *= scaleFactor
        rect.size.height *= scaleFactor
    }

    var image: UIImage? = nil
    if rect.size.width > 0 && rect.size.height > 0 {
        let imageRef = self.cgImage!.cropping(to: rect)
        image = UIImage(cgImage: imageRef!, scale: scaleFactor, orientation: self.imageOrientation)
    }

    return image!
}

This code works fine and gives an exact crop as long as the line below is commented out. However, I want the camera feed to fill the screen, so I have to keep that line, and then the resulting image comes out looking zoomed out.

(self.previewLayer as! AVCaptureVideoPreviewLayer).videoGravity = AVLayerVideoGravity.resizeAspectFill
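For reference, resizeAspectFill scales the captured frame up until it covers the preview and crops the overflow evenly around the center, so a rect taken straight from the view no longer lines up with the captured image. Below is a minimal sketch of that mapping (imageRect(forPreviewRect:previewSize:imageSize:) is a hypothetical helper, not part of the original code), assuming the preview and the captured image share the same orientation:

// Hypothetical helper (not from the original post): maps a rect expressed in
// the preview layer's coordinate space into the captured image's coordinate
// space when the preview layer uses .resizeAspectFill.
func imageRect(forPreviewRect previewRect: CGRect,
               previewSize: CGSize,
               imageSize: CGSize) -> CGRect {
    // Aspect-fill scales the image so it covers the preview, then centers it.
    let scale = max(previewSize.width / imageSize.width,
                    previewSize.height / imageSize.height)
    let scaledImageSize = CGSize(width: imageSize.width * scale,
                                 height: imageSize.height * scale)
    // How much of the scaled image hangs outside the preview on each axis.
    let xOffset = (scaledImageSize.width - previewSize.width) / 2
    let yOffset = (scaledImageSize.height - previewSize.height) / 2
    // Shift into the scaled image's space, then undo the scale to get back
    // into the image's own coordinates.
    return CGRect(x: (previewRect.origin.x + xOffset) / scale,
                  y: (previewRect.origin.y + yOffset) / scale,
                  width: previewRect.width / scale,
                  height: previewRect.height / scale)
}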

How do I solve this issue? Is the cropping code wrong?

Here is the full class code:

import UIKit
import AVFoundation

class CameraViewController: UIViewController {

    @IBOutlet weak var guideImageView: UIImageView!
    @IBOutlet weak var guidesView: UIView!
    @IBOutlet weak var cameraPreviewView: UIView!
    @IBOutlet weak var cameraButtonView: UIView!

    @IBOutlet weak var captureButton: UIButton!

    var captureSession = AVCaptureSession()
    var previewLayer: CALayer!
    var captureDevice: AVCaptureDevice!

    /// This will be true when the user taps the capture button.
    var takePhoto = false

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        captureSession = AVCaptureSession()
        previewLayer = CALayer()
        takePhoto = false

        requestAuthorization()
    }

    private func userinteractionToButton(_ interaction: Bool) {
        captureButton.isEnabled = interaction
    }

    /// This function will request authorization, If authorized then start the camera.
    private func requestAuthorization() {
        switch AVCaptureDevice.authorizationStatus(for: AVMediaType.video) {
        case .authorized:
            prepareCamera()

        case .denied, .restricted, .notDetermined:
            AVCaptureDevice.requestAccess(for: AVMediaType.video, completionHandler: { (granted) in
                if !Thread.isMainThread {
                    DispatchQueue.main.async {
                        if granted {
                            self.prepareCamera()
                        } else {
                            let alert = UIAlertController(title: "unable_to_access_the_Camera", message: "to_enable_access_go_to_setting_privacy_camera_and_turn_on_camera_access_for_this_app", preferredStyle: UIAlertControllerStyle.alert)
                            alert.addAction(UIAlertAction(title: "ok", style: .default, handler: {_ in
                                self.navigationController?.popToRootViewController(animated: true)
                            }))
                            self.present(alert, animated: true, completion: nil)
                        }
                    }
                } else {
                    if granted {
                        self.prepareCamera()
                    } else {
                        let alert = UIAlertController(title: "unable_to_access_the_Camera", message: "to_enable_access_go_to_setting_privacy_camera_and_turn_on_camera_access_for_this_app", preferredStyle: UIAlertControllerStyle.alert)
                        alert.addAction(UIAlertAction(title: "ok", style: .default, handler: {_ in
                            self.navigationController?.popToRootViewController(animated: true)
                        }))
                        self.present(alert, animated: true, completion: nil)
                    }
                }
            })
        }
    }

    /// Checks whether the primary camera is available; if found, calls the method that assigns the available device to the AVCaptureDevice.
    private func prepareCamera() {
        // Uses the photo quality preset for the session.
        self.captureSession.sessionPreset = AVCaptureSession.Preset.photo

        if #available(iOS 10.0, *) {
            let availableDevices = AVCaptureDevice.DiscoverySession(deviceTypes: [AVCaptureDevice.DeviceType.builtInWideAngleCamera], mediaType: AVMediaType.video, position: .back).devices
            self.assignCamera(availableDevices)
        } else {
            // Fallback on earlier versions
            // development, need to test this on iOS 8
            if let availableDevices = AVCaptureDevice.default(for: AVMediaType.video) {
                self.assignCamera([availableDevices])
            } else {
                self.showAlert()
            }
        }
    }

    /// Assigns the AVCaptureDevice to the corresponding variable and begins the session.
    ///
    /// - Parameter availableDevices: [AVCaptureDevice]
    private func assignCamera(_ availableDevices: [AVCaptureDevice]) {
        if availableDevices.first != nil {
            captureDevice = availableDevices.first
            beginSession()
        } else {
            self.showAlert()
        }
    }

    /// Configures the camera settings and begins the session; this function is responsible for showing the camera feed in the UI.
    private func beginSession() {
        do {
            let captureDeviceInput = try AVCaptureDeviceInput(device: captureDevice)
            captureSession.addInput(captureDeviceInput)
        } catch {
            print(error.localizedDescription)
        }

        let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        self.previewLayer = previewLayer
        self.cameraPreviewView.layer.addSublayer(self.previewLayer)
        self.previewLayer.frame = self.view.layer.frame
        self.previewLayer.frame.origin.y = +self.cameraPreviewView.frame.origin.y
        (self.previewLayer as! AVCaptureVideoPreviewLayer).videoGravity = AVLayerVideoGravity.resizeAspectFill
        self.previewLayer.masksToBounds = true
        self.cameraPreviewView.clipsToBounds = true
        captureSession.startRunning()

        self.view.bringSubview(toFront: self.cameraPreviewView)
        self.view.bringSubview(toFront: self.cameraButtonView)
        self.view.bringSubview(toFront: self.guidesView)

        let dataOutput = AVCaptureVideoDataOutput()
        dataOutput.videoSettings = [((kCVPixelBufferPixelFormatTypeKey as NSString) as String):NSNumber(value:kCVPixelFormatType_32BGRA)]

        dataOutput.alwaysDiscardsLateVideoFrames = true

        if captureSession.canAddOutput(dataOutput) {
            captureSession.addOutput(dataOutput)
        }

        captureSession.commitConfiguration()

        let queue = DispatchQueue(label: "com.letsappit.camera")
        dataOutput.setSampleBufferDelegate(self, queue: queue)

        self.userinteractionToButton(true)
    }


    /// Get the UIImage from the given CMSampleBuffer.
    ///
    /// - Parameter buffer: CMSampleBuffer
    /// - Returns: UIImage?
    func getImageFromSampleBuffer(buffer:CMSampleBuffer, orientation: UIImageOrientation) -> UIImage? {
        if let pixelBuffer = CMSampleBufferGetImageBuffer(buffer) {
            let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
            let context = CIContext()
            let imageRect = CGRect(x: 0, y: 0, width: CVPixelBufferGetWidth(pixelBuffer), height: CVPixelBufferGetHeight(pixelBuffer))

            if let image = context.createCGImage(ciImage, from: imageRect) {
                return UIImage(cgImage: image, scale: UIScreen.main.scale, orientation: orientation)

            }

        }
        return nil
    }

    /// This function will destroy the capture session.
    func stopCaptureSession() {
        self.captureSession.stopRunning()

        if let inputs = captureSession.inputs as? [AVCaptureDeviceInput] {
            for input in inputs {
                self.captureSession.removeInput(input)
            }
        }
    }

    func showAlert() {
        let alert = UIAlertController(title: "Unable to access the camera", message: "It appears that either your device doesn't have camera or its broken", preferredStyle: .alert)
        alert.addAction(UIAlertAction(title: "cancel", style: .cancel, handler: {_ in
            self.navigationController?.dismiss(animated: true, completion: nil)
        }))
        self.present(alert, animated: true, completion: nil)
    }

    @IBAction func didTapClick(_ sender: Any) {
        userinteractionToButton(false)
        takePhoto = true
    }

    override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
        if segue.identifier == "showImage" {
            let vc = segue.destination as! ShowImageViewController
            vc.image = sender as! UIImage
        }
    }
}

extension CameraViewController: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ captureOutput: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {

        if connection.isVideoOrientationSupported {
            connection.videoOrientation = .portrait
        }

        if takePhoto {
            takePhoto = false

            // Device orientation must be unlocked for this to work.
            var orientation = UIImageOrientation.up
            switch UIDevice.current.orientation {
            case .landscapeLeft:
                orientation = .left

            case .landscapeRight:
                orientation = .right

            case .portraitUpsideDown:
                orientation = .down

            default:
                orientation = .up
            }

            if let image = self.getImageFromSampleBuffer(buffer: sampleBuffer, orientation: orientation) {
                DispatchQueue.main.async {
                    let newImage = image.imageByCropToRect(rect: self.guideImageView.frame, scale: true)
                    self.stopCaptureSession()
                    self.previewLayer.removeFromSuperlayer()
                    self.performSegue(withIdentifier: "showImage", sender: newImage)
                }
            }
        }
    }
}

Here is the view hierarchy image

Recommended Answer

To fix the cropped image coming out zoomed out, change your crop function to the following, which takes the image orientation into account.

// Declared in an extension on UIImage.
/// Crops the image to `rect`, taking the image's orientation and scale into account.
func croppedInRect(rect: CGRect) -> UIImage? {
    func rad(_ degree: Double) -> CGFloat {
        return CGFloat(degree / 180.0 * .pi)
    }

    // Build a transform that maps the crop rect from the upright image's
    // coordinate space into the underlying CGImage's coordinate space.
    var rectTransform: CGAffineTransform
    switch imageOrientation {
    case .left:
        rectTransform = CGAffineTransform(rotationAngle: rad(90)).translatedBy(x: 0, y: -self.size.height)
    case .right:
        rectTransform = CGAffineTransform(rotationAngle: rad(-90)).translatedBy(x: -self.size.width, y: 0)
    case .down:
        rectTransform = CGAffineTransform(rotationAngle: rad(-180)).translatedBy(x: -self.size.width, y: -self.size.height)
    default:
        rectTransform = .identity
    }
    // Account for the image's scale factor (points vs. pixels).
    rectTransform = rectTransform.scaledBy(x: self.scale, y: self.scale)

    var cgImage = self.cgImage

    // A UIImage backed only by a CIImage has no CGImage, so render one.
    if cgImage == nil {
        let ciContext = CIContext()
        if let ciImage = self.ciImage {
            cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent)
        }
    }

    if let imageRef = cgImage?.cropping(to: rect.applying(rectTransform)) {
        return UIImage(cgImage: imageRef, scale: self.scale, orientation: self.imageOrientation)
    }

    return nil
}
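
Since croppedInRect(rect:) returns an optional, the call site in captureOutput(_:didOutput:from:) changes slightly. A minimal sketch of how that might look, assuming the rect passed in already corresponds to the guide area expressed in the image's coordinate space:

// Sketch only: replaces the original imageByCropToRect(rect:scale:) call.
if let image = self.getImageFromSampleBuffer(buffer: sampleBuffer, orientation: orientation) {
    DispatchQueue.main.async {
        // guideImageView.frame is used here as in the original code; it only
        // matches the captured image if the preview and image line up 1:1.
        let cropRect = self.guideImageView.frame
        if let newImage = image.croppedInRect(rect: cropRect) {
            self.stopCaptureSession()
            self.previewLayer.removeFromSuperlayer()
            self.performSegue(withIdentifier: "showImage", sender: newImage)
        }
    }
}

The rectTransform handles what the original imageByCropToRect(rect:scale:) left out: it rotates and translates the crop rect to match how the pixel data is actually stored for each imageOrientation, before the scale factor is applied and the CGImage is cropped.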
