Capturing Video with AVFoundation

Problem Description

I've been looking around on Stack and I have found similar questions to this, but none have worked for me. I am a complete novice to Swift 3.0. Essentially what I'm trying to do is record a video using AVFoundation. So far I have managed to capture a still image, and this is the code I have so far:

func beginSession() {
    do {
        let deviceInput = try  AVCaptureDeviceInput(device: captureDevice) as AVCaptureDeviceInput
        if captureSession.inputs.isEmpty {
            self.captureSession.addInput(deviceInput)
        }
        stillImageOutput.outputSettings = [AVVideoCodecKey:AVVideoCodecJPEG]

        if captureSession.canAddOutput(stillImageOutput) {
            captureSession.addOutput(stillImageOutput)
        }

    }
    catch {
        print("error: \(error.localizedDescription)")
    }

    guard let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession) else {
        print("no preview layer")
        return
    }

    self.view.layer.addSublayer(previewLayer)
    previewLayer.frame = self.view.layer.frame
    captureSession.startRunning()

    // Subviews
    self.view.addSubview(imgOverlay)
    self.view.addSubview(blur)
    self.view.addSubview(label)
    self.view.addSubview(Flip)
    self.view.addSubview(btnCapture)
}

 // SAVE PHOTO
func saveToCamera() {
    if let videoConnection = stillImageOutput.connection(withMediaType: AVMediaTypeVideo) {
        stillImageOutput.captureStillImageAsynchronously(from: videoConnection, completionHandler: { (CMSampleBuffer, Error) in
            if let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(CMSampleBuffer) {
                if let cameraImage = UIImage(data: imageData) {
                    self.flippedImage = UIImage(cgImage: cameraImage.cgImage!, scale: cameraImage.scale, orientation: UIImageOrientation.rightMirrored)
                    UIImageWriteToSavedPhotosAlbum(self.flippedImage, nil, nil, nil)

                }
            }
        })
    }

}


Recommended Answer

I am going to make it easy for you by posting the entire code you need to make a video recorder in AVFoundation. This code should work if you simply copy and paste it as is. The only thing you need to remember is that you need to connect the camPreview outlet to a UIView in the ViewController in Storyboard. This UIView should take up the entire contents of the screen. I will follow up with an explanation of the code so you can do your own investigating and modify the video recorder to fit your app's needs. You will also need to make sure you attach the relevant privacy permissions to Info.plist, which are Privacy - Microphone Usage Description and Privacy - Camera Usage Description, or else you will only see a black screen.
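On top of the Info.plist entries, it can also help to check the authorization status at runtime before configuring the session. Here is a minimal sketch using the Swift 3 AVFoundation APIs; the function name `requestCameraPermission` is just illustrative, not part of the answer's code:

```swift
import AVFoundation

// Ask for camera access before configuring the capture session.
// requestAccess may invoke its completion handler on an arbitrary
// queue, so hop back to main before touching any UI.
func requestCameraPermission(_ completion: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(forMediaType: AVMediaTypeVideo) {
    case .authorized:
        completion(true)
    case .notDetermined:
        AVCaptureDevice.requestAccess(forMediaType: AVMediaTypeVideo) { granted in
            DispatchQueue.main.async { completion(granted) }
        }
    default: // .denied or .restricted
        completion(false)
    }
}
```

You would call this (and the microphone equivalent with AVMediaTypeAudio) before setupSession(), and only start the session when access is granted.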

NOTE: Right at the bottom, I've added how to play the recorded video under the title "Playing the Recorded Video".

EDIT - I forgot two things which made it crash during recording but I have added them now.

import UIKit
import AVFoundation

class ViewController: UIViewController, AVCaptureFileOutputRecordingDelegate {

@IBOutlet weak var camPreview: UIView!

let cameraButton = UIView()

let captureSession = AVCaptureSession()

let movieOutput = AVCaptureMovieFileOutput()

var previewLayer: AVCaptureVideoPreviewLayer!

var activeInput: AVCaptureDeviceInput!

var outputURL: URL!

override func viewDidLoad() {
    super.viewDidLoad()

    if setupSession() {
        setupPreview()
        startSession()
    }

    cameraButton.isUserInteractionEnabled = true

    let cameraButtonRecognizer = UITapGestureRecognizer(target: self, action: #selector(ViewController.startCapture))

    cameraButton.addGestureRecognizer(cameraButtonRecognizer)

    cameraButton.frame = CGRect(x: 0, y: 0, width: 100, height: 100)

    cameraButton.backgroundColor = UIColor.red

    camPreview.addSubview(cameraButton)

}

func setupPreview() {
    // Configure previewLayer
    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.frame = camPreview.bounds
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
    camPreview.layer.addSublayer(previewLayer)
}

//MARK:- Setup Camera

func setupSession() -> Bool {

    captureSession.sessionPreset = AVCaptureSessionPresetHigh

    // Setup Camera
    let camera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)

    do {
        let input = try AVCaptureDeviceInput(device: camera)
        if captureSession.canAddInput(input) {
            captureSession.addInput(input)
            activeInput = input
        }
    } catch {
        print("Error setting device video input: \(error)")
        return false
    }

    // Setup Microphone
    let microphone = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeAudio)

    do {
        let micInput = try AVCaptureDeviceInput(device: microphone)
        if captureSession.canAddInput(micInput) {
            captureSession.addInput(micInput)
        }
    } catch {
        print("Error setting device audio input: \(error)")
        return false
    }


    // Movie output
    if captureSession.canAddOutput(movieOutput) {
        captureSession.addOutput(movieOutput)
    }

    return true
}

func setupCaptureMode(_ mode: Int) {
        // Video Mode

}

//MARK:- Camera Session
func startSession() {


    if !captureSession.isRunning {
        videoQueue().async {
            self.captureSession.startRunning()
        }
    }
}

func stopSession() {
    if captureSession.isRunning {
        videoQueue().async {
            self.captureSession.stopRunning()
        }
    }
}

func videoQueue() -> DispatchQueue {
    // Note: startRunning() is a blocking call; a background queue would
    // avoid stalling the UI, but main works for this example.
    return DispatchQueue.main
}



func currentVideoOrientation() -> AVCaptureVideoOrientation {
    var orientation: AVCaptureVideoOrientation

    switch UIDevice.current.orientation {
    case .portrait:
        orientation = AVCaptureVideoOrientation.portrait
    case .landscapeRight:
        orientation = AVCaptureVideoOrientation.landscapeLeft
    case .portraitUpsideDown:
        orientation = AVCaptureVideoOrientation.portraitUpsideDown
    default:
        orientation = AVCaptureVideoOrientation.landscapeRight
    }

    return orientation
}

func startCapture() {

    startRecording()

}

//EDIT 1: I FORGOT THIS AT FIRST

func tempURL() -> URL? {
    let directory = NSTemporaryDirectory() as NSString

    if directory != "" {
        let path = directory.appendingPathComponent(NSUUID().uuidString + ".mp4")
        return URL(fileURLWithPath: path)
    }

    return nil
}


func startRecording() {

    if movieOutput.isRecording == false {

        let connection = movieOutput.connection(withMediaType: AVMediaTypeVideo)
        if (connection?.isVideoOrientationSupported)! {
            connection?.videoOrientation = currentVideoOrientation()
        }

        if (connection?.isVideoStabilizationSupported)! {
            connection?.preferredVideoStabilizationMode = AVCaptureVideoStabilizationMode.auto
        }

        let device = activeInput.device
        if (device?.isSmoothAutoFocusSupported)! {
            do {
                try device?.lockForConfiguration()
                device?.isSmoothAutoFocusEnabled = false
                device?.unlockForConfiguration()
            } catch {
                print("Error setting configuration: \(error)")
            }

        }

        //EDIT2: And I forgot this
        outputURL = tempURL()
        movieOutput.startRecording(toOutputFileURL: outputURL, recordingDelegate: self)

    }
    else {
        stopRecording()
    }

}

func stopRecording() {

    if movieOutput.isRecording == true {
        movieOutput.stopRecording()
    }
}

func capture(_ captureOutput: AVCaptureFileOutput!, didStartRecordingToOutputFileAt fileURL: URL!, fromConnections connections: [Any]!) {

}

func capture(_ captureOutput: AVCaptureFileOutput!, didFinishRecordingToOutputFileAt outputFileURL: URL!, fromConnections connections: [Any]!, error: Error!) {
    if (error != nil) {
        print("Error recording movie: \(error!.localizedDescription)")
    } else {

        _ = outputURL as URL

    }
    outputURL = nil
}



}

This is how you should set up your view controller:

Your Info.plist permissions:

Setting Up the Recording Delegate

You need to conform to AVCaptureFileOutputRecordingDelegate. According to Apple docs, it defines an interface for delegates of AVCaptureFileOutput to respond to events that occur in the process of recording a single file. It comes with two methods you need to implement and these are the last two methods at the bottom of the code. The first is,

func capture(_ captureOutput: AVCaptureFileOutput!, didStartRecordingToOutputFileAt fileURL: URL!, fromConnections connections: [Any]!) {
}

You can add any logic to this when the video starts recording. In the code example I have given, the video starts recording when you tap the red square button in the left hand corner. The second is,

func capture(_ captureOutput: AVCaptureFileOutput!, didFinishRecordingToOutputFileAt outputFileURL: URL!, fromConnections connections: [Any]!, error: Error!) {
    if (error != nil) {
        print("Error recording movie: \(error!.localizedDescription)")
    } else {

        _ = outputURL as URL

    }
    outputURL = nil
}

This is called when the video has finished recording. In the code example I have given, the video stops recording after you tap the red square button a second time. When the video has stopped recording, you get an output file URL. This represents your video. You can use this to perhaps segue to another View Controller to play the video in an AVPlayer. Or you can save it. In this example you will notice I have not done much with the output URL though.
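If you would rather save the recording than play it back, one option (a sketch, not part of the answer's code; the helper name `saveVideoToAlbum` is made up here) is to write it to the photo library with UIKit's convenience function:

```swift
import UIKit

// Save the finished recording to the user's Camera Roll.
// outputFileURL is the URL handed to the delegate callback.
func saveVideoToAlbum(_ outputFileURL: URL) {
    let path = outputFileURL.path
    if UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(path) {
        // Passing nil for target/selector means we get no completion callback.
        UISaveVideoAtPathToSavedPhotosAlbum(path, nil, nil, nil)
    }
}
```

Note that saving to the photo library needs its own Info.plist entry (Privacy - Photo Library Usage Description) on top of the camera and microphone ones.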

To start recording a video I have used a programmatically created button which appears as a red square in the left hand corner and responds to a UITapGesture. You can make a better button in your app.

Setting Up the Session

The video recorder needs a capture session, which I have set up in setupSession(). Here you add the AVCapture input devices, which include the camera and the microphone. According to Apple, AVCaptureDeviceInput is a concrete sub-class of AVCaptureInput you use to capture data from an AVCaptureDevice object. However, the user needs to grant you access to use these, so in your Info.plist you should add Privacy - Microphone Usage Description and Privacy - Camera Usage Description and give a reason why you want to use the video recorder and microphone. If you do not do this, you will only get a black screen. The session preset is a constant value indicating the quality level or bitrate of the output. I have set this to high, but there are other options you can explore. The movieOutput is of type AVCaptureMovieFileOutput, which according to Apple is a concrete sub-class of AVCaptureFileOutput you use to capture data to a QuickTime movie. This is what actually allows you to record and save the video.
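The two Info.plist entries mentioned above correspond to the NSCameraUsageDescription and NSMicrophoneUsageDescription keys. In the plist source they look like this (the description strings here are just examples; write your own):

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to record video.</string>
<key>NSMicrophoneUsageDescription</key>
<string>This app uses the microphone to record audio for your videos.</string>
```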

Setting Up the Preview

This is where you set up the camera preview layer, which is done in setupPreview(). You configure the preview layer with the capture session you created, using AVCaptureVideoPreviewLayer(session: captureSession).

Starting the Session

The final step is to start the session which is done in startSession(). You check if a session is already running and if it is not then you start one.

if !captureSession.isRunning {
    videoQueue().async {
        self.captureSession.startRunning()
    }
}

Starting the Recording

When you tap the red button, the startRecording() method is called. Here I have added methods to handle video orientation and video stabilization. Finally, we see the movieOutput variable again, which we set up earlier with our session. We call it to record our movie to outputURL and tell it that the delegate methods handling the start and end of recording are in the same view controller (those last two methods).

Stopping the Recording

It just so happens that when you tap the red button again, startRecording is called again, but it will notice that something is being recorded and call stopRecording.

Playing the Recorded Video

I'm being generous today so I'll throw this in too.

Create a new view controller and call it VideoPlayback. Connect it with your first ViewController using a segue in Storyboard. Give the segue an identifier of "showVideo". Create a UIView that fills up the VideoPlayback screen, and create an outlet to its view controller called videoView. Add the following code to your new VideoPlayback view controller:

import UIKit
import AVFoundation

class VideoPlayback: UIViewController {

    let avPlayer = AVPlayer()
    var avPlayerLayer: AVPlayerLayer!

    var videoURL: URL!
    //connect this to your uiview in storyboard
    @IBOutlet weak var videoView: UIView!

    override func viewDidLoad() {
        super.viewDidLoad()

        avPlayerLayer = AVPlayerLayer(player: avPlayer)
        avPlayerLayer.frame = view.bounds
        avPlayerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
        videoView.layer.insertSublayer(avPlayerLayer, at: 0)

        view.layoutIfNeeded()

        let playerItem = AVPlayerItem(url: videoURL as URL)
        avPlayer.replaceCurrentItem(with: playerItem)

        avPlayer.play()
    }
}

Now go back to your last delegate method and modify it as follows:

func capture(_ captureOutput: AVCaptureFileOutput!, didFinishRecordingToOutputFileAt outputFileURL: URL!, fromConnections connections: [Any]!, error: Error!) {

    if (error != nil) {
        print("Error recording movie: \(error!.localizedDescription)")
    } else {

        let videoRecorded = outputURL! as URL

        performSegue(withIdentifier: "showVideo", sender: videoRecorded)
    }
}

Finally, create a prepare(for:sender:) method that will initialize the videoURL that will play with the AVPlayer.

override func prepare(for segue: UIStoryboardSegue, sender: Any?) {

    let vc = segue.destination as! VideoPlayback
    vc.videoURL = sender as! URL
}

Now to test, go back and start recording a video. On the second tap of the red square, the segue will be performed and you will see the recorded video being played back automatically.
