AVAssetExportSession wrong orientation in front camera


Question

I'm encountering the wrong orientation in video exported using AVAssetExportSession, but only with the front camera. I followed this answer: https://stackoverflow.com/a/35368649/3764365, yet I still end up with this scenario. I don't think it is purely an orientation problem, since the image is cut in half. I tried changing the video layer and the render layer, but had no luck. My code looks like this:

        let composition = AVMutableComposition()
        let vidAsset = AVURLAsset(url: path)

        // get video track
        let vtrack = vidAsset.tracks(withMediaType: AVMediaTypeVideo)

        let videoTrack:AVAssetTrack = vtrack[0]
        _ = videoTrack.timeRange.duration
        let vid_timerange = CMTimeRangeMake(kCMTimeZero, vidAsset.duration)

        var _: NSError?
        let compositionvideoTrack:AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID())

        do {
            try compositionvideoTrack.insertTimeRange(vid_timerange, of: videoTrack, at: kCMTimeZero)

        } catch let error {
            print(error.localizedDescription)
        }

        let compositionVideoTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)

        let audioTrack = vidAsset.tracks(withMediaType: AVMediaTypeAudio)[0]

        do {
            try compositionVideoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, vidAsset.duration), of: audioTrack, at: kCMTimeZero)

        } catch {
            print("error")
        }

        let size = videoTrack.naturalSize


        let parentlayer = CALayer()

        parentlayer.frame = CGRect(x: 0, y: 0, width: size.height, height: size.width)
        let videolayer = CALayer()
        videolayer.frame = CGRect(x: 0, y: 0, width: size.height, height: size.width)
        parentlayer.addSublayer(videolayer)

        let layercomposition = AVMutableVideoComposition()
        layercomposition.frameDuration = CMTimeMake(1, 30)
        layercomposition.renderSize = CGSize(width: size.height, height: size.width)

        layercomposition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videolayer, in: parentlayer)

        // instruction for watermark
        let instruction = AVMutableVideoCompositionInstruction()
        instruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration)

        let videotrack = composition.tracks(withMediaType: AVMediaTypeVideo)[0] as AVAssetTrack
        let layerinstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videotrack)

        instruction.layerInstructions = [layerinstruction]
        layercomposition.instructions = [instruction]

        layerinstruction.setTransform(videoTrack.preferredTransform, at: kCMTimeZero)

        //  create new file to receive data
        let movieDestinationUrl = UIImage.outPut()

        // use AVAssetExportSession to export video
        let assetExport = AVAssetExportSession(asset: composition, presetName: AVAssetExportPreset1280x720)!
        assetExport.videoComposition = layercomposition
        assetExport.outputFileType = AVFileTypeQuickTimeMovie
        assetExport.outputURL = movieDestinationUrl
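Before applying any fix, it can help to log the transform AVFoundation actually recorded on the track: front-camera portrait clips typically carry a different `preferredTransform` than back-camera clips, which is why only front-camera exports come out wrong. A minimal diagnostic sketch, using the same Swift 3 APIs as the code above (`videoURL` is a placeholder for your clip's URL):

```swift
import AVFoundation

// Print the transform and natural size AVFoundation stored for the clip.
// videoURL is a placeholder; substitute the URL of your recorded video.
let asset = AVURLAsset(url: videoURL)
if let track = asset.tracks(withMediaType: AVMediaTypeVideo).first {
    let t = track.preferredTransform
    print("transform: a=\(t.a) b=\(t.b) c=\(t.c) d=\(t.d) tx=\(t.tx) ty=\(t.ty)")
    print("naturalSize: \(track.naturalSize)")
}
```

Comparing this output for a front-camera and a back-camera recording usually makes the difference visible immediately.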


Answer

I will share the code for how I solved this issue.

func addImagesToVideo(path: URL, labelImageViews: [LabelImageView]) {

        SVProgressHUD.show()

        let composition = AVMutableComposition()
        let vidAsset = AVURLAsset(url: path)

        // get video track
        let vtrack = vidAsset.tracks(withMediaType: AVMediaTypeVideo)

        let videoTrack:AVAssetTrack = vtrack[0]
        _ = videoTrack.timeRange.duration
        let vid_timerange = CMTimeRangeMake(kCMTimeZero, vidAsset.duration)

        var _: NSError?
        let compositionvideoTrack:AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID())

        do {
            try compositionvideoTrack.insertTimeRange(vid_timerange, of: videoTrack, at: kCMTimeZero)

        } catch let error {
            print(error.localizedDescription)
        }

        let compositionVideoTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)

        let audioTrack = vidAsset.tracks(withMediaType: AVMediaTypeAudio)[0]

        do {
            try compositionVideoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, vidAsset.duration), of: audioTrack, at: kCMTimeZero)

        } catch {
            print("error")
        }

        let size = videoTrack.naturalSize


        let parentlayer = CALayer()

        parentlayer.frame = CGRect(x: 0, y: 0, width: size.height, height: size.width)
        let videolayer = CALayer()
        videolayer.frame = CGRect(x: 0, y: 0, width: size.height, height: size.width)
        parentlayer.addSublayer(videolayer)

        if labelImageViews.count != 0 {
            let blankImage = self.clearImage(size: videolayer.frame.size)
            let image = self.saveImage(imageOne: blankImage, labelImageViews: labelImageViews)

            let imglayer = CALayer()
            imglayer.contents = image.cgImage
            imglayer.frame = CGRect(origin: CGPoint.zero, size: videolayer.frame.size)
            imglayer.opacity = 1
            parentlayer.addSublayer(imglayer)
        }


        let layercomposition = AVMutableVideoComposition()
        layercomposition.frameDuration = CMTimeMake(1, 30)
        layercomposition.renderSize = CGSize(width: size.height, height: size.width)

        layercomposition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videolayer, in: parentlayer)

        // instruction for watermark
        let instruction = AVMutableVideoCompositionInstruction()
        instruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration)

        let videotrack = composition.tracks(withMediaType: AVMediaTypeVideo)[0] as AVAssetTrack
        let layerinstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videotrack)

        instruction.layerInstructions = [layerinstruction]
        layercomposition.instructions = [instruction]

        var isVideoAssetPortrait = false

        let videoTransform = videoTrack.preferredTransform

        if(videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {

            isVideoAssetPortrait = true

        }

        if(videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
            isVideoAssetPortrait = true
        }


        if isVideoAssetPortrait {
            let FirstAssetScaleFactor = CGAffineTransform(scaleX: 1, y: 1)

            layerinstruction.setTransform(videoTrack.preferredTransform.concatenating(FirstAssetScaleFactor), at: kCMTimeZero)
        } else {
            let FirstAssetScaleFactor = CGAffineTransform(scaleX: 1, y: 1)

            layerinstruction.setTransform(videoTrack.preferredTransform.concatenating(FirstAssetScaleFactor).concatenating(CGAffineTransform(translationX: 0, y: 560)), at: kCMTimeZero)
        }


        //  create new file to receive data
        let movieDestinationUrl = UIImage.outPut()

        // use AVAssetExportSession to export video
        let assetExport = AVAssetExportSession(asset: composition, presetName: AVAssetExportPreset1280x720)!
        assetExport.videoComposition = layercomposition
        assetExport.outputFileType = AVFileTypeQuickTimeMovie
        assetExport.outputURL = movieDestinationUrl

        assetExport.exportAsynchronously(completionHandler: {
            switch assetExport.status{
            case  AVAssetExportSessionStatus.failed:
                print("failed \(assetExport.error!)")
            case AVAssetExportSessionStatus.cancelled:
                print("cancelled \(assetExport.error!)")
            default:
                print("Movie complete")


                // play video
                OperationQueue.main.addOperation({ () -> Void in

                    let output = UIImage.outPut()
                    UIImage.compress(inputURL: movieDestinationUrl as NSURL, outputURL: output as NSURL) {

                        UISaveVideoAtPathToSavedPhotosAlbum(output.relativePath, nil, nil, nil)

                        print("Done Converting")

                        DispatchQueue.main.async {
                            SVProgressHUD.dismiss()
                        }
                    }

                })
            }
        })
    }
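The essential difference from the question's code is the check performed before `setTransform`: the answer inspects the components of `preferredTransform` and, for footage that is not portrait, concatenates a vertical translation so the frame is not cut off. That check could be factored into a small helper; this is a sketch under the same Swift 3 APIs, and note the 560-point offset is specific to this particular render size, not a general constant:

```swift
import AVFoundation

// Sketch: detect a portrait recording from its preferred transform.
// a=0, b=1,  c=-1, d=0  -> rotated 90°  (typical back-camera portrait)
// a=0, b=-1, c=1,  d=0  -> rotated -90° (typical front-camera portrait)
func isPortrait(_ t: CGAffineTransform) -> Bool {
    return (t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0)
        || (t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0)
}

// Usage inside the layer-instruction setup from the answer above:
// if isPortrait(videoTrack.preferredTransform) {
//     layerinstruction.setTransform(videoTrack.preferredTransform, at: kCMTimeZero)
// } else {
//     layerinstruction.setTransform(videoTrack.preferredTransform
//         .concatenating(CGAffineTransform(translationX: 0, y: 560)), at: kCMTimeZero)
// }
```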
