AVMutableComposition Video Black at Start


Problem Description

I'm using AVMutableComposition and AVAssetExportSession to trim a video down. Randomly, and I mean randomly (I cannot reproduce it consistently), users' videos have a few black frames at the start of the trimmed video. The audio is unaffected. I can confirm 100% that the source videos themselves aren't the cause, as this happens with a wide variety of videos from all different sources.

Any insight into why these videos are being exported with black frames at the start would be very welcome. Thanks!

Some relevant code (sorry for the length):

// AVURLAssetPreferPreciseDurationAndTimingKey added in attempt to solve issue
let videoAsset = AVURLAsset(URL: url, options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
var mixComposition = AVMutableComposition()

let compositionVideoTrack = mixComposition.addMutableTrackWithMediaType(
    AVMediaTypeVideo,
    preferredTrackID: Int32(kCMPersistentTrackID_Invalid)
)
let clipVideoTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0] as! AVAssetTrack
let videoSize = clipVideoTrack.naturalSize
// startTime and duration are NSTimeInterval types
let start = startTime == 0 ? kCMTimeZero : CMTimeMakeWithSeconds(startTime, videoAsset.duration.timescale)
var dur = CMTimeMakeWithSeconds(duration, videoAsset.duration.timescale)
// Clamp the requested duration so it never exceeds the asset's length
if dur.value >= videoAsset.duration.value {
    dur = videoAsset.duration
}
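// Insert the selected range of the source video track at the start of the composition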
compositionVideoTrack.insertTimeRange(
    CMTimeRange(start: start, duration: dur),
    ofTrack:clipVideoTrack,
    atTime: kCMTimeZero,
    error:nil
)

// Copy the source track's preferredTransform (orientation metadata) onto the composition track
compositionVideoTrack.preferredTransform = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0].preferredTransform

let compositionAudioTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
let clipAudioTrack = videoAsset.tracksWithMediaType(AVMediaTypeAudio)[0] as! AVAssetTrack
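// Insert the same time range from the audio track so the soundtrack stays in sync with the trimmed video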
compositionAudioTrack.insertTimeRange(
    CMTimeRange(start: start, duration: dur),
    ofTrack: clipAudioTrack,
    atTime: kCMTimeZero,
    error: nil
)

// Layer tree for AVVideoCompositionCoreAnimationTool: composed video frames are rendered into videoLayer, which sits inside parentLayer
let parentLayer = CALayer()
parentLayer.backgroundColor = UIColor.blackColor().CGColor
let videoLayer = CALayer()
videoLayer.backgroundColor = UIColor.blackColor().CGColor
var parentFrame = CGRect(
    x: 0,
    y: 0,
    width: videoSize.width,
    height: videoSize.height
)
// Round the render size down to even dimensions; odd widths/heights can cause encoding artifacts
if parentFrame.size.width % 2 > 0 {
    parentFrame.size.width = parentFrame.size.width - 1
}
if parentFrame.size.height % 2 > 0 {
    parentFrame.size.height = parentFrame.size.height - 1
}
parentLayer.frame = parentFrame
videoLayer.frame = CGRect(
    x: 0,
    y: 0,
    width: videoSize.width,
    height: videoSize.height
)
parentLayer.addSublayer(videoLayer)

let videoComp = AVMutableVideoComposition()
videoComp.renderSize = parentLayer.frame.size
videoComp.frameDuration = CMTimeMake(1, Int32(clipVideoTrack.nominalFrameRate))
videoComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, inLayer: parentLayer)

let instruction = AVMutableVideoCompositionInstruction()
instruction.timeRange = CMTimeRange(start: kCMTimeZero, duration: mixComposition.duration)
let videoTrack = mixComposition.tracksWithMediaType(AVMediaTypeVideo)[0] as! AVAssetTrack
let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)

layerInstruction.setTransform(CGAffineTransformMakeScale(parentLayer.frame.size.width / videoSize.width, parentLayer.frame.size.height / videoSize.height), atTime: kCMTimeZero)
instruction.layerInstructions = [layerInstruction]
videoComp.instructions = [instruction]

// Export
let exportSession = AVAssetExportSession(
    asset: mixComposition,
    presetName: AVAssetExportPresetHighestQuality
)
exportSession.videoComposition = videoComp
let renderFileName = "video.mp4"
let renderURL = NSURL(fileURLWithPath: NSTemporaryDirectory().stringByAppendingPathComponent(renderFileName))
exportSession.outputURL = renderURL
exportSession.outputFileType = AVFileTypeQuickTimeMovie
exportSession.exportAsynchronouslyWithCompletionHandler { ... }

Answer

The solution for us was to stop trying to crop and trim the video in a single operation. I still don't have an answer as to why this was happening, but we were able to resolve it by first trimming the video to the desired time range, and then, once we had a video of the proper duration, performing the crop as a separate operation on that output.

Unfortunately I believe this is just a bug in the framework, but at least in our case we were able to work around it by doing less in each operation and simply chaining the operations together.
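
A minimal sketch of that two-pass approach, written in the same Swift 1.x style as the question. The first pass only trims (no videoComposition attached), and the second pass runs the crop/videoComposition export from the question against the trimmed file. The helper name trimAsset, the trimmedURL, and the Bool completion are illustrative assumptions, not code from the original answer.

// Pass 1: trim only -- no videoComposition is attached, so nothing is re-rendered here.
// trimAsset, trimmedURL and the Bool completion are hypothetical names.
func trimAsset(asset: AVURLAsset, start: CMTime, duration: CMTime, toURL url: NSURL, completion: (Bool) -> Void) {
    let composition = AVMutableComposition()
    let videoTrack = composition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
    let audioTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
    let range = CMTimeRange(start: start, duration: duration)
    let sourceVideo = asset.tracksWithMediaType(AVMediaTypeVideo)[0] as! AVAssetTrack
    let sourceAudio = asset.tracksWithMediaType(AVMediaTypeAudio)[0] as! AVAssetTrack
    videoTrack.insertTimeRange(range, ofTrack: sourceVideo, atTime: kCMTimeZero, error: nil)
    audioTrack.insertTimeRange(range, ofTrack: sourceAudio, atTime: kCMTimeZero, error: nil)

    let export = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetHighestQuality)
    export.outputURL = url
    export.outputFileType = AVFileTypeQuickTimeMovie
    export.exportAsynchronouslyWithCompletionHandler {
        completion(export.status == .Completed)
    }
}

// Pass 2: once the trimmed file exists, run the crop/videoComposition export from the
// question against the trimmed asset instead of the original, with the instruction's
// timeRange covering the whole (already trimmed) asset.
trimAsset(videoAsset, start: start, duration: dur, toURL: trimmedURL) { succeeded in
    if succeeded {
        let trimmedAsset = AVURLAsset(URL: trimmedURL, options: nil)
        // ... build the crop composition and AVAssetExportSession as above ...
    }
}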
