iOS AVPlayer can't play 240 fps video


Question

I recorded a 240 fps video after changing the AVCaptureDeviceFormat. If I save that video to the photo library, the slow-mo effect is there. But if I play that file from the documents directory using an AVPlayer, I can't see the slow-mo effect.
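For context, switching to a 240 fps capture format typically looks something like the sketch below. This is not the asker's code; configureFor240fps and device are illustrative placeholders, with device assumed to be the active AVCaptureDevice:

import AVFoundation

// Sketch: pick the first format that supports >= 240 fps and lock the
// frame duration to 1/240 s. `device` is an assumed AVCaptureDevice.
func configureFor240fps(device: AVCaptureDevice) throws {
    for format in device.formats {
        for range in format.videoSupportedFrameRateRanges where range.maxFrameRate >= 240 {
            try device.lockForConfiguration()
            device.activeFormat = format
            device.activeVideoMinFrameDuration = CMTimeMake(1, 240)
            device.activeVideoMaxFrameDuration = CMTimeMake(1, 240)
            device.unlockForConfiguration()
            return
        }
    }
}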

The code used to play the video:

    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:[AVAsset assetWithURL:[NSURL fileURLWithPath:fullPath]]];
    AVPlayer *feedVideoPlayer = [AVPlayer playerWithPlayerItem:playerItem];

    AVPlayerViewController *playerController = [[AVPlayerViewController alloc] init];
    playerController.view.frame = CGRectMake(0, 0, videoPreviewView.frame.size.width, videoPreviewView.frame.size.height);
    playerController.player = feedVideoPlayer;


Answer

It's a bit annoying, but I believe you'll need to re-create the video in an AVComposition if you don't want to lose quality. I'd love to know if there is another way, but this is what I've come up with. You can technically export the video via AVAssetExportSession, but using the PassThrough quality will result in the same video file, which won't be slow motion; you'd need to transcode it, which loses quality (AFAIK; see Issue playing slow-mo AVAsset in AVPlayer for that solution).
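For reference, the passthrough export route mentioned above would look roughly like this (a minimal sketch; asset and outputURL are placeholders). Because passthrough copies the samples without re-encoding, the exported file plays back at normal speed, without the slow-mo ramp:

import AVFoundation

// Sketch: passthrough export copies samples as-is. `asset` and `outputURL`
// are assumed to be defined elsewhere.
let exporter = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetPassthrough)
exporter?.outputURL = outputURL
exporter?.outputFileType = AVFileTypeQuickTimeMovie
exporter?.exportAsynchronously {
    if exporter?.status == .completed {
        print("exported, but the file plays at normal speed")
    }
}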

The first thing you'll need to do is grab the source media's original time-mapping objects. You can do that like so:

import AVFoundation
import Photos

let options = PHVideoRequestOptions()
options.version = PHVideoRequestOptionsVersion.current
options.deliveryMode = .highQualityFormat

PHImageManager().requestAVAsset(forVideo: phAsset, options: options, resultHandler: { (avAsset, mix, info) in

    guard let avAsset = avAsset else { return }

    // Each track segment carries a CMTimeMapping that describes how source
    // time maps to presentation time; this is what encodes the slow-mo ramp.
    let originalTimeMaps = avAsset.tracks(withMediaType: AVMediaTypeVideo)
        .first?
        .segments
        .map { $0.timeMapping } ?? []
})
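Each CMTimeMapping pairs a source time range with a target time range; in a slowed section the target duration is longer than the source duration. A quick sanity check over the mappings (a sketch, assuming access to the originalTimeMaps array from the closure above) could look like:

// Sketch: verify that the mappings actually contain a slow-motion ramp.
for map in originalTimeMaps {
    let sourceSeconds = CMTimeGetSeconds(map.source.duration)
    let targetSeconds = CMTimeGetSeconds(map.target.duration)
    print("source: \(sourceSeconds)s -> target: \(targetSeconds)s",
          targetSeconds > sourceSeconds ? "(slowed)" : "(normal speed)")
}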

Once you have the timeMappings of the original media (the one sitting in your documents directory), you can pass in the URL of that media and the original CMTimeMapping objects that you would like to recreate, then create a new AVComposition that is ready to play in an AVPlayer. You'll need a class similar to this:

class CompositionMapper {

    let url: URL
    let timeMappings: [CMTimeMapping]

    init(for url: URL, with timeMappings: [CMTimeMapping]) {
        self.url = url
        self.timeMappings = timeMappings
    }

    init(with asset: AVAsset, and timeMappings: [CMTimeMapping]) {
        guard let asset = asset as? AVURLAsset else {
            print("cannot get a base URL from this asset.")
            fatalError()
        }

        self.timeMappings = timeMappings
        self.url = asset.url
    }

    func compose() -> AVComposition {
        let composition = AVMutableComposition(urlAssetInitializationOptions: [AVURLAssetPreferPreciseDurationAndTimingKey: true])

        let videoTrack = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
        let audioTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)

        let asset = AVAsset(url: url)
        guard let videoAssetTrack = asset.tracks(withMediaType: AVMediaTypeVideo).first else { return composition }

        // Rebuild one composition segment per original CMTimeMapping so the
        // source/target time ranges (the slow-mo ramp) are preserved.
        var segments: [AVCompositionTrackSegment] = []
        for map in timeMappings {
            let segment = AVCompositionTrackSegment(url: url, trackID: kCMPersistentTrackID_Invalid, sourceTimeRange: map.source, targetTimeRange: map.target)
            segments.append(segment)
        }

        videoTrack.preferredTransform = videoAssetTrack.preferredTransform
        videoTrack.segments = segments

        // Only map the audio if the source actually has an audio track.
        if let _ = asset.tracks(withMediaType: AVMediaTypeAudio).first {
            audioTrack.segments = segments
        }

        return composition.copy() as! AVComposition
    }
}

You can then use the compose() function of your CompositionMapper class to get an AVComposition that is ready to play in an AVPlayer and that should respect the CMTimeMapping objects you've passed in:

let compositionMapper = CompositionMapper(for: someAVAssetURL, with: originalTimeMaps)
let mappedComposition = compositionMapper.compose()

let playerItem = AVPlayerItem(asset: mappedComposition)
let player = AVPlayer(playerItem: playerItem)
// Varispeed plays audio at each segment's rate without pitch correction,
// which is what the retimed (slow-mo) segments need.
playerItem.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithmVarispeed

Let me know if you need help converting this to Objective-C, but it should be relatively straightforward.

