iOS 10.0 - 10.1: AVPlayerLayer doesn't show video after using AVVideoCompositionCoreAnimationTool, only audio

Problem Description

Here is a complete project if you care to run this yourself: https://www.dropbox.com/s/5p384mogjzflvqk/AVPlayerLayerSoundOnlyBug_iOS10.zip?dl=0

This is a new problem on iOS 10, and it has been fixed as of iOS 10.2. After exporting a video using AVAssetExportSession and AVVideoCompositionCoreAnimationTool to composite a layer on top of the video during export, videos played in an AVPlayerLayer fail to play. This doesn't seem to be caused by hitting the AV encode/decode pipeline limit, because it often happens after a single export, which as far as I know only spins up two pipelines: one for the AVAssetExportSession and another for the AVPlayer. I am also setting the layer's frame properly, as you can see by running the code below, which gives the layer a blue background you can plainly see.

After an export, waiting for some time before playing a video seems to make it far more reliable, but that's not really an acceptable workaround to tell your users about.
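
For what it's worth, that delay workaround amounts to something like the following sketch, where playVideo() is a hypothetical helper standing in for the playback code in step 2 below:

// Hedged sketch of the unreliable delay workaround: give the AV pipeline a
// moment to settle after the export before starting playback.
DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
    self.playVideo() // hypothetical helper wrapping the AVPlayerLayer setup
}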

Any ideas on what's causing this or how I can fix or work around it? Have I messed something up or missed an important step or detail? Any help or pointers to documentation are much appreciated.

import UIKit
import AVFoundation

/* After exporting an AVAsset using AVAssetExportSession with AVVideoCompositionCoreAnimationTool, we
 * will attempt to play a video using an AVPlayerLayer with a blue background.
 *
 * If you see the blue background and hear audio you're experiencing the missing-video bug. Otherwise
 * try hitting the button again.
 */

class ViewController: UIViewController {
    private var playerLayer: AVPlayerLayer?
    private let button = UIButton()
    private let indicator = UIActivityIndicatorView(activityIndicatorStyle: .gray)

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = UIColor.white
        button.setTitle("Cause Trouble", for: .normal)
        button.setTitleColor(UIColor.black, for: .normal)
        button.addTarget(self, action: #selector(ViewController.buttonTapped), for: .touchUpInside)
        view.addSubview(button)
        button.translatesAutoresizingMaskIntoConstraints = false
        NSLayoutConstraint.activate([
            button.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            button.bottomAnchor.constraint(equalTo: view.bottomAnchor, constant: -16),
        ])

        indicator.hidesWhenStopped = true
        view.insertSubview(indicator, belowSubview: button)
        indicator.translatesAutoresizingMaskIntoConstraints = false
        NSLayoutConstraint.activate([
            indicator.centerXAnchor.constraint(equalTo: button.centerXAnchor),
            indicator.centerYAnchor.constraint(equalTo: button.centerYAnchor),
        ])
    }

    func buttonTapped() {
        button.isHidden = true
        indicator.startAnimating()
        playerLayer?.removeFromSuperlayer()

        let sourcePath = Bundle.main.path(forResource: "video.mov", ofType: nil)!
        let sourceURL = URL(fileURLWithPath: sourcePath)
        let sourceAsset = AVURLAsset(url: sourceURL)

        //////////////////////////////////////////////////////////////////////
        // STEP 1: Export a video using AVVideoCompositionCoreAnimationTool //
        //////////////////////////////////////////////////////////////////////
        let exportSession = { () -> AVAssetExportSession in
            let sourceTrack = sourceAsset.tracks(withMediaType: AVMediaTypeVideo).first!

            let parentLayer = CALayer()
            parentLayer.frame = CGRect(origin: .zero, size: CGSize(width: 1280, height: 720))
            let videoLayer = CALayer()
            videoLayer.frame = parentLayer.bounds
            parentLayer.addSublayer(videoLayer)

            let composition = AVMutableVideoComposition(propertiesOf: sourceAsset)
            composition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)
            let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: sourceTrack)
            layerInstruction.setTransform(sourceTrack.preferredTransform, at: kCMTimeZero)
            let instruction = AVMutableVideoCompositionInstruction()
            instruction.timeRange = CMTimeRange(start: kCMTimeZero, duration: sourceAsset.duration)
            instruction.layerInstructions = [layerInstruction]
            composition.instructions = [instruction]

            let e = AVAssetExportSession(asset: sourceAsset, presetName: AVAssetExportPreset1280x720)!
            e.videoComposition = composition
            e.outputFileType = AVFileTypeQuickTimeMovie
            e.timeRange = CMTimeRange(start: kCMTimeZero, duration: sourceAsset.duration)
            let outputURL = URL(fileURLWithPath: NSTemporaryDirectory().appending("/out2.mov"))
            _ = try? FileManager.default.removeItem(at: outputURL)
            e.outputURL = outputURL
            return e
        }()

        print("Exporting asset...")
        exportSession.exportAsynchronously {
            assert(exportSession.status == .completed)

            //////////////////////////////////////////////
            // STEP 2: Play a video in an AVPlayerLayer //
            //////////////////////////////////////////////
            DispatchQueue.main.async {
                // Reuse player layer, shouldn't be hitting the AV pipeline limit
                let playerItem = AVPlayerItem(asset: sourceAsset)
                let layer = self.playerLayer ?? AVPlayerLayer()
                if layer.player == nil {
                    layer.player = AVPlayer(playerItem: playerItem)
                }
                else {
                    layer.player?.replaceCurrentItem(with: playerItem)
                }
                layer.backgroundColor = UIColor.blue.cgColor
                if UIDeviceOrientationIsPortrait(UIDevice.current.orientation) {
                    layer.frame = self.view.bounds
                    layer.bounds.size.height = layer.bounds.width * 9.0 / 16.0
                }
                else {
                    layer.frame = self.view.bounds.insetBy(dx: 0, dy: 60)
                    layer.bounds.size.width = layer.bounds.height * 16.0 / 9.0
                }
                self.view.layer.insertSublayer(layer, at: 0)
                self.playerLayer = layer

                layer.player?.play()
                print("Playing a video in an AVPlayerLayer...")

                self.button.isHidden = false
                self.indicator.stopAnimating()
            }
        }
    }
}


Recommended Answer

The answer for me in this case is to work around the issue with AVVideoCompositionCoreAnimationTool by using a custom video compositing class implementing the AVVideoCompositing protocol, and a custom composition instruction implementing the AVVideoCompositionInstruction protocol. Because I need to overlay a CALayer on top of the video, I'm including that layer in the composition instruction instance.

You need to set the custom compositor on your video composition like so:

composition.customVideoCompositorClass = CustomVideoCompositor.self

and then set your custom instructions on it:

let instruction = CustomVideoCompositionInstruction(...) // whatever parameters you need and are required by the instruction protocol
composition.instructions = [instruction]
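
A minimal sketch of what such an instruction class might look like, carrying the overlay layer as described above. The class name, initializer, and overlayLayer property are illustrative assumptions, not the original answerer's code:

import AVFoundation

// Hedged sketch: a custom instruction conforming to the AVVideoCompositionInstruction
// protocol (imported into Swift as AVVideoCompositionInstructionProtocol). The
// overlayLayer property carries the CALayer so the custom compositor can render
// it over each frame.
class CustomVideoCompositionInstruction: NSObject, AVVideoCompositionInstructionProtocol {
    let timeRange: CMTimeRange
    let enablePostProcessing = false
    let containsTweening = true
    let requiredSourceTrackIDs: [NSValue]?
    // An invalid passthrough track ID ensures the custom compositor is invoked.
    let passthroughTrackID = kCMPersistentTrackID_Invalid

    // The layer to composite on top of the video.
    let overlayLayer: CALayer

    init(overlayLayer: CALayer, trackID: CMPersistentTrackID, timeRange: CMTimeRange) {
        self.overlayLayer = overlayLayer
        self.timeRange = timeRange
        self.requiredSourceTrackIDs = [NSNumber(value: trackID)]
        super.init()
    }
}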

Edit: Here is a working example of how to use a custom compositor to overlay a layer on a video using the GPU: https://github.com/samsonjs/LayerVideoCompositor ... original answer continues below

As for the compositor itself, you can implement one if you watch the relevant WWDC sessions and check out their sample code. I cannot post the one I wrote here, but I am using CoreImage to do the heavy lifting in processing each AVAsynchronousVideoCompositionRequest, making sure to use an OpenGL CoreImage context for best performance (if you do it on the CPU it will be abysmally slow). You may also need an autorelease pool if you get a memory usage spike during the export.

If you're overlaying a CALayer like me, then make sure to set layer.isGeometryFlipped = true when you render that layer out to a CGImage before sending it off to CoreImage. And make sure you cache the rendered CGImage from frame to frame in your compositor.
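
Putting those pieces together, a rough sketch of such a compositor might look like the following. This is a reconstruction under the constraints described above, not the code from the answer; it assumes the CustomVideoCompositionInstruction sketched earlier, and the linked LayerVideoCompositor repo has a complete, working implementation:

import UIKit
import AVFoundation
import CoreImage

// Hedged sketch of a custom compositor conforming to AVVideoCompositing.
// AVFoundation instantiates this class itself (via customVideoCompositorClass),
// which is why the overlay layer travels on the instruction, not the compositor.
class CustomVideoCompositor: NSObject, AVVideoCompositing {
    // An EAGL-backed CoreImage context keeps rendering on the GPU.
    // (Force-unwrapping the EAGLContext is a simplification for this sketch.)
    private let ciContext = CIContext(eaglContext: EAGLContext(api: .openGLES2)!)

    // Cache the rasterized overlay so the CALayer is rendered once, not per frame.
    private var cachedOverlay: CIImage?

    var sourcePixelBufferAttributes: [String : Any]? =
        [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
    var requiredPixelBufferAttributesForRenderContext: [String : Any] =
        [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {}

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        // The autorelease pool keeps memory from spiking across many frames.
        autoreleasepool {
            guard let instruction = request.videoCompositionInstruction as? CustomVideoCompositionInstruction,
                let trackID = instruction.requiredSourceTrackIDs?.first as? NSNumber,
                let sourceBuffer = request.sourceFrame(byTrackID: trackID.int32Value),
                let outputBuffer = request.renderContext.newPixelBuffer()
            else {
                request.finish(with: NSError(domain: "CustomVideoCompositor", code: -1, userInfo: nil))
                return
            }

            var outputImage = CIImage(cvPixelBuffer: sourceBuffer)
            if let overlay = overlayImage(for: instruction) {
                outputImage = overlay.compositing(over: outputImage)
            }
            ciContext.render(outputImage, to: outputBuffer)
            request.finish(withComposedVideoFrame: outputBuffer)
        }
    }

    // Rasterize the instruction's CALayer once, flipping its geometry so the
    // resulting CGImage isn't upside-down when handed to CoreImage, then cache it.
    private func overlayImage(for instruction: CustomVideoCompositionInstruction) -> CIImage? {
        if let cached = cachedOverlay { return cached }
        let layer = instruction.overlayLayer
        layer.isGeometryFlipped = true
        UIGraphicsBeginImageContextWithOptions(layer.bounds.size, false, 1)
        defer { UIGraphicsEndImageContext() }
        guard let cgContext = UIGraphicsGetCurrentContext() else { return nil }
        layer.render(in: cgContext)
        guard let cgImage = UIGraphicsGetImageFromCurrentImageContext()?.cgImage else { return nil }
        let overlay = CIImage(cgImage: cgImage)
        cachedOverlay = overlay
        return overlay
    }
}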
