Long delay before seeing video when AVPlayer created in exportAsynchronouslyWithCompletionHandler


Question

When playing a video exported from an AVAssetExportSession, you hear audio long before seeing video. Audio plays right away, but video only appears after the recording loops several times (i.e., starts and finishes). In other words, you hear audio from the video multiple times before seeing any images.

We are using AutoLayout on iOS 8.

Using the following test, we isolated the problem to exportAsynchronouslyWithCompletionHandler. In both code blocks, we play an existing video -- not one related to the export -- so the export process has been eliminated as a variable.

Code 1 plays both video & audio at the start whereas Code 2 only plays audio at the start and shows video after a delay of 10-60 seconds (after the video loops several times).

The only difference between the two code blocks is that one uses exportAsynchronouslyWithCompletionHandler to play the video while the other does not.

Help? Is it possible the audio gets exported first and is ready to play before the video? Something to do with the export happening on a different thread?

func initPlayer(videoURL: NSURL) {
    // Create player
    player = AVPlayer(URL: videoURL)
    let playerItem = player.currentItem
    let asset = playerItem.asset
    playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = videoView.frame
    view.layer.addSublayer(playerLayer)
    player.seekToTime(kCMTimeZero)
    player.actionAtItemEnd = .None
    player.play()

    // Get notified when video done for looping purposes
    NSNotificationCenter.defaultCenter().addObserver(self, selector: "playerItemDidReachEnd:", name: AVPlayerItemDidPlayToEndTimeNotification, object: playerItem)

    // Log status
    println("Initialized video player: \(CMTimeGetSeconds(asset.duration)) seconds & \(asset.tracks.count) tracks for \(videoURL)")
}

func playExistingVideo() {
    let filename = "/ChopsticksVideo.mp4"
    let allPaths = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)
    let docsPath = allPaths[0] as! NSString
    let exportPath = docsPath.stringByAppendingFormat(filename)
    let exportURL = NSURL.fileURLWithPath(exportPath as String)!

    initPlayer(exportURL)
}

Code 1:

    // Create exporter
    let exporter = AVAssetExportSession(asset: mainComposition, presetName: AVAssetExportPresetHighestQuality)
    exporter.videoComposition = videoComposition
    exporter.outputFileType = AVFileTypeMPEG4
    exporter.outputURL = exportURL
    exporter.shouldOptimizeForNetworkUse = true

    playExistingVideo()

Code 2:

    // Create exporter
    let exporter = AVAssetExportSession(asset: mainComposition, presetName: AVAssetExportPresetHighestQuality)
    exporter.videoComposition = videoComposition
    exporter.outputFileType = AVFileTypeMPEG4
    exporter.outputURL = exportURL
    exporter.shouldOptimizeForNetworkUse = true

    // -- Export video
    exporter.exportAsynchronouslyWithCompletionHandler({
        self.playExistingVideo()
    })

Solution

I'm going to suggest that the problem is here:

    // Create player
    player = AVPlayer(URL: videoURL)
    let playerItem = player.currentItem
    let asset = playerItem.asset
    playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = videoView.frame
    view.layer.addSublayer(playerLayer)
    player.seekToTime(kCMTimeZero)
    player.actionAtItemEnd = .None
    player.play()

You see, when you create an AVPlayer from a video URL, it comes into the world not yet ready to play. It can usually start playing audio quite quickly, but video takes longer to prepare. This could explain the delay in seeing anything.
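As a quick, hypothetical illustration of that gap, logging right after the question's player setup would typically show that neither the item nor the layer is ready yet:

    // Immediately after AVPlayer(URL:), the item is usually still .Unknown
    // and the layer cannot display a frame yet
    println("item ready: \(player.currentItem.status == .ReadyToPlay)")   // usually false here
    println("layer ready: \(playerLayer.readyForDisplay)")                // usually false here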

Well, instead of waiting for the video to be ready, you are just going ahead and saying play() immediately. My suggestion is to do what I explain in my book (that's a link to the actual code): create the player and the layer, but then set up KVO so that you are notified when the player is ready to display, and only then add the layer and start playing.
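A rough sketch of that idea, not the code from the book: it reuses the question's player, playerLayer and videoView properties, uses the layer's readyForDisplay property as the readiness signal via KVO, and only then attaches the layer and calls play(). (The exact observeValueForKeyPath signature varies between Swift versions; this is written against the Swift 1.x APIs the question already uses.)

func initPlayer(videoURL: NSURL) {
    player = AVPlayer(URL: videoURL)
    player.actionAtItemEnd = .None
    playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = videoView.frame
    // Don't add the layer or call play() yet; wait until it can show a frame
    playerLayer.addObserver(self, forKeyPath: "readyForDisplay", options: .New, context: nil)
}

override func observeValueForKeyPath(keyPath: String, ofObject object: AnyObject,
        change: [NSObject : AnyObject], context: UnsafeMutablePointer<Void>) {
    if keyPath == "readyForDisplay" {
        if playerLayer.readyForDisplay {
            playerLayer.removeObserver(self, forKeyPath: "readyForDisplay")
            // KVO callbacks are not guaranteed to arrive on the main thread,
            // so hop over before touching the layer hierarchy
            dispatch_async(dispatch_get_main_queue()) {
                self.view.layer.addSublayer(self.playerLayer)
                self.player.play()
            }
        }
    } else {
        super.observeValueForKeyPath(keyPath, ofObject: object, change: change, context: context)
    }
}

The seekToTime call and the looping notification from the question's initPlayer are omitted here for brevity; the point is that the layer is only added once it has a frame to show, so audio and video start together.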

Also, I have one more suggestion. It seems to me that there is a danger that you are running that code, setting up your interface (with the layer) and saying play(), on a background thread. That is certain to cause delays of various kinds. You seem to be assuming that the completion handler from exportAsynchronouslyWithCompletionHandler: is being called on the main thread - and you are going straight ahead and calling the next method and so proceeding to set up your interface. That's a very risky assumption. In my experience you should never assume that any AVFoundation completion handler is on the main thread. You should be stepping out to the main thread with dispatch_async in your completion handler and proceeding only from there. If you look at the code I linked you to, you'll see I'm careful to do that.
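Applied to the question's Code 2, that might look roughly like this (the status check is an extra precaution, not something the original code had):

    exporter.exportAsynchronouslyWithCompletionHandler({
        // The completion handler may run on a background queue; get back
        // onto the main thread before doing any UI or playback setup
        dispatch_async(dispatch_get_main_queue()) {
            if exporter.status == .Completed {
                self.playExistingVideo()
            } else {
                println("Export failed: \(exporter.error)")
            }
        }
    })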
