AVFoundation play consecutive video fragments

Problem Description

I am working on an iOS app that involves fetching video fragments that are part of a stream from a web server and playing them consecutively inside the app. After some research, I decided to use an AVQueuePlayer. Every time I fetch an MP4 file from the server and store it in an NSData object, I create an AVPlayerItem and append it to the queue. I also listen for the AVPlayerItemDidPlayToEndTimeNotification notification, in which I advance to the next item.

The issue I am facing is an annoying small lag every time I advance from one movie fragment to the next. I tried combining the fragments in iMovie and it was impossible to tell where one fragment ends and the next begins. How can I get rid of the small pause/lag between consecutive fragments?

Here is my code:

import UIKit
import MediaPlayer
import AVFoundation

class WatchStream: UIViewController, StreamManagerDelegate {

    var subscriber : Subscriber!

    //Model object responsible for fetching mp4 fragments
    var streamManager = StreamManager()

    var queue : AVQueuePlayer!

    override func viewDidLoad() {
        super.viewDidLoad()

        //Set up the manager
        streamManager.streamID = subscriber.streamid
        streamManager.delegate = self

        //Register for notification once movie player finished
        NSNotificationCenter.defaultCenter().addObserver(self, selector: "AVPlayerFinishedPlaying:", name: AVPlayerItemDidPlayToEndTimeNotification, object: nil)

        queue = AVQueuePlayer()
        var playerLayer = AVPlayerLayer(player: queue)
        playerLayer.frame = self.view.bounds
        self.view.layer.insertSublayer(playerLayer, below: self.navigationController!.navigationBar.layer)
    }

    //Delegate method notifying that a new fragment is ready to be watched
    func streamManagerLoadedStream(fragment: Int, manager: StreamManager) {
        var url = streamManager.fetchFragmentToPlay()
        if url == nil {return}

        var playerItem = AVPlayerItem(URL: url!)
        queue.insertItem(playerItem, afterItem: nil)
        queue.play()
    }

    //Method called once AVPlayerItemDidPlayToEndTimeNotification fires
    func AVPlayerFinishedPlaying(notification : NSNotification) {
        //We need to switch to the next item
        queue.advanceToNextItem()

        if queue.status == AVPlayerStatus.ReadyToPlay {
            queue.play()
        }
    }

}

Again, my issue is with advancing the AVQueuePlayer: it causes a lag that should not be there. The movie fragments are small (1-2 seconds each) and are supposed to play continuously, since they are parts of one stream. I tried using 2 AVQueuePlayers and 2 AVPlayerLayers, but it didn't resolve the issue.

I also tried using an MPMoviePlayerController and updating its contentURL every time it finished playing. The lag didn't go away.
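
For reference, the sketch below is a rough reconstruction of that attempt, not the original code; the fragmentURLs array of local file URLs (in playback order) is an assumption about where the downloaded fragments live.

import UIKit
import MediaPlayer

//Rough reconstruction of the MPMoviePlayerController attempt, not the original code.
//fragmentURLs (local file URLs in playback order) is assumed to be filled in elsewhere.
class WatchStreamMoviePlayer: UIViewController {

    var fragmentURLs = [NSURL]()
    var currentIndex = 0
    var moviePlayer : MPMoviePlayerController!

    override func viewDidLoad() {
        super.viewDidLoad()

        moviePlayer = MPMoviePlayerController(contentURL: fragmentURLs[currentIndex])
        moviePlayer.view.frame = self.view.bounds
        self.view.addSubview(moviePlayer.view)

        //Register for the notification fired when the current file finishes playing
        NSNotificationCenter.defaultCenter().addObserver(self, selector: "playbackFinished:", name: MPMoviePlayerPlaybackDidFinishNotification, object: moviePlayer)

        moviePlayer.play()
    }

    //Swap in the next fragment and restart playback; the player still tears down
    //and reloads between files, which is where the visible pause comes from
    func playbackFinished(notification : NSNotification) {
        currentIndex++
        if currentIndex < fragmentURLs.count {
            moviePlayer.contentURL = fragmentURLs[currentIndex]
            moviePlayer.play()
        }
    }
}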

Any clues?

Recommended Answer

Use an AVMutableComposition with an AVPlayer instead of an AVQueuePlayer. I use Objective-C, so my samples are not in Swift, but the Swift code will be very similar.

The basic logic is:

  1. Create an AVMutableComposition

composition = [AVMutableComposition new];

Add a single mutable track to it:

AVMutableCompositionTrack *track = [_composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

In a loop, create an AVAssetTrack for each of your video fragments:

AVURLAsset *asset = ...;
AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CMTimeRange timeRange = assetTrack.timeRange;

Add the fragment to your track for the exact time you want it played:

[track insertTimeRange:timeRange ofTrack:assetTrack atTime:time error:&error];

  2. Play your composition

AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
myPlayer = [[AVPlayer alloc] initWithPlayerItem:playerItem];

...
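
Putting those steps together, a minimal Swift sketch of the same idea could look like the following. It is written in the same pre-Swift 2 dialect as the question's code, and the fragmentURLs parameter (local file URLs of the downloaded fragments, in playback order) is an assumption rather than part of the original answer.

import AVFoundation

//Minimal sketch: stitch every downloaded fragment into one composition so the
//player sees a single continuous AVPlayerItem instead of a queue of items.
//fragmentURLs is a placeholder for wherever the fetched MP4 files are stored.
func makeCompositionPlayer(fragmentURLs: [NSURL]) -> AVPlayer {
    let composition = AVMutableComposition()
    let track = composition.addMutableTrackWithMediaType(AVMediaTypeVideo,
        preferredTrackID: kCMPersistentTrackID_Invalid)

    var insertTime = kCMTimeZero
    var insertError : NSError?

    for url in fragmentURLs {
        let asset = AVURLAsset(URL: url, options: nil)
        let videoTracks = asset.tracksWithMediaType(AVMediaTypeVideo)
        if videoTracks.count == 0 { continue }
        let assetTrack = videoTracks[0] as! AVAssetTrack

        //Append each fragment immediately after the previous one so the
        //composition plays back as one continuous movie
        track.insertTimeRange(assetTrack.timeRange,
            ofTrack: assetTrack,
            atTime: insertTime,
            error: &insertError)
        insertTime = CMTimeAdd(insertTime, assetTrack.timeRange.duration)
    }

    let playerItem = AVPlayerItem(asset: composition)
    return AVPlayer(playerItem: playerItem)
}

Attach the returned player to an AVPlayerLayer exactly as in viewDidLoad() above and call play(). Because all the fragments live inside a single AVPlayerItem, there is no item-to-item transition left for the player to pause on.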
