iOS: AVPlayer - getting a snapshot of the current frame of a video
Question
I have spent the whole day and went through a lot of SO answers, Apple references, documentation, etc., but no success.
I want a simple thing: I am playing a video using AVPlayer and I want to pause it and get the current frame as a UIImage. That's it.
My video is an m3u8 file located on the internet; it plays normally in an AVPlayerLayer without any problems.
What I have tried:
- AVAssetImageGenerator. It is not working: the method copyCGImageAtTime:actualTime:error: returns a null image ref. According to the answer here, AVAssetImageGenerator doesn't work for streaming videos.
- Taking a snapshot of the player view. I first tried renderInContext: on the AVPlayerLayer, but then I realized that it does not render this kind of "special" layer. Then I found a new method introduced in iOS 7, drawViewHierarchyInRect:afterScreenUpdates:, which should be able to render the special layers too, but no luck - I still got a UI snapshot with a blank black area where the video is shown.
- AVPlayerItemVideoOutput. I added a video output for my AVPlayerItem, but whenever I call hasNewPixelBufferForItemTime: it returns NO. I guess the problem is again the streaming video, and I am not alone with this problem.
- AVAssetReader. I was thinking of trying it, but decided not to lose time after finding a related question here.
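For reference, the view-snapshot attempt described in the second bullet typically looks like the following sketch (the `playerView` property holding the view backed by the AVPlayerLayer is an assumption, not from the original post); as the question notes, for AVPlayerLayer content this tends to capture a black rectangle:

```objectivec
// Sketch of the drawViewHierarchyInRect:afterScreenUpdates: attempt.
// Assumes a hypothetical `playerView` property containing the view
// whose layer hierarchy includes the AVPlayerLayer.
- (UIImage *)snapshotOfPlayerView {
    UIGraphicsBeginImageContextWithOptions(self.playerView.bounds.size, NO, 0.0);
    // For video layers this usually renders a black area, not the frame.
    [self.playerView drawViewHierarchyInRect:self.playerView.bounds
                          afterScreenUpdates:YES];
    UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return snapshot;
}
```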
So isn't there any way to get a snapshot of what I am seeing on the screen right now anyway? I can't believe it.
Solution
AVPlayerItemVideoOutput works fine for me from an m3u8. Maybe it's because I don't consult hasNewPixelBufferForItemTime and simply call copyPixelBufferForItemTime? This code produces a CVPixelBuffer instead of a UIImage, but there are answers that describe how to do that.
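One common way to do that conversion (a sketch under the assumption that Core Image is acceptable; this helper is not part of the original answer) is to wrap the pixel buffer in a CIImage and render it through a CIContext:

```objectivec
// Convert the CVPixelBuffer returned by copyPixelBufferForItemTime:
// into a UIImage via Core Image (requires CoreImage.framework).
- (UIImage *)imageFromPixelBuffer:(CVPixelBufferRef)pixelBuffer {
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGRect rect = CGRectMake(0, 0,
                             CVPixelBufferGetWidth(pixelBuffer),
                             CVPixelBufferGetHeight(pixelBuffer));
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:rect];
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage); // createCGImage:fromRect: returns a +1 reference
    return image;
}
```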
This answer is mostly taken from here.
#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController ()

@property (nonatomic) AVPlayer *player;
@property (nonatomic) AVPlayerItem *playerItem;
@property (nonatomic) AVPlayerItemVideoOutput *playerOutput;

@end

@implementation ViewController

- (void)setupPlayerWithLoadedAsset:(AVAsset *)asset {
    NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    self.playerOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
    self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
    [self.playerItem addOutput:self.playerOutput];
    self.player = [AVPlayer playerWithPlayerItem:self.playerItem];

    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    playerLayer.frame = self.view.frame;
    [self.view.layer addSublayer:playerLayer];
    [self.player play];
}

- (IBAction)grabFrame {
    CVPixelBufferRef buffer = [self.playerOutput copyPixelBufferForItemTime:[self.playerItem currentTime] itemTimeForDisplay:nil];
    NSLog(@"The image: %@", buffer);
    if (buffer) {
        CVBufferRelease(buffer); // copyPixelBufferForItemTime: returns a +1 reference
    }
}

- (void)viewDidLoad {
    [super viewDidLoad];

    NSURL *someUrl = [NSURL URLWithString:@"http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8"];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:someUrl options:nil];
    [asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
        NSError *error = nil;
        AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
        if (status == AVKeyValueStatusLoaded) {
            dispatch_async(dispatch_get_main_queue(), ^{
                [self setupPlayerWithLoadedAsset:asset];
            });
        } else {
            NSLog(@"%@ Failed to load the tracks.", self);
        }
    }];
}

@end