AVPlayer not rendering to its AVPlayerLayer
Question
I have an AVPlayerLayer (a subclass of CALayer) and I need to get it into an image type that can be passed to a QCRenderer (QCRenderer accepts NSImages and CIImages). I can convert the CALayer to a CGImageRef, and that to an NSImage, but the contents are always clear.
I've narrowed it down to one of two reasons:
- I'm not creating the NSImage correctly.
- The AVPlayer is not rendering to the AVPlayerLayer.
I am not receiving any errors, and I have found some documentation on converting CALayers. Also, I added the AVPlayerLayer to an NSView, which remains empty, so I believe reason 2 is the problem.
I'm using a modified version of Apple's AVPlayerDemo's AVPlayerDemoPlaybackViewController. I turned it into an NSObject since I stripped all of the interface code out of it.
I create the AVPlayerLayer in the (void)prepareToPlayAsset:withKeys: method when I create the AVPlayer. (I'm only adding the layer to an NSView to test whether it is working.)
if (![self player])
{
    /* Get a new AVPlayer initialized to play the specified player item. */
    [self setPlayer:[AVPlayer playerWithPlayerItem:self.mPlayerItem]];

    /* Observe the AVPlayer "currentItem" property to find out when any
       AVPlayer replaceCurrentItemWithPlayerItem: replacement will/did occur. */
    [self.player addObserver:self
                  forKeyPath:kCurrentItemKey
                     options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                     context:AVPlayerDemoPlaybackViewControllerCurrentItemObservationContext];

    mPlaybackView = [AVPlayerLayer playerLayerWithPlayer:self.player];
    [self.theView setWantsLayer:YES];
    [mPlaybackView setFrame:self.theView.layer.bounds];
    [self.theView.layer addSublayer:mPlaybackView];
}
I then schedule an NSTimer on the current run loop to grab a frame of the AVPlayerLayer 30 times per second:
/* Note: the interval must be 1.0/30.0 -- the integer expression (1/30)
   evaluates to 0, so the timer would fire as fast as possible. */
framegrabTimer = [NSTimer timerWithTimeInterval:(1.0/30.0)
                                         target:self
                                       selector:@selector(grabFrameFromMovie)
                                       userInfo:nil
                                        repeats:YES];
[[NSRunLoop currentRunLoop] addTimer:framegrabTimer forMode:NSDefaultRunLoopMode];
Here is the code I use to grab the frame and pass it to the class that handles the QCRenderer:
- (void)grabFrameFromMovie {
    CGColorSpaceRef colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);
    CGContextRef theContext = CGBitmapContextCreate(NULL,
                                                    mPlaybackView.frame.size.width,
                                                    mPlaybackView.frame.size.height,
                                                    8,
                                                    4 * mPlaybackView.frame.size.width,
                                                    colorSpace,
                                                    kCGImageAlphaPremultipliedLast);
    [mPlaybackView renderInContext:theContext];
    CGImageRef CGImage = CGBitmapContextCreateImage(theContext);
    NSImage *image = [[NSImage alloc] initWithCGImage:CGImage
                                                 size:NSMakeSize(mPlaybackView.frame.size.width,
                                                                 mPlaybackView.frame.size.height)];
    [[NSNotificationCenter defaultCenter] postNotificationName:@"AVPlayerLoadedNewFrame"
                                                        object:[image copy]];
    CGContextRelease(theContext);
    CGColorSpaceRelease(colorSpace);
    CGImageRelease(CGImage);
}
I can't figure out why I'm only getting a clear image. Any help with this is greatly appreciated, as there is not much AVFoundation documentation available for OS X.
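For context on why the layer comes out clear: renderInContext: only captures what a layer draws through Core Graphics, and AVPlayerLayer composites its video frames in hardware, outside that path. A common alternative (not part of the original post) is AVPlayerItemVideoOutput, available on OS X 10.8+, which vends the decoded frames directly from the player item. A sketch, assuming the same self.player and timer callback from the question:

```objectivec
// Setup, once, after the player item is ready (hypothetical placement):
NSDictionary *attrs = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVPlayerItemVideoOutput *videoOutput =
    [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attrs];
[self.player.currentItem addOutput:videoOutput];

// Then, inside the 30 fps timer callback:
CMTime now = [videoOutput itemTimeForHostTime:CACurrentMediaTime()];
if ([videoOutput hasNewPixelBufferForItemTime:now]) {
    CVPixelBufferRef buffer = [videoOutput copyPixelBufferForItemTime:now
                                                   itemTimeForDisplay:NULL];
    // QCRenderer accepts CIImages, so the pixel buffer can be wrapped directly.
    CIImage *frame = [CIImage imageWithCVImageBuffer:buffer];
    // ... hand `frame` to the QCRenderer here ...
    CVBufferRelease(buffer); // copyPixelBufferForItemTime: returns a +1 reference
}
```

This avoids touching the layer at all, so it works even when the AVPlayerLayer itself is never added to a view.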
Answer
This works for me:
AVAssetImageGenerator *gen =
    [[AVAssetImageGenerator alloc] initWithAsset:[[[self player] currentItem] asset]];
CGImageRef capture = [gen copyCGImageAtTime:self.player.currentTime
                                 actualTime:NULL
                                      error:NULL];
NSImage *img = [[NSImage alloc] initWithCGImage:capture
                                           size:self.playerView.frame.size];
CGImageRelease(capture); // copyCGImageAtTime: returns a +1 reference
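One caveat when using this answer in a 30 fps grab loop: by default AVAssetImageGenerator may return a frame near the requested time (e.g. the closest keyframe) rather than the exact one. Tightening the tolerances is slower but frame-accurate; a small tweak, assuming the gen instance from the answer is reused:

```objectivec
// Request exact frames instead of the nearest keyframe (slower per grab):
gen.requestedTimeToleranceBefore = kCMTimeZero;
gen.requestedTimeToleranceAfter  = kCMTimeZero;
```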