Best path from AVPlayerItemVideoOutput to openGL Texture

Problem Description

Been pulling my hair out trying to figure out the current best path from AVFoundation videos to an openGLTexture, most of what I find is related to iOS, and I can't seem to make it work well in OSX.

First of all, this is how I set up the videoOutput:

NSDictionary *pbOptions = [NSDictionary dictionaryWithObjectsAndKeys:
                           [NSNumber numberWithInt:kCVPixelFormatType_422YpCbCr8], kCVPixelBufferPixelFormatTypeKey,
                           [NSDictionary dictionary], kCVPixelBufferIOSurfacePropertiesKey,
                           nil];
self.playeroutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:pbOptions];
self.playeroutput.suppressesPlayerRendering = YES;

I'm attempting three different solutions, of which only one seems to work consistently, but not sure it's the fastest. One works for a little, then breaks down with frames jumping all over the place, and one just produces black.

First off, the working solution, using glTexImage2D:

- (BOOL)renderWithCVPixelBufferForTime:(NSTimeInterval)time
{
    CMTime vTime = [self.playeroutput itemTimeForHostTime:CACurrentMediaTime()];

    if ([self.playeroutput hasNewPixelBufferForItemTime:vTime]) {
        if (_cvPixelBufferRef) {
            CVPixelBufferUnlockBaseAddress(_cvPixelBufferRef, kCVPixelBufferLock_ReadOnly);
            CVPixelBufferRelease(_cvPixelBufferRef);
        }
        _cvPixelBufferRef = [self.playeroutput copyPixelBufferForItemTime:vTime itemTimeForDisplay:NULL];

        CVPixelBufferLockBaseAddress(_cvPixelBufferRef, kCVPixelBufferLock_ReadOnly);
        GLsizei texWidth  = (GLsizei)CVPixelBufferGetWidth(_cvPixelBufferRef);
        GLsizei texHeight = (GLsizei)CVPixelBufferGetHeight(_cvPixelBufferRef);
        GLvoid *baseAddress = CVPixelBufferGetBaseAddress(_cvPixelBufferRef);

        glBindTexture(GL_TEXTURE_RECTANGLE_ARB, self.textureName);
        glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_STORAGE_HINT_APPLE, GL_STORAGE_CACHED_APPLE);
        glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGB, texWidth, texHeight, 0, GL_YCBCR_422_APPLE, GL_UNSIGNED_SHORT_8_8_APPLE, baseAddress);

        glBindTexture(GL_TEXTURE_RECTANGLE_ARB, 0);
    }
    return YES;
}

This method spends most of its time locking the base address of the pixel buffer, but the docs say that isn't required when the data is accessed from the GPU, and that it can impair performance. I could not figure out a way to get a texture without locking.

Next up, the almost-working solution using IOSurface. This works for a bit, then gets really glitchy, as if IOSurfaces from previous frames are being reused:

- (BOOL)renderWithIOSurfaceForTime:(NSTimeInterval)time
{
    CMTime vTime = [self.playeroutput itemTimeForHostTime:CACurrentMediaTime()];

    if ([self.playeroutput hasNewPixelBufferForItemTime:vTime]) {
        CVPixelBufferRef pb = [self.playeroutput copyPixelBufferForItemTime:vTime itemTimeForDisplay:NULL];
        IOSurfaceRef newSurface = CVPixelBufferGetIOSurface(pb);
        if (_surfaceRef != newSurface) {
            IOSurfaceDecrementUseCount(_surfaceRef);
            _surfaceRef = newSurface;
            IOSurfaceIncrementUseCount(_surfaceRef);
            GLsizei texWidth  = (GLsizei)IOSurfaceGetWidth(_surfaceRef);
            GLsizei texHeight = (GLsizei)IOSurfaceGetHeight(_surfaceRef);
            size_t  rowbytes  = CVPixelBufferGetBytesPerRow(pb); // query the buffer we just copied; currently unused

            glBindTexture(GL_TEXTURE_RECTANGLE_ARB, self.textureName);
            CGLTexImageIOSurface2D(cgl_ctx, GL_TEXTURE_RECTANGLE_ARB, GL_RGB8, texWidth, texHeight, GL_YCBCR_422_APPLE, GL_UNSIGNED_SHORT_8_8_APPLE, _surfaceRef, 0);
            glBindTexture(GL_TEXTURE_RECTANGLE_ARB, 0);
        }
        CVPixelBufferRelease(pb);
    }
    return YES;
}

This seems like it would be the best solution, if it worked. I have another process that creates textures from IOSurfaces, and it works just fine while being extremely fast too.

Finally, the approach that seems recommended for iOS is to use a CVOpenGLTextureCache. The implementation on OSX seems slightly different, and I could not get it to render anything but black; it also seemed even slower than the first solution...

- (BOOL)renderByCVOpenGLTextureCacheForTime:(NSTimeInterval)time
{
    CMTime vTime = [self.playeroutput itemTimeForHostTime:CACurrentMediaTime()];

    if ([self.playeroutput hasNewPixelBufferForItemTime:vTime]) {
        _cvPixelBufferRef = [self.playeroutput copyPixelBufferForItemTime:vTime itemTimeForDisplay:NULL];
        if (!_textureCacheRef) {
            CVReturn error = CVOpenGLTextureCacheCreate(kCFAllocatorDefault, NULL, cgl_ctx, CGLGetPixelFormat(cgl_ctx), NULL, &_textureCacheRef);
            if (error) {
                NSLog(@"Texture cache create failed");
            }
        }

        // Release the previous frame's texture and flush the cache so it can
        // recycle entries; otherwise every frame leaks a CVOpenGLTextureRef.
        if (_textureRef) {
            CVOpenGLTextureRelease(_textureRef);
            _textureRef = NULL;
            CVOpenGLTextureCacheFlush(_textureCacheRef, 0);
        }

        CVReturn error = CVOpenGLTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _textureCacheRef, _cvPixelBufferRef, NULL, &_textureRef);
        if (error) {
            NSLog(@"Failed to copy video texture");
        }

        CVPixelBufferRelease(_cvPixelBufferRef);

        _textureName = CVOpenGLTextureGetName(_textureRef);
    }
    return YES;
}

Probably I'm not setting things up right; there's zero documentation for the texture cache on OSX.
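For reference, this is roughly how a CVOpenGLTextureCache seems to be driven per frame on OSX — a hedged sketch only, not confirmed as the fix here. It assumes `cache` was created once with CVOpenGLTextureCacheCreate against the current CGL context and that `pixelBuffer` is a freshly copied buffer. One common source of black output is sampling GL_TEXTURE_2D when the cache actually hands back a rectangle texture, so the target is queried rather than assumed:

```objc
// Sketch: one frame through the texture cache. `cache` and `pixelBuffer`
// are assumed to exist; the GL context is assumed to be current.
CVOpenGLTextureRef texture = NULL;
CVReturn err = CVOpenGLTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                          cache, pixelBuffer,
                                                          NULL, &texture);
if (err == kCVReturnSuccess) {
    // On OSX the cache typically returns GL_TEXTURE_RECTANGLE_ARB, so query
    // the target instead of assuming GL_TEXTURE_2D.
    GLenum target = CVOpenGLTextureGetTarget(texture);
    GLuint name   = CVOpenGLTextureGetName(texture);
    glBindTexture(target, name);
    // ... draw with the texture ...
    glBindTexture(target, 0);

    // Release after drawing, then flush so the cache can recycle entries.
    CVOpenGLTextureRelease(texture);
    CVOpenGLTextureCacheFlush(cache, 0);
}
```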

I've found it best to retain the CVPixelBufferRef between render cycles. As I understand it, the texture upload can run asynchronously with CGLTexImage2D, and I'm quite happy with that: several other objects may be rendered at the same time, and a CGLFlushDrawable is eventually called when the textures are finally drawn.
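That ordering might be sketched like this (the method and ivar names, including `_pendingPixelBuffer` and the two helper calls, are hypothetical and purely illustrative):

```objc
// Sketch: keep the CVPixelBufferRef alive until the drawable has been
// flushed, so an asynchronous texture upload never reads a recycled buffer.
- (void)drawFrame
{
    [self uploadLatestFrame];      // binds the texture, may retain a new buffer
    [self drawSceneWithTexture];   // other objects render in the same pass
    CGLFlushDrawable(cgl_ctx);     // this frame's GPU work is submitted here

    // Only now is it safe to drop the buffer backing the previous upload.
    if (_pendingPixelBuffer) {
        CVPixelBufferRelease(_pendingPixelBuffer);
        _pendingPixelBuffer = NULL;
    }
}
```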

Most of the Apple examples I find for video to openGL textures relate to iOS, and they split the texture in two to recombine in a shader, like in this example: https://developer.apple.com/library/ios/samplecode/GLCameraRipple/Listings/GLCameraRipple_main_m.html#//apple_ref/doc/uid/DTS40011222-GLCameraRipple_main_m-DontLinkElementID_11 . I couldn't adapt the code directly, as the texture cache has a different implementation on iOS.

So any pointers would be great. It seems like vital functionality, but the information I can find regarding AV Foundation and openGL on OSX seems very negative.

Update: updated the IOSurface code with use counts; it works slightly longer, but still glitches out eventually.

Recommended Answer

I'm starting on the same journey and know as much about OpenGL as I do about sheep farming, but I did notice that your pbOptions doesn't contain kCVPixelBufferOpenGLCompatibilityKey:

NSDictionary *pbOptions = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:kCVPixelFormatType_422YpCbCr8], kCVPixelBufferPixelFormatTypeKey,
    [NSDictionary dictionary], kCVPixelBufferIOSurfacePropertiesKey,
    [NSNumber numberWithBool:YES], kCVPixelBufferOpenGLCompatibilityKey, nil];

I'm requesting the pixel buffer as kCVPixelFormatType_32BGRA rather than component, and this works for me with local variables for _currentSurface (IOSurfaceRef), _textureName (GLuint), _sourceWidth (int) and _sourceHeight (int):

IOSurfaceRef newSurface = CVPixelBufferGetIOSurface(pixelBuffer);
if (_currentSurface != newSurface) {
    CGLContextObj cgl_ctx = (CGLContextObj)[[self openGLContext] CGLContextObj];
    [[self openGLContext] makeCurrentContext];

    IOSurfaceDecrementUseCount(_currentSurface);
    _currentSurface = newSurface;
    IOSurfaceIncrementUseCount(_currentSurface);
    GLsizei texWidth  = (GLsizei)IOSurfaceGetWidth(_currentSurface);
    GLsizei texHeight = (GLsizei)IOSurfaceGetHeight(_currentSurface);

    if (_sourceWidth == 0 && _sourceHeight == 0) {
        // used during drawing of texture
        _sourceWidth = texWidth;
        _sourceHeight = texHeight;
    }

    if (!_textureName) {
        GLuint name;
        glGenTextures(1, &name);
        _textureName = name;
    }

    glBindTexture(GL_TEXTURE_RECTANGLE_ARB, _textureName);
    CGLTexImageIOSurface2D(cgl_ctx, GL_TEXTURE_RECTANGLE_ARB, GL_RGBA, texWidth, texHeight, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, _currentSurface, 0);
    glBindTexture(GL_TEXTURE_RECTANGLE_ARB, 0);
}
