Switching AVCaptureSession preset when capturing a photo


Question

My current setup is as follows (based on the ColorTrackingCamera project from Brad Larson):

I'm using an AVCaptureSession set to AVCaptureSessionPreset640x480, whose output I run through an OpenGL scene as a texture. This texture is then manipulated by a fragment shader.

I'm in need of this "lower quality" preset because I want to preserve a high framerate when the user is previewing. I then want to switch to a higher quality output when the user captures a still photo.

First I thought I could change the sessionPreset on the AVCaptureSession, but this forces the camera to refocus, which breaks usability.

[captureSession beginConfiguration];
captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
[captureSession commitConfiguration];
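(Editorial aside, not from the original post: one possible mitigation for the refocus problem is to lock the device's focus mode before switching presets, so the preset change does not trigger an autofocus pass. A hedged sketch, assuming `captureSession` is already configured:)

```objc
// Sketch (assumption, not the poster's code): lock focus before the preset
// switch so changing sessionPreset does not kick off a refocus.
NSError *error = nil;
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([device lockForConfiguration:&error]) {
    if ([device isFocusModeSupported:AVCaptureFocusModeLocked]) {
        device.focusMode = AVCaptureFocusModeLocked;
    }
    [device unlockForConfiguration];
}

[captureSession beginConfiguration];
captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
[captureSession commitConfiguration];
```

Whether this fully avoids the visible refocus depends on the device; it is a sketch of the approach, not a verified fix.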

Currently I'm trying to add a second AVCaptureStillImageOutput to the AVCaptureSession, but I'm getting an empty pixel buffer, so I think I'm kinda stuck.

Here's my session setup code:

...

// Add the video frame output
[captureSession beginConfiguration];

videoOutput = [[AVCaptureVideoDataOutput alloc] init];
[videoOutput setAlwaysDiscardsLateVideoFrames:YES];
[videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
[videoOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

if ([captureSession canAddOutput:videoOutput])
{
    [captureSession addOutput:videoOutput];
}
else
{
    NSLog(@"Couldn't add video output");
}

[captureSession commitConfiguration];



// Add still output
[captureSession beginConfiguration];
stillOutput = [[AVCaptureStillImageOutput alloc] init];

if([captureSession canAddOutput:stillOutput])
{
    [captureSession addOutput:stillOutput];
}
else
{
    NSLog(@"Couldn't add still output");
}

[captureSession commitConfiguration];



// Start capturing
[captureSession setSessionPreset:AVCaptureSessionPreset640x480];
if(![captureSession isRunning])
{
    [captureSession startRunning];
};

...

Here's my capture method:

- (void)prepareForHighResolutionOutput
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    [stillOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:
     ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
         CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(imageSampleBuffer);
         CVPixelBufferLockBaseAddress(pixelBuffer, 0);
         int width = CVPixelBufferGetWidth(pixelBuffer);
         int height = CVPixelBufferGetHeight(pixelBuffer);

         NSLog(@"%i x %i", width, height);
         CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
     }];
}

(Both width and height turn out to be 0.)

I've read through the AVFoundation documentation, but it seems I'm missing something essential.

Answer

I found the solution for my specific problem. I hope it can be used as a guide if someone stumbles upon the same problem.

The reason the framerate dropped significantly had to do with an internal conversion between pixel formats. After setting the pixel format explicitly, the framerate increased.

In my situation, I was creating a BGRA texture with the following method:

// Let Core Video create the OpenGL texture from pixelbuffer
CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, videoTextureCache, pixelBuffer, NULL,
                                                            GL_TEXTURE_2D, GL_RGBA, width, height, GL_BGRA,
                                                            GL_UNSIGNED_BYTE, 0, &videoTexture);

So when I set up the AVCaptureStillImageOutput instance, I changed my code to:

// Add still output
stillOutput = [[AVCaptureStillImageOutput alloc] init];
[stillOutput setOutputSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];

if([captureSession canAddOutput:stillOutput])
{
    [captureSession addOutput:stillOutput];
}
else
{
    NSLog(@"Couldn't add still output");
}
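(Editorial aside, not from the original post: with the BGRA output settings above, the completion handler's pixel buffer is no longer empty, and it can be turned into a UIImage with Core Graphics. A hedged sketch of a helper you might call from inside the completion handler; the method name `imageFromPixelBuffer:` is hypothetical:)

```objc
// Sketch: convert a kCVPixelFormatType_32BGRA pixel buffer into a UIImage.
// BGRA corresponds to little-endian 32-bit with premultiplied alpha first.
- (UIImage *)imageFromPixelBuffer:(CVImageBufferRef)pixelBuffer
{
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    void *base = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t width = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(base, width, height, 8, bytesPerRow,
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little |
                                                 kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *image = [UIImage imageWithCGImage:cgImage];

    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    return image;
}
```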

I hope this helps someone someday ;)
