Crashing on video capture with AVFoundation

Problem description

I am trying to implement video capture in my app using AVFoundation. I have the following code under viewDidLoad:

session = [[AVCaptureSession alloc] init];
movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
videoInputDevice = [[AVCaptureDeviceInput alloc] init];
AVCaptureDevice *videoDevice = [self frontFacingCameraIfAvailable];

if (videoDevice)
{
    NSError *error;

    videoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

    if (!error)
    {
        if ([session canAddInput:videoInputDevice])
            [session addInput:videoInputDevice];
        else 
            NSLog (@"Couldn't add input.");
    }

}


AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
NSError *audioError = nil;
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&audioError];

if (audioInput)
{
    [session addInput:audioInput];
}

movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];

Float64 TotalSeconds = 35;          //Total seconds
int32_t preferredTimeScale = 30;    //Frames per second
CMTime maxDuration = CMTimeMakeWithSeconds(TotalSeconds, preferredTimeScale);   
movieFileOutput.maxRecordedDuration = maxDuration;

movieFileOutput.minFreeDiskSpaceLimit = 1024 * 1024;                        

if ([session canAddOutput:movieFileOutput])
    [session addOutput:movieFileOutput];

[session setSessionPreset:AVCaptureSessionPresetMedium];
if ([session canSetSessionPreset:AVCaptureSessionPreset640x480])        //Check size based configs are supported before setting them
    [session setSessionPreset:AVCaptureSessionPreset640x480];

[self cameraSetOutputProperties];


[session startRunning];

This code is in the implementation for the button that is supposed to start the capture, among other things:

NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"];
NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
NSFileManager *fileManager = [NSFileManager defaultManager];
if ([fileManager fileExistsAtPath:outputPath])
{
    NSError *error;
    if ([fileManager removeItemAtPath:outputPath error:&error] == NO)
    {
        //Error - handle if required
    }
}
[outputPath release];
//Start recording
[movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
[outputURL release];
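
For reference, startRecordingToOutputFileURL:recordingDelegate: reports completion asynchronously through the AVCaptureFileOutputRecordingDelegate protocol; a minimal sketch of that callback (the logging here is illustrative only) might look like this:

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    //Called when recording stops, either normally or because of an error
    if (error)
    {
        NSLog(@"Recording failed: %@", error);
        return;
    }
    NSLog(@"Finished recording to %@", outputFileURL);
}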

When I try to run it on a device, it crashes when I try to load the view that all of this is supposed to happen on. Xcode gives me "Thread 1: EXC_BAD_ACCESS (code=1, address=0x4)" at:

AVFoundation`-[AVCaptureDeviceInput _setDevice:]:
(stuff)
0x3793f608:  ldr    r0, [r1, r0]

The error is given at that last line. I assume this has something to do with the AVCaptureDeviceInput somewhere, but I am blanking on what it could be. Does anyone have any idea what I am missing here? Thanks.

After fiddling with breakpoints, I've figured out that the crash happens at this line:

AVCaptureDevice *videoDevice = [self frontFacingCameraIfAvailable];

So it's something to do with that method? Here's the implementation I have for it; maybe something is wrong there.

- (AVCaptureDevice *)frontFacingCameraIfAvailable
{
    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *captureDevice = nil;
    for (AVCaptureDevice *device in videoDevices)
    {
        if (device.position == AVCaptureDevicePositionFront)
        {
            captureDevice = device;
            break;
        }
    }

    //  couldn't find one on the front, so just get the default video device.
    if ( ! captureDevice)
    {
        captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    }

    return captureDevice;
}

Could it be that I'm using

AVCaptureDevice *videoDevice = [self frontFacingCameraIfAvailable];

and the 'self' is throwing a wrench in it somehow? I know that when creating a CALayer it is possible to do

CALayer *aLayer = [CALayer layer];

but I don't know the AVCaptureDevice equivalent of this, if there is one. I am not sure what else it could be, by all accounts my code seems fine and I've tried cleaning the project, restarting Xcode, restarting the computer, etc.
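
For reference, AVCaptureDevice instances are not created with alloc/init; they are obtained through class factory methods, e.g.:

// AVCaptureDevice objects are vended by the class rather than alloc/init'd.
AVCaptureDevice *defaultCamera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSArray *allCameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];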

Answer

I'm pretty sure the problem is that you're running the program on the simulator. The simulator can't use those resources.
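
One way to guard against that case (a minimal sketch, assuming the setup code above stays in viewDidLoad) is to bail out when no capture device is available, which is what happens on the simulator:

AVCaptureDevice *videoDevice = [self frontFacingCameraIfAvailable];
if (!videoDevice)
{
    //No camera available (e.g. running on the simulator) -- skip the capture setup
    NSLog(@"No video capture device available; not setting up the session.");
    return;
}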
