Capture still UIImage without compression (from CMSampleBufferRef)?


Problem description

I need to obtain a UIImage from the uncompressed image data in a CMSampleBufferRef. I'm using this code:

[captureStillImageOutput captureStillImageAsynchronouslyFromConnection:connection
    completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
    // that famous function from Apple docs found on a lot of websites
    // does NOT work for still images
    UIImage *capturedImage = [self imageFromSampleBuffer:imageSampleBuffer];
}];
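
The connection argument here is typically obtained from the output itself; for reference, a minimal sketch, assuming captureStillImageOutput is already attached to a running session:

// Ask the still image output for its video connection
AVCaptureConnection *connection =
    [captureStillImageOutput connectionWithMediaType:AVMediaTypeVideo];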

http://developer.apple.com/library/ios/#qa/qa1702/_index.html is a link to the imageFromSampleBuffer function.

But it does not work properly. :(

There is a jpegStillImageNSDataRepresentation:imageSampleBuffer method, but it gives compressed data (well, because JPEG).
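
For comparison, that JPEG path looks roughly like this (a sketch; jpegStillImageNSDataRepresentation: expects a buffer captured with the default AVVideoCodecJPEG settings, and the result is already compressed, which is exactly what I want to avoid):

// Decodes already-compressed JPEG data back into a UIImage
NSData *jpegData =
    [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *compressedImage = [UIImage imageWithData:jpegData];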

How can I get a UIImage created from the most raw, non-compressed data after capturing a still image?

Maybe I should specify some settings for the video output? I'm currently using these:

captureStillImageOutput = [[AVCaptureStillImageOutput alloc] init];
captureStillImageOutput.outputSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

I've noticed that the output has a default value for AVVideoCodecKey, which is AVVideoCodecJPEG. Can it be avoided in any way, or does it even matter when capturing a still image?
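
One way to find out which uncompressed formats are available is to query the output; a sketch, assuming the captureStillImageOutput from above:

// Lists the CVPixelBuffer formats this output can deliver; setting one of
// them via kCVPixelBufferPixelFormatTypeKey replaces the default JPEG codec
for (NSNumber *format in captureStillImageOutput.availableImageDataCVPixelFormatTypes) {
    NSLog(@"Supported uncompressed pixel format: 0x%08x", [format unsignedIntValue]);
}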

I found something here: Raw image data from camera like "645 PRO", but I just need a UIImage, without using OpenCV or OGLES or other third-party frameworks.

Answer

The method imageFromSampleBuffer does work; in fact, I'm using a changed version of it. But if I remember correctly, you need to set the outputSettings right. I think you need to set the key kCVPixelBufferPixelFormatTypeKey with the value kCVPixelFormatType_32BGRA.

For example:

NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;                                 
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];                
NSDictionary* outputSettings = [NSDictionary dictionaryWithObject:value forKey:key];

[newStillImageOutput setOutputSettings:outputSettings];

Edit

I am using those settings to take still images, not video. Is your sessionPreset AVCaptureSessionPresetPhoto? There may be problems with that:

AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];
[newCaptureSession setSessionPreset:AVCaptureSessionPresetPhoto];
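
For completeness, the surrounding setup might look roughly like this (a sketch with error handling trimmed; the camera/input variable names are illustrative):

// Wire the default camera into the session alongside the still image output
AVCaptureDevice *camera =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *inputError = nil;
AVCaptureDeviceInput *cameraInput =
    [AVCaptureDeviceInput deviceInputWithDevice:camera error:&inputError];

if (cameraInput && [newCaptureSession canAddInput:cameraInput]) {
    [newCaptureSession addInput:cameraInput];
}
if ([newCaptureSession canAddOutput:newStillImageOutput]) {
    [newCaptureSession addOutput:newStillImageOutput];
}
[newCaptureSession startRunning];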

Edit 2

The part about saving it to a UIImage is identical to the one from the documentation. That's why I was asking about other origins of the problem, but I guess that was just grasping at straws. There is another way I know of, but it requires OpenCV.

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the base address so the pixel data can be read directly
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    // (assumes the buffer holds 32BGRA pixels, matching the outputSettings above)
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return image;
}
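
Called from the capture completion handler, usage would be roughly as follows (a sketch; it assumes the 32BGRA outputSettings above, since the bitmap context is created for BGRA data, and a connection obtained as in the question):

[newStillImageOutput captureStillImageAsynchronouslyFromConnection:connection
    completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
    if (imageSampleBuffer != NULL) {
        UIImage *capturedImage = [self imageFromSampleBuffer:imageSampleBuffer];
        // capturedImage now wraps the raw pixels, with no JPEG round trip
    }
}];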

I guess that is of no help to you, sorry. I don't know enough to think of other origins for your problem.
