Why is Yp Cb Cr image buffer all shuffled in iOS 13?


Question


I have a computer vision app that takes grayscale images from a sensor and processes them. The image acquisition for iOS is written in Obj-C, and the image processing is performed in C++ using OpenCV. As I only need the luminance data, I acquire the image in YUV (or Yp Cb Cr) 420 bi-planar full range format and just assign the buffer's data to an OpenCV Mat object (see acquisition code below). This worked great so far, until the brand new iOS 13 came out... For some reason, on iOS 13 the image I obtain is misaligned, resulting in diagonal stripes. By looking at the image I obtain, I suspect this is the consequence of a change in the ordering of the buffer's Y, Cb, and Cr components, or of a change in the buffer's stride. Does anyone know if iOS 13 introduced this kind of change, and how I could update my code to avoid it, preferably in a backward-compatible manner?


Here is my image acquisition code:

//capture config
- (void)initialize {
    AVCaptureDevice *frontCameraDevice;
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if (device.position == AVCaptureDevicePositionFront) {
            frontCameraDevice = device;
        }
    }
    if (frontCameraDevice == nil) {
        NSLog(@"Front camera device not found");
        return;
    }

    _session = [[AVCaptureSession alloc] init];
    _session.sessionPreset = AVCaptureSessionPreset640x480;

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:frontCameraDevice error: &error];
    if (error != nil) {
        NSLog(@"Error getting front camera device input: %@", error);
    }
    if ([_session canAddInput:input]) {
        [_session addInput:input];
    } else {
        NSLog(@"Could not add front camera device input to session");
    }

    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    // This is the default, but making it explicit
    videoOutput.alwaysDiscardsLateVideoFrames = YES;

    if ([videoOutput.availableVideoCVPixelFormatTypes containsObject:
                      [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]]) {
        OSType format = kCVPixelFormatType_420YpCbCr8BiPlanarFullRange;
        videoOutput.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithUnsignedInt:format]
                                                                 forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    } else {
        NSLog(@"YUV format not available");
    }

    [videoOutput setSampleBufferDelegate:self queue:dispatch_queue_create("extrapage.camera.capture.sample.buffer.delegate", DISPATCH_QUEUE_SERIAL)];
    if ([_session canAddOutput:videoOutput]) {
        [_session addOutput:videoOutput];
    } else {
        NSLog(@"Could not add video output to session");
    }

    AVCaptureConnection *captureConnection = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
    captureConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
}

//acquisition code 
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

    if (_listener != nil) {
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        OSType format = CVPixelBufferGetPixelFormatType(pixelBuffer);

        NSAssert(format == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, @"Only YUV is supported");

        // The first plane / channel (at index 0) is the grayscale plane
        // See more information about the YUV format
        // http://en.wikipedia.org/wiki/YUV
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);
        void *baseaddress = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);

        CGFloat width = CVPixelBufferGetWidth(pixelBuffer);
        CGFloat height = CVPixelBufferGetHeight(pixelBuffer);

        cv::Mat frame(height, width, CV_8UC1, baseaddress, 0);

        [_listener onNewFrame:frame];

        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    }
}

Answer


I found the solution to this problem. It was a row stride issue: apparently, in iOS 13, the row stride of the Yp Cb Cr 4:2:0 8-bit bi-planar buffer was changed, perhaps so that it is always a power of 2. Therefore, in some cases the row stride is no longer the same as the width, which is what happened to me. The fix is easy: just get the row stride from the buffer's info and pass it to the OpenCV Mat's constructor, as shown below.

void *baseaddress = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);         
size_t width = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);
size_t height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);
size_t bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);

cv::Mat frame(height, width, CV_8UC1, baseaddress, bytesPerRow);


Note that I also changed how I get the width and height, by using the dimensions of the plane instead of those of the whole buffer. For the Y plane, they should always be the same; I am not sure whether this makes a difference.


Also be careful: after updating Xcode to support the iOS 13 SDK, I had to uninstall my app from the test device, because otherwise Xcode kept running the old version instead of the newly compiled one.
