Pulling data from a CMSampleBuffer in order to create a deep copy


Problem description

I am trying to create a copy of a CMSampleBuffer as returned by captureOutput in an AVCaptureVideoDataOutputSampleBufferDelegate.

Since the CMSampleBuffers come from a preallocated pool of (15) buffers, if I hold on to a reference to them they cannot be reclaimed. This causes all remaining frames to be dropped.

To maintain optimal performance, some sample buffers directly reference pools of memory that may need to be reused by the device system and other capture inputs. This is frequently the case for uncompressed device native capture where memory blocks are copied as little as possible. If multiple sample buffers reference such pools of memory for too long, inputs will no longer be able to copy new samples into memory and those samples will be dropped.

If your application is causing samples to be dropped by retaining the provided CMSampleBufferRef objects for too long, but it needs access to the sample data for a long period of time, consider copying the data into a new buffer and then releasing the sample buffer (if it was previously retained) so that the memory it references can be reused.

Obviously I must copy the CMSampleBuffer but CMSampleBufferCreateCopy() will only create a shallow copy. Thus I conclude that I must use CMSampleBufferCreate(). I filled in the 12! parameters that the constructor needs but ran into the problem that my CMSampleBuffers do not contain a blockBuffer (not entirely sure what that is but it seems important).
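For orientation, here is a minimal inspection sketch (my addition, not part of the original question) showing where the payload of a captured frame actually lives: video frames from AVCaptureVideoDataOutput are normally backed by an image (pixel) buffer and CMSampleBufferGetDataBuffer returns nil, whereas a blockBuffer typically only shows up for compressed or audio samples.

import CoreMedia
import CoreVideo

// Hypothetical helper: report which backing store a sample buffer carries.
func describeBackingStore(of sampleBuffer: CMSampleBuffer) {
    if let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) {
        print("block buffer holding \(CMBlockBufferGetDataLength(blockBuffer)) bytes")
    } else if let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
        print("image buffer, \(CVPixelBufferGetWidth(imageBuffer)) x \(CVPixelBufferGetHeight(imageBuffer)) pixels")
    } else {
        print("no backing buffer")
    }
}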

This question has been asked several times but not answered.

"Deep Copy of CMImageBuffer or CVImageBuffer" and "Create a copy of CMSampleBuffer in Swift 2.0"

One possible answer is "I finally figured out how to use this to create a deep clone. All the copy methods reused the data in the heap, which would keep the AVCaptureSession locked. So I had to pull the data out into an NSMutableData object and then created a new sample buffer." credit to Rob on SO. However, I do not know how to do this correctly.
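For illustration only, here is a sketch of that idea (my own, not code from the linked answer): copy the block buffer's bytes into an owned Data value so the original sample buffer can be released back to the pool. Note that this only applies when the sample buffer is actually backed by a CMBlockBuffer which, as described below, is not the case for these capture frames.

import CoreMedia
import Foundation

// Hypothetical sketch: deep-copy the raw bytes of a block-buffer-backed sample.
func copyBlockBufferBytes(from sampleBuffer: CMSampleBuffer) -> Data? {
    guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return nil }
    let length = CMBlockBufferGetDataLength(blockBuffer)
    var data = Data(count: length)
    data.withUnsafeMutableBytes { (raw: UnsafeMutableRawBufferPointer) in
        if let base = raw.baseAddress {
            // Copies the (possibly non-contiguous) block buffer into our own storage.
            CMBlockBufferCopyDataBytes(blockBuffer, atOffset: 0, dataLength: length, destination: base)
        }
    }
    return data
}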

If you are interested, this is the output of print(sampleBuffer). There is no mention of a blockBuffer, i.e. CMSampleBufferGetDataBuffer returns nil. There is an imageBuffer, but creating a "copy" using CMSampleBufferCreateForImageBuffer does not seem to free the CMSampleBuffer either.


EDIT: Since this question was posted, I have been trying even more ways of copying the memory.

I did the same thing that user Kametrixom tried. This is my attempt at the same idea: first copy the CVPixelBuffer, then use CMSampleBufferCreateForImageBuffer to create the final sample buffer. However, this results in one of two errors:

  • An EXC_BAD_ACCESS on the memcpy instruction, i.e. a segfault from trying to access memory outside of the application's address space.
  • Or, the memory will copy successfully but CMSampleBufferCreateReadyWithImageBuffer() will fail with result code -12743, which "Indicates that the format of the given media does not match the given format description. For example, a format description paired with a CVImageBuffer that fails CMVideoFormatDescriptionMatchesImageBuffer."

You can see that both Kametrixom and I did use CMSampleBufferGetFormatDescription(sampleBuffer) to try to copy the source buffer's format description. Thus, I'm not sure why the format of the given media does not match the given format description.
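A plausible explanation (my addition, not part of the original question): CVPixelBufferCreate may allocate the copy with different padding or extended-pixel attributes than the camera's pool buffer, so a format description taken from the source buffer can legitimately fail CMVideoFormatDescriptionMatchesImageBuffer against the copy. A hedged sketch of the usual workaround is to derive a fresh format description from the copied pixel buffer instead of reusing the source's:

import CoreMedia

// Sketch, assuming `copiedPixelBuffer` is the deep-copied CVPixelBuffer and
// `sampleBuffer` is the original CMSampleBuffer from the delegate callback.
func wrapInSampleBuffer(_ copiedPixelBuffer: CVPixelBuffer,
                        timedLike sampleBuffer: CMSampleBuffer) -> CMSampleBuffer? {
    // Build a format description that matches the copy, not the source buffer.
    var formatDescription: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: nil,
                                                 imageBuffer: copiedPixelBuffer,
                                                 formatDescriptionOut: &formatDescription)

    // Reuse the original frame's timing information.
    var timing = CMSampleTimingInfo()
    CMSampleBufferGetSampleTimingInfo(sampleBuffer, at: 0, timingInfoOut: &timing)

    var newSampleBuffer: CMSampleBuffer?
    if let formatDescription = formatDescription {
        CMSampleBufferCreateReadyWithImageBuffer(allocator: nil,
                                                 imageBuffer: copiedPixelBuffer,
                                                 formatDescription: formatDescription,
                                                 sampleTiming: &timing,
                                                 sampleBufferOut: &newSampleBuffer)
    }
    return newSampleBuffer
}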

Solution

Alright, I think I finally got it. I created a helper extension to make a full copy of a CVPixelBuffer:

import CoreVideo
import Foundation

extension CVPixelBuffer {
    /// Returns a deep copy of the receiver by allocating a new pixel buffer
    /// with the same dimensions and pixel format and memcpy-ing each plane.
    func copy() -> CVPixelBuffer {
        precondition(CFGetTypeID(self) == CVPixelBufferGetTypeID(), "copy() cannot be called on a non-CVPixelBuffer")

        // Allocate a new buffer with the same geometry, pixel format and
        // propagated attachments as the source.
        var _copy : CVPixelBuffer?
        CVPixelBufferCreate(
            nil,
            CVPixelBufferGetWidth(self),
            CVPixelBufferGetHeight(self),
            CVPixelBufferGetPixelFormatType(self),
            CVBufferGetAttachments(self, kCVAttachmentMode_ShouldPropagate)?.takeUnretainedValue(),
            &_copy)

        guard let copy = _copy else { fatalError() }

        // Lock both buffers before touching their base addresses.
        CVPixelBufferLockBaseAddress(self, kCVPixelBufferLock_ReadOnly)
        CVPixelBufferLockBaseAddress(copy, 0)

        // Copy plane by plane. Camera frames are typically bi-planar 4:2:0,
        // so this loop assumes a planar pixel buffer.
        for plane in 0..<CVPixelBufferGetPlaneCount(self) {
            let dest = CVPixelBufferGetBaseAddressOfPlane(copy, plane)
            let source = CVPixelBufferGetBaseAddressOfPlane(self, plane)
            let height = CVPixelBufferGetHeightOfPlane(self, plane)
            let bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(self, plane)

            memcpy(dest, source, height * bytesPerRow)
        }

        CVPixelBufferUnlockBaseAddress(copy, 0)
        CVPixelBufferUnlockBaseAddress(self, kCVPixelBufferLock_ReadOnly)

        return copy
    }
}

Now you can use this in your didOutputSampleBuffer method:

guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

let copy = pixelBuffer.copy()

toProcess.append(copy)

But be aware, one such pixelBuffer takes up about 3 MB of memory for a 1080p frame (roughly 1920 × 1080 × 1.5 bytes for bi-planar 4:2:0), which means that after 100 frames you are already holding about 300 MB, which is about the point at which the iPhone says STAHP (and crashes).

Note that you don't actually want to copy the CMSampleBuffer itself, since it really only contains a CVPixelBuffer (because it's an image).
