Screen Capture including AVCaptureVideoPreviewLayer with overlay Buttons


Question


I am using a screen recorder to capture the screen. It works perfectly when a view fills the iPhone screen. But when the AVCaptureVideoPreviewLayer is displayed with overlay buttons, the saved screen-capture video shows the overlay buttons without the AVCaptureVideoPreviewLayer. I have used this tutorial for adding overlays. How to fix this?

Answer

AVCaptureVideoPreviewLayer is composited by the GPU, so CALayer's renderInContext: does not see its contents. The workaround is to take each camera frame from the capture output, convert it to a UIImage, and display it in a regular UIImageView sitting under the overlay buttons; renderInContext: can then capture both.

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    @autoreleasepool {

        if ([connection isVideoOrientationSupported])
            [connection setVideoOrientation:[self cameraOrientation]];

        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

        /* Lock the image buffer */
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        /* Get information about the image */
        uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);

        /* Create a CGImageRef from the CVImageBufferRef (expects 32BGRA frames) */
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGImageRef newImage = CGBitmapContextCreateImage(newContext);

        /* Release the context and color space, and unlock the buffer */
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
        CGContextRelease(newContext);
        CGColorSpaceRelease(colorSpace);

        /* image1 is an ivar; the image view it feeds is what renderInContext: captures */
        image1 = [UIImage imageWithCGImage:newImage];

        /* Release the CGImageRef */
        CGImageRelease(newImage);

        /* dispatch_async avoids a deadlock if the main thread ever waits on the capture queue */
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.imageView setImage:image1];
        });
    }
}
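For context, a delegate like the one above is typically fed by an AVCaptureVideoDataOutput configured for BGRA frames, so that the bytes match the bitmap context created in the callback. A minimal sketch, assuming an existing `captureSession` ivar and `self` as the delegate (both names are assumptions, not from the original answer):

```objectivec
// Hypothetical setup: wire an AVCaptureVideoDataOutput to the delegate above.
// kCVPixelFormatType_32BGRA matches the CGBitmapContextCreate flags in the callback.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

// Deliver frames on a private serial queue, never the main queue.
dispatch_queue_t videoQueue = dispatch_queue_create("videoQueue", DISPATCH_QUEUE_SERIAL);
[videoOutput setSampleBufferDelegate:self queue:videoQueue];

if ([captureSession canAddOutput:videoOutput])
    [captureSession addOutput:videoOutput];
```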

Run writeSample with an NSTimer:

-(void)writeSample:(NSTimer *)_timer {

    if (assetWriterInput.readyForMoreMediaData) {
        @autoreleasepool {
            CVReturn cvErr = kCVReturnSuccess;

            // Grab a screenshot of the container view (scale 1.0, so the bitmap
            // dimensions match the frame size passed to CVPixelBufferCreateWithBytes).
            UIGraphicsBeginImageContext(baseViewOne.frame.size);
            [[baseViewOne layer] renderInContext:UIGraphicsGetCurrentContext()];
            screenshota = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();

            CGImageRef image = [screenshota CGImage];

            // Wrap the bitmap bytes in a pixel buffer (no copy is made).
            CVPixelBufferRef pixelBuffer = NULL;
            CFDataRef imageData = CGDataProviderCopyData(CGImageGetDataProvider(image));
            cvErr = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                                 baseViewOne.frame.size.width, baseViewOne.frame.size.height,
                                                 kCVPixelFormatType_32BGRA,
                                                 (void *)CFDataGetBytePtr(imageData),
                                                 CGImageGetBytesPerRow(image),
                                                 NULL, NULL, NULL,
                                                 &pixelBuffer);
            if (cvErr != kCVReturnSuccess) {
                CFRelease(imageData);
                return;
            }

            // Compute the presentation time from elapsed wall-clock time.
            CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
            elapsedTime = thisFrameWallClockTime - (firstFrameWallClockTime + pausedFrameTime);
            CMTime presentationTime = CMTimeMake(elapsedTime * TIME_SCALE, TIME_SCALE);

            BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime];

            // Release in both branches; the original leaked these when the append failed.
            CVPixelBufferRelease(pixelBuffer);
            CFRelease(imageData);

            if (!appended) {
                [self stopRecording];
            }
        }
    }
}

