Record overlay at the same time AVFoundation iOS


Question

I have now fully set up the ability to record video using the AVFoundation framework, and this all works fine, but now I am looking to add an overlay during recording (one that is also visible on the AVCaptureVideoPreviewLayer layer).

I can add this overlay UIView object on top of the VideoPreviewLayer, but I am struggling with how to get the same view onto the recorded video. This UIView could contain anything from UILabels to UIImageViews.
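For context, a minimal sketch of the kind of setup described above, assuming an existing AVCaptureSession named session and a container view named previewView (these names are placeholders, not from the original post):

#import <AVFoundation/AVFoundation.h>

// Show the camera feed in previewView.
AVCaptureVideoPreviewLayer *previewLayer =
    [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.frame = previewView.bounds;
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[previewView.layer addSublayer:previewLayer];

// The overlay UIView sits on top of the preview layer. It is only composited
// on screen; a plain AVCaptureMovieFileOutput will not record it, which is
// exactly the problem described in the question.
UIView *overlayView = [[UIView alloc] initWithFrame:previewView.bounds];
[previewView addSubview:overlayView];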

Answer

I am not sure if this is exactly what you are looking for, but I think you can use Brad Larson's GPUImage library. It has a class called GPUImageUIElement which lets you add overlays and views. Please check out the examples, especially the one called Filter Showcase, and scroll to the filter called UIElement.

Here is some sample code:

else if (filterType == GPUIMAGE_UIELEMENT)
{
    GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
    blendFilter.mix = 1.0;

    NSDate *startTime = [NSDate date];

    // The UIView to be overlaid, here a simple UILabel showing elapsed time.
    UILabel *timeLabel = [[UILabel alloc] initWithFrame:CGRectMake(0.0, 0.0, 240.0f, 320.0f)];
    timeLabel.font = [UIFont systemFontOfSize:17.0f];
    timeLabel.text = @"Time: 0.0 s";
    timeLabel.textAlignment = NSTextAlignmentCenter; // UITextAlignmentCenter is deprecated
    timeLabel.backgroundColor = [UIColor clearColor];
    timeLabel.textColor = [UIColor whiteColor];

    // Wrap the view so GPUImage can render it as an image source.
    uiElementInput = [[GPUImageUIElement alloc] initWithView:timeLabel];

    // Blend the camera filter's output with the rendered UI element.
    [filter addTarget:blendFilter];
    [uiElementInput addTarget:blendFilter];

    // Display the blended result on screen.
    [blendFilter addTarget:filterView];

    __unsafe_unretained GPUImageUIElement *weakUIElementInput = uiElementInput;

    // Update the label and re-render the UI element once per processed frame.
    [filter setFrameProcessingCompletionBlock:^(GPUImageOutput *filter, CMTime frameTime){
        timeLabel.text = [NSString stringWithFormat:@"Time: %f s", -[startTime timeIntervalSinceNow]];
        [weakUIElementInput update];
    }];
}
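The snippet above only sends the blended output to a filterView for display. To also get the overlay into the recorded file, the blended output can additionally be fed to a GPUImageMovieWriter. A minimal sketch, assuming a GPUImageVideoCamera named videoCamera is driving filter, with outputURL as a placeholder path:

#import "GPUImage.h"

// Placeholder output location; any writable file URL works.
NSURL *outputURL = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"overlay.m4v"]];
[[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil];

GPUImageMovieWriter *movieWriter =
    [[GPUImageMovieWriter alloc] initWithMovieURL:outputURL
                                             size:CGSizeMake(480.0, 640.0)];

// Record the camera's audio alongside the blended video frames.
videoCamera.audioEncodingTarget = movieWriter;

// Targeting the writer at blendFilter (not the camera) bakes the
// UIElement overlay into the recorded file.
[blendFilter addTarget:movieWriter];

[movieWriter startRecording];

// ... later, when recording should stop:
[blendFilter removeTarget:movieWriter];
videoCamera.audioEncodingTarget = nil;
[movieWriter finishRecording];

The key point is that the writer records whatever node it is attached to, so attaching it after the blend filter captures both the camera frames and the overlay.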
