Taking a screenshot of a WKWebView with hardware-accelerated content


Question

I am having serious trouble taking a screenshot of WKWebView content when there is hardware-accelerated content present (certain casino games running inside an iframe). So far I have used the standard screenshot approach that everyone suggests:

UIGraphicsBeginImageContextWithOptions(containerView.frame.size, true, 0.0)

containerView.layer.render(in: UIGraphicsGetCurrentContext()!)

//This line helps to fix view rendering for taking screenshot on older iOS devices
containerView.drawHierarchy(in: containerView.bounds, afterScreenUpdates: true)

let image = UIGraphicsGetImageFromCurrentImageContext()!

UIGraphicsEndImageContext()
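On iOS 10 and later, the same `drawHierarchy(in:afterScreenUpdates:)` approach is usually wrapped in `UIGraphicsImageRenderer`, which manages the bitmap context and returns a non-optional image. A minimal sketch (note it suffers from the same black-content problem with GPU-composited layers described below):

```swift
import UIKit

// Sketch: UIGraphicsImageRenderer wrapper around drawHierarchy (iOS 10+).
// `containerView` is assumed to be the view that hosts the WKWebView.
func snapshot(of containerView: UIView) -> UIImage {
    let renderer = UIGraphicsImageRenderer(bounds: containerView.bounds)
    return renderer.image { _ in
        // Draws the on-screen view hierarchy into the renderer's context.
        containerView.drawHierarchy(in: containerView.bounds,
                                    afterScreenUpdates: true)
    }
}
```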

This method works really nicely until the WKWebView contains content rendered by the GPU: that content appears black in the screenshot. I tried every trick possible with this method, but nothing helped; even the Xcode view hierarchy debugger can't show the hardware-accelerated content. So, as on Android, I need another way to take a screenshot. I already solved a similar problem on Android by starting a screen recording and stopping it as soon as I had captured the first frame.
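For reference, the Android-style workaround just described (record the screen, keep the first frame) has a rough iOS counterpart in ReplayKit's `RPScreenRecorder.startCapture` (iOS 11+). This is only a sketch under that assumption; `RPScreenRecorder` captures the entire screen after a user permission prompt, not just the web view:

```swift
import ReplayKit
import CoreImage
import CoreMedia
import UIKit

// Sketch: start a screen capture, convert the first video frame to a
// UIImage, then stop the capture, mirroring the Android approach above.
func grabFirstFrame(completion: @escaping (UIImage?) -> Void) {
    let recorder = RPScreenRecorder.shared()
    var delivered = false
    recorder.startCapture(handler: { sampleBuffer, bufferType, error in
        guard bufferType == .video, !delivered, error == nil,
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
        else { return }
        delivered = true
        // Convert the captured CVPixelBuffer into a CGImage via Core Image.
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        let cgImage = CIContext().createCGImage(ciImage, from: ciImage.extent)
        recorder.stopCapture { _ in }
        DispatchQueue.main.async {
            completion(cgImage.map { UIImage(cgImage: $0) })
        }
    }, completionHandler: nil)
}
```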

I have gone through numerous Stack Overflow questions and solutions, but they are mostly in Obj-C (which I totally suck at), outdated, or just not specific enough for my needs.

What I have found out is that I can use glReadPixels to read pixels straight out of OpenGL (if my content is hardware accelerated, it would make sense that I can read those pixels back from the graphics card, right?).

So far I have managed to create a Swift snippet that does something like renderBuffer -> frameBuffer -> glReadPixels -> image:

let width = Int(containerView.frame.size.width)
let height = Int(containerView.frame.size.height)

//BeginImageContext code was run above

let api = EAGLRenderingAPI.openGLES3
let context2 = EAGLContext(api: api)
EAGLContext.setCurrent(context2)

// Set up the render buffer (generate exactly one)
var renderBuffer = GLuint()
glGenRenderbuffers(1, &renderBuffer)
glBindRenderbuffer(GLenum(GL_RENDERBUFFER), renderBuffer)
glRenderbufferStorage(GLenum(GL_RENDERBUFFER), GLenum(GL_RGBA8),
                      GLsizei(width), GLsizei(height))

// Set up the frame buffer and attach the render buffer as its color attachment
var frameBuffer = GLuint()
glGenFramebuffers(1, &frameBuffer)
glBindFramebuffer(GLenum(GL_FRAMEBUFFER), frameBuffer)
glFramebufferRenderbuffer(GLenum(GL_FRAMEBUFFER), GLenum(GL_COLOR_ATTACHMENT0),
                          GLenum(GL_RENDERBUFFER), renderBuffer)

// Clear to a test color, then read the pixels back
// (glReadBuffer takes an attachment, not GL_RENDERBUFFER)
glReadBuffer(GLenum(GL_COLOR_ATTACHMENT0))
glClearColor(0.1, 0.2, 0.3, 0.2)
glClear(GLbitfield(GL_COLOR_BUFFER_BIT))

let bytes = malloc(width * height * 4)
glReadPixels(0, 0, GLsizei(width), GLsizei(height),
             GLenum(GL_RGBA), GLenum(GL_UNSIGNED_BYTE), bytes)

let data = NSData(bytes: bytes, length: width * height * 4)
let dataProvider = CGDataProvider(data: data)
let colorspace = CGColorSpaceCreateDeviceRGB()
let bitmapInfo: CGBitmapInfo = [.byteOrder32Little, CGBitmapInfo(rawValue: CGImageAlphaInfo.last.rawValue)]
let cgImage = CGImage(
    width: width,
    height: height,
    bitsPerComponent: 8,
    bitsPerPixel: 32,
    bytesPerRow: 4 * width,
    space: colorspace,
    bitmapInfo: bitmapInfo,
    provider: dataProvider!,
    decode: nil,
    shouldInterpolate: true,
    intent: .defaultIntent
    )!
let image = UIImage(cgImage: cgImage)
// I get an image of the solid color defined by glClearColor

Now my question is: am I going the right way? Is it possible to get my WKWebView (or a snapshot of it rendered into a UIView) into a renderBuffer / frameBuffer somehow, so that glReadPixels would ACTUALLY READ what is on the screen?

PS! I have seen numerous questions about getting a UIImage out of a CAEAGLView, but in my case it is a WKWebView, not a CAEAGLView. Much appreciated.

Answer

I have an easy way to take the snapshot; here is the code.

- (void)takeSnapshotWithHandler:(void (^)(UIImage *fullImage))snapshotHandler {
    WKWebView *webView = self.webView;
    UIScrollView *scrollView = webView.scrollView;
    CGFloat boundsWidth = scrollView.bounds.size.width;
    CGFloat contentHeight = scrollView.contentSize.height;
    CGFloat scale = [UIScreen mainScreen].scale;
    CGRect oldFrame = webView.frame;

    // Resize the web view to its full content height, give WebKit a moment
    // to re-render, then snapshot and restore the original frame.
    webView.frame = CGRectMake(0, 0, boundsWidth, contentHeight);
    if (@available(iOS 11.0, *)) {
        dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.3 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
            // WKSnapshotConfiguration lets WebKit composite the page itself,
            // so hardware-accelerated content is included in the snapshot.
            WKSnapshotConfiguration *configuration = [WKSnapshotConfiguration new];
            configuration.rect = CGRectMake(0, 0, boundsWidth, contentHeight);
            configuration.snapshotWidth = @(boundsWidth);
            [webView takeSnapshotWithConfiguration:configuration completionHandler:^(UIImage * _Nullable snapshotImage, NSError * _Nullable error) {
                UIGraphicsBeginImageContextWithOptions(CGSizeMake(boundsWidth, contentHeight), NO, scale);
                [snapshotImage drawInRect:CGRectMake(0, 0, boundsWidth, contentHeight)];
                UIImage *fullImage = UIGraphicsGetImageFromCurrentImageContext();
                UIGraphicsEndImageContext();
                webView.frame = oldFrame;
                if (snapshotHandler) { snapshotHandler(fullImage); }
            }];
        });
    } else {
        dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.3 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
            // Fallback for iOS 10 and earlier; drawViewHierarchyInRect cannot
            // reliably capture hardware-accelerated content.
            UIGraphicsBeginImageContextWithOptions(CGSizeMake(boundsWidth, contentHeight), NO, scale);
            [webView drawViewHierarchyInRect:CGRectMake(0, 0, boundsWidth, contentHeight) afterScreenUpdates:YES];
            UIImage *fullImage = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            webView.frame = oldFrame;
            if (snapshotHandler) { snapshotHandler(fullImage); }
        });
    }
}
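Since the question mentions being uncomfortable with Obj-C, here is a rough Swift translation of the iOS 11+ branch above. The key point is that `WKSnapshotConfiguration` asks WebKit itself to composite the page, so hardware-accelerated content comes out intact. `webView` is assumed to be an already-loaded WKWebView:

```swift
import WebKit

// Sketch of the answer's approach in Swift (iOS 11+): resize to the full
// content height, let WebKit re-lay out, snapshot, then restore the frame.
func takeFullSnapshot(of webView: WKWebView,
                      completion: @escaping (UIImage?) -> Void) {
    let contentSize = webView.scrollView.contentSize
    let oldFrame = webView.frame
    webView.frame = CGRect(origin: .zero, size: contentSize)

    // Give WebKit a moment to re-render at the new size, as the answer does.
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.3) {
        let configuration = WKSnapshotConfiguration()
        configuration.rect = CGRect(origin: .zero, size: contentSize)
        webView.takeSnapshot(with: configuration) { image, _ in
            webView.frame = oldFrame
            completion(image)
        }
    }
}
```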
