Copy framebuffer data from one WebGLRenderingContext to another?

Problem Description

If the following does not make much sense, please see the Background section below; I have left out most of the context to keep the question as clear as possible.

I have two WebGLRenderingContexts with the following traits:

  • WebGLRenderingContext: InputGL (Allows read and write operations on its framebuffers.)
  • WebGLRenderingContext: OutputGL (Allows only write operations on its framebuffers.)

GOAL: Superimpose InputGL's renders onto OutputGL's renders periodically within 33ms (30fps) on mobile.

Both InputGL's and OutputGL's framebuffers get drawn to from separate processes. Both are available (with complete framebuffers) within one single window.requestAnimationFrame callback. As InputGL requires read operations, and OutputGL only supports write operations, InputGL and OutputGL cannot be merged into one WebGLRenderingContext.

Therefore, I would like to copy the framebuffer content from InputGL to OutputGL in every window.requestAnimationFrame callback. This allows me to keep read/write support on InputGL and only use write on OutputGL. Neither of them has a (regular) canvas attached, so a canvas overlay is out of the question. I have the following code:

// customOutputGLFramebuffer is the WebXR API's extended framebuffer which does not allow read operations

let fbo = InputGL.createFramebuffer();
InputGL.bindFramebuffer(InputGL.FRAMEBUFFER, fbo);

// TODO: Somehow get fbo data into OutputGL (I guess?)

OutputGL.bindFramebuffer(OutputGL.FRAMEBUFFER, customOutputGLFramebuffer);

// Drawing to OutputGL here works, and it gets drawn on top of the customOutputGLFramebuffer

I am not sure whether this requires binding in some particular order, or some kind of texture manipulation; any help with this would be greatly appreciated.

Background: I am experimenting with Unity WebGL in combination with the unreleased WebXR API. WebXR uses its own, modified WebGLRenderingContext which disallows reading from its buffers (as a privacy concern). However, Unity WebGL requires reading from its buffers. Having both operate on the same WebGLRenderingContext gives errors on Unity's read operations, which means they need to be kept separate. The idea is to periodically superimpose Unity's framebuffer data onto WebXR's framebuffers.

WebGL2 is also supported, if that is needed.

Answer

You cannot share resources across contexts, period.

The best you can do is use one context, via some method, as a source for the other via texImage2D.

For example, if the source context is using a canvas, then draw the framebuffer to that canvas and then

destContext.texImage2D(......., srcContext.canvas);
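
The question notes that neither context has a regular canvas attached, but if InputGL does resolve its rendering to a canvas (regular or offscreen), a minimal per-frame sketch of this approach could look like the following. copyTexture and drawOverlayQuad are hypothetical names for your own texture handle and composited quad draw, not part of any API:

// One-time setup in OutputGL: a texture that will receive InputGL's frames.
const copyTexture = OutputGL.createTexture();
OutputGL.bindTexture(OutputGL.TEXTURE_2D, copyTexture);
OutputGL.texParameteri(OutputGL.TEXTURE_2D, OutputGL.TEXTURE_MIN_FILTER, OutputGL.LINEAR);
OutputGL.texParameteri(OutputGL.TEXTURE_2D, OutputGL.TEXTURE_WRAP_S, OutputGL.CLAMP_TO_EDGE);
OutputGL.texParameteri(OutputGL.TEXTURE_2D, OutputGL.TEXTURE_WRAP_T, OutputGL.CLAMP_TO_EDGE);

function onFrame() {
  // 1. InputGL renders to its default framebuffer (null), so the result ends
  //    up on InputGL.canvas and can be used as a texImage2D source.
  // ... InputGL draw calls ...

  // 2. Copy InputGL.canvas into OutputGL as a texture. The two contexts share
  //    nothing, so this upload has to be repeated every frame.
  OutputGL.bindTexture(OutputGL.TEXTURE_2D, copyTexture);
  OutputGL.texImage2D(OutputGL.TEXTURE_2D, 0, OutputGL.RGBA,
                      OutputGL.RGBA, OutputGL.UNSIGNED_BYTE, InputGL.canvas);

  // 3. Composite the texture on top of OutputGL's framebuffer.
  OutputGL.bindFramebuffer(OutputGL.FRAMEBUFFER, customOutputGLFramebuffer);
  drawOverlayQuad(OutputGL, copyTexture); // hypothetical blended fullscreen-quad draw

  window.requestAnimationFrame(onFrame);
}
window.requestAnimationFrame(onFrame);

Whether the canvas-to-texture upload stays on the GPU is up to the browser, so it is worth measuring against the 33 ms budget on the target mobile device.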

If it is a context on an OffscreenCanvas, use transferToImageBitmap and then pass the resulting ImageBitmap to texImage2D.
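
For that path, a minimal sketch (assuming InputGL was created from an OffscreenCanvas, and reusing the hypothetical copyTexture from the sketch above) might look like:

// transferToImageBitmap() hands over InputGL's current frame as an
// ImageBitmap without reading pixels back into JavaScript.
const bitmap = InputGL.canvas.transferToImageBitmap();

// An ImageBitmap is a valid texImage2D source, so upload it into OutputGL.
OutputGL.bindTexture(OutputGL.TEXTURE_2D, copyTexture);
OutputGL.texImage2D(OutputGL.TEXTURE_2D, 0, OutputGL.RGBA,
                    OutputGL.RGBA, OutputGL.UNSIGNED_BYTE, bitmap);

// Release the bitmap once it has been uploaded.
bitmap.close();

Note that transferToImageBitmap moves the rendered image out of the OffscreenCanvas, so it is called once per rendered frame, right after InputGL finishes drawing.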
