Is it possible to capture a High-Res image while using ArCore?

Question

In my app I'm trying to use ArCore as sort of a "camera assistant" in a custom camera view.

To be clear - I want to display images for the user in his camera and have him capture images that don't contain the AR models.

From what I understand, in order to capture an image with ArCore I'll have to use the Camera2 API which is enabled by configuring the session to use the "shared Camera".
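
For context, a minimal sketch (Kotlin) of how such a shared-camera session is typically created; the function name and the return shape are illustrative, only the ARCore calls themselves come from the documented API:

```kotlin
import android.content.Context
import com.google.ar.core.Session
import java.util.EnumSet

// Create an ARCore session with the shared-camera feature enabled; this is the
// prerequisite for running your own Camera2 capture alongside ARCore.
// Returns the session plus the camera id you would pass to CameraManager.openCamera().
fun createSharedCameraSession(context: Context): Pair<Session, String> {
    val session = Session(context, EnumSet.of(Session.Feature.SHARED_CAMERA))

    // ARCore's handle for camera sharing: when opening the camera yourself,
    // wrap your CameraDevice.StateCallback with
    // session.sharedCamera.createARDeviceStateCallback(callback, handler)
    // so ARCore is notified of camera state changes.
    val cameraId = session.cameraConfig.cameraId

    return session to cameraId
}
```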

However, I can't seem to configure the camera to use any high-end resolutions (I'm using a Pixel 3, so I should be able to go as high as 12MP).

In the "shared camera example", they toggle between Camera2 and ArCore (a shame there's no API for CameraX) and it has several problems:

  1. In ArCore mode, the image is blurry (I believe that's because the depth sensor is disabled, as stated in their documentation)
  2. In Camera2 mode, I can't increase the resolution at all.
  3. I can't use the Camera2 API to capture an image while displaying the models from ArCore.

Is this requirement at all possible at the moment?

Answer

I have not worked with the shared camera in ARCore yet, but I can say a few things regarding the main point of your question.

In ARCore you can configure both CPU image size and GPU image size. You can do that by checking all available camera configurations (available through Session.getSupportedCameraConfigs(CameraConfigFilter cameraConfigFilter)) and selecting your preferred one by passing it back to the ARCore Session. On each CameraConfig you can check which CPU image size and GPU texture size you will get.
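
A minimal sketch (Kotlin) of that inspection step, assuming an already created Session; the function name and the "largest CPU image" selection policy are illustrative, not part of the original answer:

```kotlin
import android.util.Log
import com.google.ar.core.CameraConfig
import com.google.ar.core.CameraConfigFilter
import com.google.ar.core.Session

// Log every supported camera config and select one of them.
fun inspectAndSelectCameraConfig(session: Session) {
    // An unrestricted filter returns all configs supported for the current camera.
    val configs: List<CameraConfig> = session.getSupportedCameraConfigs(CameraConfigFilter(session))

    for (config in configs) {
        // imageSize:   CPU image you get from Frame.acquireCameraImage()
        // textureSize: GPU texture used to render the camera background
        Log.d("CameraConfig", "CPU ${config.imageSize} / GPU ${config.textureSize}")
    }

    // Illustrative policy: pick the config with the largest CPU image.
    val preferred = configs.maxByOrNull { it.imageSize.width * it.imageSize.height } ?: return

    // The config can only be changed while the session is paused and takes
    // effect when the session is resumed.
    session.pause()
    session.cameraConfig = preferred
    session.resume()
}
```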

Probably you are currently using (maybe by default?) a CameraConfig with the lowest CPU image size, 640x480 pixels if I remember correctly, so yes, it definitely looks blurry when rendered (but that has nothing to do with the depth sensor).

Sounds like you could just select a higher CPU image size and you're good to go... but unfortunately that's not the case, because that configuration applies to every frame. Getting higher-resolution CPU images results in much lower performance. When I tested this I got about 3-4 frames per second on my test device, definitely not ideal.

So now what? I think you have 2 options:

  1. Pause the ARCore session, switch to a higher CPU image size for one frame, get the image, and switch back to the "normal" configuration (a sketch of this flow follows the list).
  2. Probably you are already getting a nice GPU image, maybe not the best due to the camera preview, but hopefully good enough? Not sure how you are rendering it, but with some OpenGL skills you can copy that texture. Not directly, of course, because of the whole GL_TEXTURE_EXTERNAL_OES thing... but rendering it onto another framebuffer and then reading the texture attached to it could work (also sketched below). Of course you might need to deal with the texture coordinates yourself (full image vs. visible area), but that's another topic.
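
A sketch of option 1 (Kotlin), assuming you have already looked up a high-resolution and a "normal" CameraConfig as shown earlier; the function names are illustrative and the actual image handling is left to you:

```kotlin
import android.media.Image
import com.google.ar.core.CameraConfig
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException

// Swap the active config: call with the high-res config right before the shot
// and with the normal config again once the image has been grabbed.
fun applyConfig(session: Session, config: CameraConfig) {
    session.pause()                // setCameraConfig() requires a paused session
    session.cameraConfig = config
    session.resume()               // the new config takes effect on resume
}

// Called from the regular update loop (after session.update()) once a frame
// with the high-res config has arrived. Remember to close() the Image when done.
fun tryGrabCpuImage(frame: Frame): Image? =
    try {
        frame.acquireCameraImage()          // YUV_420_888 CPU image
    } catch (e: NotYetAvailableException) {
        null                                // not ready yet, try the next frame
    }
```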
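
And a rough sketch of option 2, assuming you already have a routine that draws the GL_TEXTURE_EXTERNAL_OES camera background (passed in here as drawCameraQuad, a placeholder): render it into an offscreen framebuffer, then read the pixels back.

```kotlin
import android.graphics.Bitmap
import android.opengl.GLES20
import java.nio.ByteBuffer
import java.nio.ByteOrder

// Draw the camera background into an offscreen framebuffer and read it back.
// Must run on the GL thread; `drawCameraQuad` stands in for your existing renderer.
fun captureGpuImage(width: Int, height: Int, drawCameraQuad: () -> Unit): Bitmap {
    // Color attachment: a regular 2D texture (not external-OES).
    val tex = IntArray(1)
    GLES20.glGenTextures(1, tex, 0)
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[0])
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null)
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR)
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR)

    // Framebuffer with the texture attached.
    val fbo = IntArray(1)
    GLES20.glGenFramebuffers(1, fbo, 0)
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[0])
    GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
        GLES20.GL_TEXTURE_2D, tex[0], 0)
    check(GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER) == GLES20.GL_FRAMEBUFFER_COMPLETE)

    // Render the external-OES camera texture into the offscreen buffer.
    GLES20.glViewport(0, 0, width, height)
    drawCameraQuad()

    // Read the pixels back (RGBA) and wrap them in a Bitmap.
    val buffer = ByteBuffer.allocateDirect(width * height * 4).order(ByteOrder.nativeOrder())
    GLES20.glReadPixels(0, 0, width, height, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buffer)
    val bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888)
    buffer.rewind()
    bitmap.copyPixelsFromBuffer(buffer)

    // Restore the default framebuffer and clean up.
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0)
    GLES20.glDeleteFramebuffers(1, fbo, 0)
    GLES20.glDeleteTextures(1, tex, 0)
    return bitmap  // GL's origin is bottom-left, so the result may need a vertical flip.
}
```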

Regarding CameraX, note that it wraps the Camera2 API in order to provide some camera use cases so that app developers don't have to worry about the camera lifecycle. As I understand it, it would not be suitable for ARCore to use CameraX, as I imagine they need full control of the camera.

I hope that helps!
