Accessing Color Frames with Unity and Tango 'Leibniz'


Question


I'm just starting to tinker with Tango and Unity. Unfortunately there doesn't seem to be any documentation or examples on how to access color data in Unity with the latest release.

I've been using the motion tracking example from GitHub (https://github.com/googlesamples/tango-examples-unity) as a starting point, trying to read incoming color frames the same way pose and depth data are read. I'm assuming the best way is to go through the "ITangoVideoOverlay" interface and "OnTangoImageAvailableEventHandler" callback.

All I am trying to do right now is to get the "OnTangoImageAvailableEventHandler" callback working and I can't quite figure it out. I have "Enable Video Overlay" checked on the Tango Manager and the following script connected to a GUI Text object for debugging.

using System.Collections;
using UnityEngine;
using UnityEngine.UI;
using Tango;
using System;

public class VideoController : MonoBehaviour, ITangoVideoOverlay
{

    private TangoApplication m_tangoApplication;

    private bool debugB = false;
    public Text debugText;

    // Use this for initialization
    void Start () {
        m_tangoApplication = FindObjectOfType<TangoApplication>();
        m_tangoApplication.Register(this);
    }

    // Update is called once per frame
    void Update () {
        if (debugB)
            debugText.text = "TRUE";
        else
            debugText.text = "FALSE";
    }

    // No Unity API calls here (this callback is not guaranteed to run on the Unity main thread)
    public void OnTangoImageAvailableEventHandler(Tango.TangoEnums.TangoCameraId id, Tango.TangoUnityImageData image)
    {
        debugB = true;
    }
}

Is there some initialization of the camera that I am missing? Or is the preferred method still to use the VideoOverlayListener, as in this older code: Getting color data in Unity?

I know it's also possible to directly access the camera through Unity (disabling depth). But I would like to learn the "proper way" first.
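
For comparison, the "direct Unity" route mentioned above is roughly the sketch below, using Unity's standard WebCamTexture on an object that has a Renderer. This is only an illustration of the alternative being set aside here; it bypasses the Tango color pipeline entirely and is not the Tango-specific approach the question is after.

using UnityEngine;

// Rough sketch of the plain-Unity alternative: stream the device camera
// straight into this object's material with WebCamTexture (no Tango involvement).
public class DirectCameraView : MonoBehaviour
{
    void Start()
    {
        WebCamTexture camTexture = new WebCamTexture();
        GetComponent<Renderer>().material.mainTexture = camTexture;
        camTexture.Play();
    }
}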

Thank you for your time!

Update 04/28/15 - Latest version of the script; the callback works! It still needs a conversion to RGB color.

This script was written as an addition to Google's Tango Motion Tracking example on GitHub. Attach the script to a Unity camera and then link the public field "m_viewScreen" to a mesh object (like a plane) for the video texture to display on.

using System.Collections;
using UnityEngine;
using Tango;
using System;

public class VideoController : MonoBehaviour
{
    private TangoApplication m_tangoApplication;
    private Texture2D m_texture;
    private Material m_screenMaterial;
    private MeshFilter m_meshFilter;
    private bool m_readyToDraw = false;

    // Link to a mesh object for displaying texture
    public GameObject m_viewScreen;

    // Use this for initialization
    void Start ()
    {
        // Tango initialization
        m_tangoApplication = FindObjectOfType<TangoApplication>();
        m_tangoApplication.Register(this);
        m_tangoApplication.RegisterPermissionsCallback(_OnTangoApplicationPermissionsEvent);

        // Initialize view object Material
        m_meshFilter = m_viewScreen.GetComponent<MeshFilter> ();
        m_screenMaterial = new Material(Shader.Find("Mobile/Unlit (Supports Lightmap)"));

        // Get the video overlay texture for the color camera
        m_texture = m_tangoApplication.GetVideoOverlayTexture();
        m_texture.Apply();

        if (m_screenMaterial != null)
        {
            // Apply the texture
            m_screenMaterial.mainTexture = m_texture;
            m_meshFilter.GetComponent<MeshRenderer>().material = m_screenMaterial;

            // Connect the texture to the camera
            if (m_tangoApplication.m_useExperimentalVideoOverlay)
            {
                VideoOverlayProvider.ExperimentalConnectTexture(TangoEnums.TangoCameraId.TANGO_CAMERA_COLOR, m_texture.GetNativeTextureID(), _OnUnityFrameAvailable);
            }
            else
            {
                VideoOverlayProvider.ConnectTexture(TangoEnums.TangoCameraId.TANGO_CAMERA_COLOR, m_texture.GetNativeTextureID());
            }
        }
    }

    private void _OnTangoApplicationPermissionsEvent(bool permissionsGranted)
    {
        m_readyToDraw = true;
    }

    private void _OnUnityFrameAvailable(System.IntPtr callbackContext, Tango.TangoEnums.TangoCameraId cameraId)
    {
        // Do fun stuff here!

    }

    void OnPreRender()
    {
        if (m_readyToDraw)
        {
            VideoOverlayProvider.RenderLatestFrame(TangoEnums.TangoCameraId.TANGO_CAMERA_COLOR);
            GL.InvalidateState();
        }
    }

    void Update ()
    {
        // Do other fun stuff here!
    }
}

Solution

From what I could find, they changed the whole system so that it's easier to access the image: all of the image grabbing is now handled by the Tango object.

In your Start(), after grabbing the TangoApplication, try this:

m_texture = m_tangoApplication.GetVideoOverlayTexture();
m_texture.Apply();

if (m_screenMaterial != null)
{
    // Apply the texture
    m_screenMaterial.mainTexture = m_texture;
    m_meshFilter.GetComponent<MeshRenderer>().material = m_screenMaterial;

    // Connect the texture to the camera
    if (m_tangoApplication.m_useExperimentalVideoOverlay)
    {
        VideoOverlayProvider.ExperimentalConnectTexture(TangoEnums.TangoCameraId.TANGO_CAMERA_COLOR, m_texture.GetNativeTextureID(), _OnUnityFrameAvailable);
    }
    else
    {
        VideoOverlayProvider.ConnectTexture(TangoEnums.TangoCameraId.TANGO_CAMERA_COLOR, m_texture.GetNativeTextureID());
    }

}

You'll also need this callback defined somewhere else:

private void _OnUnityFrameAvailable(System.IntPtr callbackContext, Tango.TangoEnums.TangoCameraId cameraId)
{
    // Do fun stuff here!
}

But it doesn't really need to do anything. Of course, the image is still in YUV-NV12 format, so you'll need to convert it to RGB (or wait until their next release, which should fix it).
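
As a reference for that conversion step, here is a minimal CPU-side sketch for frames delivered through the ITangoVideoOverlay callback (the first script in the question). It assumes TangoUnityImageData exposes width, height, stride and a raw byte[] data field laid out as a Y plane followed by an interleaved UV plane; check the struct in your SDK version. The coefficients are the standard BT.601 ones, and if the colors come out swapped the frame is probably NV21 (V before U) rather than NV12.

// Sketch: convert one NV12 frame into a Color32 array.
// Assumes image.data = Y plane (stride * height bytes) followed by interleaved UV pairs.
private Color32[] ConvertNV12ToRGB(Tango.TangoUnityImageData image)
{
    int width = (int)image.width;
    int height = (int)image.height;
    int stride = (int)image.stride;
    byte[] yuv = image.data;

    Color32[] pixels = new Color32[width * height];
    int uvOffset = stride * height;                   // start of the interleaved UV plane

    for (int row = 0; row < height; ++row)
    {
        for (int col = 0; col < width; ++col)
        {
            int y = yuv[row * stride + col];
            int uvIndex = uvOffset + (row / 2) * stride + (col / 2) * 2;
            int u = yuv[uvIndex] - 128;               // NV12: U first, then V
            int v = yuv[uvIndex + 1] - 128;           // swap u and v here for NV21

            // BT.601 YUV -> RGB
            int r = y + (int)(1.402f * v);
            int g = y - (int)(0.344f * u + 0.714f * v);
            int b = y + (int)(1.772f * u);

            pixels[row * width + col] = new Color32(
                (byte)Mathf.Clamp(r, 0, 255),
                (byte)Mathf.Clamp(g, 0, 255),
                (byte)Mathf.Clamp(b, 0, 255),
                255);
        }
    }
    return pixels;
}

The result can then be pushed into a separate Texture2D with SetPixels32() followed by Apply(). Doing this every frame on the CPU is slow, so treat it as a stopgap until a shader-based conversion (or the SDK fix mentioned above) is available.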

Edit: Oops! Forgot you need one more call to actually update the texture on the AR material:

In Start(), after grabbing the TangoApplication:

m_tangoApplication.RegisterPermissionsCallback(_OnTangoApplicationPermissionsEvent);

Then:

private void _OnTangoApplicationPermissionsEvent(bool permissionsGranted)
{
    m_readyToDraw = true;
}

void OnPreRender()
{
    if (m_readyToDraw)
    {
        VideoOverlayProvider.RenderLatestFrame(TangoEnums.TangoCameraId.TANGO_CAMERA_COLOR);
        GL.InvalidateState();
    }
}

Hope that works now!
