Unity Hand Control - Kinect SDK 2


Problem Description

Hello,


I want to implement hand control like the "Controls Basics" demo in the Kinect SDK v2 sample projects. However, the sample WPF/XAML project relies on functionality that does not appear to have been ported to the Unity plugin provided by Microsoft, for example KinectCoreWindow.SetKinectPersonManualEngagement, BodyHandPairs, etc. Not all namespaces appear to be available; all I can see is Windows.Kinect.


Watching the Channel 9 videos on programming Kinect for Windows v2 (http://channel9.msdn.com/Series/Programming-Kinect-for-Windows-v2), I was under the impression that this functionality was provided. Am I missing something, or is the hand control functionality provided by SetKinectPersonManualEngagement not currently available, so that I would need to implement it myself using lower-level API functionality?
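
For context, the lower-level route I am referring to would mean reading body frames and hand states directly through the Windows.Kinect types that the Unity plugin does expose, roughly along these lines (just a sketch; the MonoBehaviour and its field names are mine, not from the SDK samples):

```csharp
using UnityEngine;
using Windows.Kinect;

// Sketch of driving a hand "cursor" from the raw body stream in Unity.
// "HandCursorDriver" is an illustrative name, not an SDK type.
public class HandCursorDriver : MonoBehaviour
{
    private KinectSensor sensor;
    private BodyFrameReader bodyReader;
    private Body[] bodies;

    void Start()
    {
        sensor = KinectSensor.GetDefault();
        if (sensor != null)
        {
            bodyReader = sensor.BodyFrameSource.OpenReader();
            bodies = new Body[sensor.BodyFrameSource.BodyCount];
            if (!sensor.IsOpen) sensor.Open();
        }
    }

    void Update()
    {
        if (bodyReader == null) return;

        using (var frame = bodyReader.AcquireLatestFrame())
        {
            if (frame == null) return;
            frame.GetAndRefreshBodyData(bodies);
        }

        foreach (var body in bodies)
        {
            if (body == null || !body.IsTracked) continue;

            // Right-hand position in camera space and its open/closed state.
            var hand = body.Joints[JointType.HandRight].Position;
            HandState state = body.HandRightState;

            // Mapping the camera-space position to UI coordinates and treating
            // HandState.Closed as a "press" is the part the Interactions layer
            // would normally do for you.
            Debug.Log(string.Format("Hand at ({0:F2}, {1:F2}), state {2}",
                hand.X, hand.Y, state));
            break; // only the first tracked body for this sketch
        }
    }

    void OnApplicationQuit()
    {
        if (bodyReader != null) { bodyReader.Dispose(); bodyReader = null; }
        if (sensor != null && sensor.IsOpen) { sensor.Close(); }
    }
}
```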


Thanks for your time.

Solution

None of the "Pay for Play" APIs such as Interactions, Face/FaceHD, and Fusion have been ported to Unity yet. There is some work underway to bring the Fusion and Face APIs to Unity, but I don't know the state of that yet.


Interactions is a bit more difficult, since Unity didn't have a standard UI layer; I know they just recently announced one as a beta for v4. The Kinect Interactions framework is designed to be extensible so that it can be enabled for other libraries.


They would be built on top of the types in the Microsoft.Kinect.Toolkit.Input namespace that ship in Microsoft.Kinect.WPF.Controls.dll. This would have to be wrapped for Unity, since Unity itself can't support .NET libraries.


The job of this controls layer is to listen to pointer events (see the answer above), hit test the UI framework's window at the given location, and create a HitTestResult.
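
To make that step concrete, here is a rough sketch of the listen-and-hit-test part on the WPF side. It assumes the KinectCoreWindow pointer event shapes as I recall them (namespace and event-args type included), uses WPF's VisualTreeHelper as a stand-in for whatever hit testing the host UI framework provides, and the routing method at the end is a placeholder:

```csharp
using System.Windows;
using System.Windows.Media;
using Microsoft.Kinect.Input;   // KinectCoreWindow / KinectPointerPoint (namespace as I recall it)

// Sketch of the "listen to pointer events, hit test, hand the result on" step.
public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();

        KinectCoreWindow kinectCoreWindow = KinectCoreWindow.GetForCurrentThread();
        kinectCoreWindow.PointerMoved += OnKinectPointerMoved;
    }

    private void OnKinectPointerMoved(object sender, KinectPointerEventArgs e)
    {
        KinectPointerPoint point = e.CurrentPoint;

        // Kinect pointer positions are normalized (0..1); scale to this window.
        var screenPoint = new Point(
            point.Position.X * this.ActualWidth,
            point.Position.Y * this.ActualHeight);

        // Hit test the visual tree at that location. A real controls layer
        // would translate this into the toolkit's own HitTestResult.
        HitTestResult hit = VisualTreeHelper.HitTest(this, screenPoint);
        if (hit != null)
        {
            // Placeholder for handing the pointer and hit result on to the
            // gesture-recognition layer described below.
            RoutePointerToControl(point, hit.VisualHit);
        }
    }

    private void RoutePointerToControl(KinectPointerPoint point, DependencyObject target)
    {
        // Placeholder: forward to the press/scroll handling of your choice.
    }
}
```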


It should create PressableModels (for pressable things) and ManipulatableModels (for scrollable things), and set the appropriate KinectGestureRecognizerSettings on the KinectGestureRecognizer that those models hold.

It would create an InputPointerManager instance and call HandlePointerAsCursor with the appropriate parameters, including the HitTestResult referenced above.


That function will route the pointerPoint appropriately to the right gesture recognizers, based on both hit testing and captured pressable/manipulatable controls.
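
Until that layer is wrapped for Unity, the pressable/scrollable split has to be approximated from the raw hand data yourself. A minimal sketch of one way to classify a hand into press vs. scroll (the enum, helper class, and thresholds below are my own illustrations, not Microsoft.Kinect.Toolkit.Input types):

```csharp
using Windows.Kinect;

// Illustrative stand-in for the "pressable vs. manipulatable" split described
// above. These are my own helper types, not Microsoft.Kinect.Toolkit.Input ones.
public enum HandAction { None, Press, Scroll }

public static class HandActionClassifier
{
    // Treat a closed hand (grip) as the start of a scroll/manipulation, and a
    // forward push of the hand past the shoulder as a press. The threshold is
    // an arbitrary example value.
    private const float PressDepthMeters = 0.4f;

    public static HandAction Classify(Body body)
    {
        if (body == null || !body.IsTracked) return HandAction.None;

        if (body.HandRightState == HandState.Closed)
            return HandAction.Scroll;

        var hand = body.Joints[JointType.HandRight].Position;
        var shoulder = body.Joints[JointType.ShoulderRight].Position;

        // Kinect camera space: Z grows away from the sensor, so a hand that is
        // much closer to the sensor than the shoulder is being pushed forward.
        if (shoulder.Z - hand.Z > PressDepthMeters)
            return HandAction.Press;

        return HandAction.None;
    }
}
```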


You can review this thread to get more insight, but understand that any framework using this would need to support Microsoft .NET:


http://social.msdn.microsoft.com/Forums/en-US/6b8d6251-c59a-46c7-9da8-b912cb16dfab/kinectregionaddhandpointerhandler?forum=kinectv2sdk




