Finger tracking in Kinect

Problem description

I was exploring development on the Kinect and wanted to be able to recognize fingers rather than just the hand as a whole. The skeletal API in the official Kinect SDK only has a hand joint; there is no provision for finger tracking. I have also read that Microsoft very recently included a grip-recognition API in the new SDK and might include finger tracking in future releases.

My question is: given the current resources, how do I go about doing finger tracking? Are there external libraries for this? And is it feasible to actually implement finger tracking with the Kinect, given that the UX guidelines discourage such gestures?

Thanks.

Recommended answer

"How do I go about doing finger tracking? Are there external libraries for this?"

There are several projects out there that demonstrate the Kinect's ability to perform finger tracking, and some third-party libraries offering a finger-tracking API of some sort also exist.

Here is a very interesting one, with code, that I found with a simple web search:

Whether you want to use the official SDK or one of the other SDKs is a separate concern. Nothing stops you from doing finger tracking on top of the official SDK, but there is nothing built into it that performs such tracking for you.
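As a concrete illustration of what "building it yourself" might look like, here is a minimal sketch of the usual first step: isolating the hand silhouette from the depth stream around the skeletal hand joint. The names (`segment_hand`, `depth_mm`, `hand_px`, `hand_depth_mm`) are placeholders rather than SDK APIs, and the depth array is assumed to come from whatever Kinect wrapper you use.

```python
# Hypothetical sketch: isolate the hand from a Kinect depth frame around the
# hand joint reported by the skeleton API. Nothing here is SDK-provided; the
# inputs are assumed to come from your own Kinect wrapper.
import numpy as np


def segment_hand(depth_mm: np.ndarray,
                 hand_px: tuple,
                 hand_depth_mm: float,
                 window: int = 90,
                 depth_band_mm: float = 120.0) -> np.ndarray:
    """Return an 8-bit mask of depth pixels that plausibly belong to the hand."""
    x, y = hand_px
    h, w = depth_mm.shape

    # Crop a square window around the hand joint's pixel position.
    x0, x1 = max(0, x - window), min(w, x + window)
    y0, y1 = max(0, y - window), min(h, y + window)
    roi = depth_mm[y0:y1, x0:x1]

    # Keep only pixels within a thin depth band around the hand joint; this
    # drops the torso and background and leaves the hand silhouette.
    near = np.abs(roi - hand_depth_mm) < depth_band_mm
    valid = roi > 0                      # 0 means "no depth reading" on Kinect
    return (near & valid).astype(np.uint8) * 255
```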

"Is it feasible to actually implement finger tracking with the Kinect, given that the UX guidelines discourage such gestures?"

If by "feasible" you mean possible: yes. There is nothing preventing you from implementing your own finger-tracking mechanism on top of the official SDK.

If, on the other hand, you mean the UX practicality of tracking fingers versus gross body movements, that is left to your application design. Having finger tracking for the sake of finger tracking does not make a good control-free interactive experience. The "Distance-Dependent Interactions" section of the Kinect for Windows 1.7 Human Interface Guidelines does a good job of illustrating how a user's distance from the screen affects how best to interact with it. Notice that the user in the example I linked to above is very close to the screen.

What your application is going to do; how the user approaches it (i.e., on the street, in the lab, with or without learning, standing or seated, etc.); distances; user age and capabilities (children and the elderly are generally less dexterous, as are those with disabilities): all of these (and yes, more) come into play in deciding whether your application should support finger tracking at all.
