How to map Depth to Color?
Question
I'm a beginner with the Kinect; I've only been working with this device for 3 days, so please bear with me.

Here is my problem. I have an array with the color image and an array with the depth image. I have found an item on the color image at (Xincolor, Yincolor), and I'd like to know how far it is from the Kinect, so I need to get the Z value at (Xindepth, Yindepth). But first I need to map between the differently sized arrays. I found some solutions where people used CoordinateMapper.MapDepthFrameToColorSpace. I tried this method too, but it returns an array whose size equals the size of the depth array, and the returned array contains strange values such as X: 175.26976, Y: -116.437988.

Can anyone help me?

(And sorry for my English.)
Answer
This thread might help you understand better:
Basically, the result of the function gives you lookup points where a depth pixel may or may not have a corresponding color point, because the two camera spaces don't fully overlap, or because the point falls into an IR shadow. The key is to confirm that the mapped point falls within the bounds of the color frame, and then take that color pixel's value.
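As a minimal sketch of that validation step (in Python rather than the Kinect SDK's C#, so the function and array names here are illustrative, not SDK calls): MapDepthFrameToColorSpace gives you one (X, Y) color-space point per depth pixel, with invalid mappings reported as -infinity, and you must bounds-check before sampling. The default Kinect v2 frame sizes are 512×424 depth and 1920×1080 color.

```python
import math

def sample_color_for_depth_pixel(color_points, color_pixels, depth_index,
                                 color_width=1920, color_height=1080):
    """Look up the color value for one depth pixel.

    color_points  -- per-depth-pixel (x, y) pairs, as produced by
                     CoordinateMapper.MapDepthFrameToColorSpace
    color_pixels  -- flat color image array (row-major)
    depth_index   -- index into the depth frame (y * depthWidth + x)
    """
    x, y = color_points[depth_index]
    # Invalid mappings (IR shadow, non-overlapping camera space)
    # come back as -infinity — reject them first.
    if math.isinf(x) or math.isinf(y):
        return None
    # Round to the nearest color pixel and confirm it is in frame.
    cx, cy = int(x + 0.5), int(y + 0.5)
    if not (0 <= cx < color_width and 0 <= cy < color_height):
        return None
    return color_pixels[cy * color_width + cx]
```

The same check explains the "strange values" from the question: coordinates like X: 175.26976, Y: -116.437988 are sub-pixel color-space positions (here with a negative Y, i.e. outside the color frame), which is exactly why the bounds check is needed before indexing.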
As for determining the depth value for a particular color point, it is easier to take the (x, y) offset and map it to depth using MapColorFrameToCameraSpace or MapColorFrameToDepthSpace. This will be a bit faster, as you are dealing with a much smaller set of data. CameraSpace will give you real-world distances from the camera in all three dimensions.
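The reverse lookup the questioner needs — color pixel in, Z value out — can be sketched the same way (again in Python as a stand-in for the C# SDK; array names and default frame sizes are assumptions): MapColorFrameToDepthSpace returns one depth-space point per color pixel, so you index it by the color coordinates and then read the raw depth frame.

```python
import math

def depth_at_color_pixel(depth_points, depth_data, x_in_color, y_in_color,
                         color_width=1920, depth_width=512, depth_height=424):
    """Return the depth (Z) behind one color pixel, or None.

    depth_points -- per-color-pixel (x, y) pairs, as produced by
                    CoordinateMapper.MapColorFrameToDepthSpace
    depth_data   -- flat raw depth frame (row-major, millimetres)
    """
    dx, dy = depth_points[y_in_color * color_width + x_in_color]
    # No depth sample maps to this color pixel (shadow / no overlap).
    if math.isinf(dx) or math.isinf(dy):
        return None
    # Round to the nearest depth pixel and confirm it is in frame.
    dxi, dyi = int(dx + 0.5), int(dy + 0.5)
    if not (0 <= dxi < depth_width and 0 <= dyi < depth_height):
        return None
    return depth_data[dyi * depth_width + dxi]  # Z in millimetres
```

If you use MapColorFrameToCameraSpace instead, each color pixel maps directly to a CameraSpacePoint whose Z component is already the metric distance along the camera axis, so the second array lookup is unnecessary.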