Implementing Ray Picking
I have a renderer using DirectX and OpenGL, and a 3D scene. The viewport and the window are of the same dimensions.
How do I implement picking, given mouse coordinates x and y, in a platform-independent way?
If you can, do the picking on the CPU by calculating a ray from the eye through the mouse pointer and intersecting it with your models.
If this isn't an option, I would go with some type of ID rendering. Assign each object you want to pick a unique color, render the objects with these colors, and finally read back the color from the framebuffer under the mouse pointer.
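The ID-rendering approach hinges on a reversible mapping between object IDs and colors. A minimal sketch of that encode/decode step (the function names `id_to_color` and `color_to_id` are mine; the actual render and framebuffer read-back depend on your API, e.g. glReadPixels in OpenGL):

```python
def id_to_color(obj_id):
    # Pack a 24-bit object ID into an (r, g, b) triple, one byte per channel.
    return ((obj_id >> 16) & 0xFF, (obj_id >> 8) & 0xFF, obj_id & 0xFF)

def color_to_id(r, g, b):
    # Inverse of id_to_color: rebuild the ID from the pixel read back
    # from the framebuffer under the mouse pointer.
    return (r << 16) | (g << 8) | b
```

Render each pickable object with its ID color (with lighting and blending disabled, so the color reaches the framebuffer unmodified), then read the pixel under the cursor and decode it.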
EDIT: If the question is how to construct the ray from the mouse coordinates, you need the following: a projection matrix P and the camera transform C. If the coordinates of the mouse pointer are (x, y) and the size of the viewport is (width, height), one position in clip space along the ray is:
mouse_clip = [
float(x) * 2 / float(width) - 1,
1 - float(y) * 2 / float(height),
0,
1]
(Notice that I flipped the y-axis, since the origin of mouse coordinates is usually in the upper-left corner.)
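As a concrete check, the formula above maps window pixel coordinates into the [-1, 1] clip-space range. A small sketch (the function name is mine):

```python
def mouse_to_clip(x, y, width, height):
    # Map window pixel coordinates (origin top-left, y down) to clip space
    # (origin center, y up), at z = 0 with w = 1.
    return [float(x) * 2 / float(width) - 1,
            1 - float(y) * 2 / float(height),
            0.0,
            1.0]
```

For an 800x600 viewport, the screen center (400, 300) maps to [0.0, 0.0, 0.0, 1.0] and the top-left pixel (0, 0) maps to [-1.0, 1.0, 0.0, 1.0].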
The following is also true:
mouse_clip = P * C * mouse_worldspace
Which gives:
mouse_worldspace = inverse(C) * inverse(P) * mouse_clip
We now have:
p = C.position(); // origin of camera in worldspace
n = normalize(mouse_worldspace - p); // unit vector from p through mouse pos in worldspace
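The steps above can be put together with NumPy. This is a sketch under the answer's conventions: `P` is the projection matrix, `C` the world-to-clip camera transform, and `cam_pos` (my name) the camera origin in world space; note that for a perspective `P`, a divide by the w component is needed after un-projecting:

```python
import numpy as np

def pick_ray(x, y, width, height, P, C, cam_pos):
    # Clip-space point under the mouse (y flipped, window origin top-left).
    mouse_clip = np.array([x * 2.0 / width - 1.0,
                           1.0 - y * 2.0 / height,
                           0.0, 1.0])
    # Un-project: mouse_worldspace = inverse(C) * inverse(P) * mouse_clip.
    mouse_world = np.linalg.inv(C) @ np.linalg.inv(P) @ mouse_clip
    mouse_world /= mouse_world[3]  # perspective divide
    # Ray origin p and unit direction n in world space.
    p = np.asarray(cam_pos, dtype=float)
    n = mouse_world[:3] - p
    return p, n / np.linalg.norm(n)
```

Intersect the returned ray (origin p, direction n) with your models to find the picked object.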