Is it possible to get which surface of a cube was clicked in OpenGL?


Question

I have already created a cube and it spins perfectly. My task is to detect which face of the spinning cube is clicked. For example, if you click on the red face of the cube, you win. But I am not able to find out which face of the cube was clicked.

EDITED


I want the surface of the cube at the point where I touch.

Here is my renderer code:

public void onDrawFrame(GL10 arg0) {
    //              GLES20.glEnable(GLES20.GL_TEXTURE_CUBE_MAP);
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
    GLES20.glUseProgram(iProgId);

    cubeBuffer.position(0);
    GLES20.glVertexAttribPointer(iPosition, 3, GLES20.GL_FLOAT, false, 0, cubeBuffer);
    GLES20.glEnableVertexAttribArray(iPosition);

    texBuffer.position(0);
    GLES20.glVertexAttribPointer(iTexCoords, 3, GLES20.GL_FLOAT, false, 0, texBuffer);
    GLES20.glEnableVertexAttribArray(iTexCoords);

    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_CUBE_MAP, iTexId);
    GLES20.glUniform1i(iTexLoc, 0);

    // Draw a cube.
    // Translate the cube into the screen.
    Matrix.setIdentityM(m_fIdentity, 0);
    //               Matrix.translateM(m_fIdentity, 0, 0.0f, 0.8f, -3.5f);

    // Set a matrix that contains the current rotation.
    Matrix.setIdentityM(mCurrentRotation, 0);
    Matrix.rotateM(mCurrentRotation, 0, mDeltaX, 1.0f, 0.0f, 0.0f);
    Matrix.rotateM(mCurrentRotation, 0, mDeltaY, 0.0f, 1.0f, 0.0f);
    Matrix.rotateM(mCurrentRotation, 0, mDeltaZ, 0.0f, 0.0f, 1.0f);

    mDeltaX = 0.0f;
    mDeltaY = 0.0f;
    mDeltaZ = 0.0f;

    // Multiply the current rotation by the accumulated rotation, and then set the accumulated
    // rotation to the result.
    Matrix.multiplyMM(mTemporaryMatrix, 0, mCurrentRotation, 0, mAccumulatedRotation, 0);
    System.arraycopy(mTemporaryMatrix, 0, mAccumulatedRotation, 0, 16);

    // Rotate the cube taking the overall rotation into account.
    Matrix.multiplyMM(mTemporaryMatrix, 0, m_fIdentity, 0, mAccumulatedRotation, 0);
    System.arraycopy(mTemporaryMatrix, 0, m_fIdentity, 0, 16);

    Matrix.multiplyMM(m_fVPMatrix, 0, m_fViewMatrix, 0, m_fIdentity, 0);
    Matrix.multiplyMM(m_fVPMatrix, 0, m_fProjMatrix, 0, m_fVPMatrix, 0);

    Ray ray = null;
    if (mDeltaX != -99) {
        ray = new Ray(arg0, width, height, mDeltaX, mDeltaY);
    }

    mDeltaX = -99;
    //              Matrix.translateM(m_fVPMatrix, 0, 0, 0, 1);
    GLES20.glUniformMatrix4fv(iVPMatrix, 1, false, m_fVPMatrix, 0);

    GLES20.glDrawElements(GLES20.GL_TRIANGLES, 36, GLES20.GL_UNSIGNED_SHORT, indexBuffer);
    //              GLES20.glDisable(GLES20.GL_TEXTURE_CUBE_MAP);
}

Answer

Is it possible to get which surface of a cube was clicked in OpenGL?

In a rendering, each object of the scene is usually transformed by the view matrix and the projection matrix. While the view matrix describes the direction and position from which the scene is viewed, the projection matrix describes the mapping from the 3D points of the scene to the 2D points of the viewport. The projection matrix transforms from view space to clip space, and the clip-space coordinates are transformed to normalized device coordinates (NDC) in the range (-1, -1, -1) to (1, 1, 1) by dividing by the w component of the clip coordinates.
If a point on a surface of the scene has to be found by selecting a point on the viewport, then a way has to be found to do the opposite.
A common way to identify the surface of an object is to define a ray with a starting point and a direction and to find the surface which is first hit by that ray. The line of sight is such a ray, because it has a start point and a direction, but how the ray is defined by the line of sight depends on the projection type of the scene.

While with an orthographic projection the eye-space coordinates are linearly mapped to normalized device coordinates, with a perspective projection the eye-space coordinates inside the camera frustum (a truncated pyramid) are mapped to a cube (the normalized device coordinates).
In both cases the viewport position first has to be converted to normalized (XY) device coordinates, in the range from (-1, -1) to (1, 1). This is a simple linear mapping:

w = width of the viewport
h = height of the viewport
x = X position of the mouse
y = Y position of the mouse

ndc_x = 2.0 * x/w - 1.0;
ndc_y = 1.0 - 2.0 * y/h; // invert Y axis
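
In the Android renderer from the question, a minimal sketch of this mapping could look as follows (the helper name is hypothetical; touchX/touchY would come from MotionEvent.getX()/getY(), and width/height are the viewport size passed to onSurfaceChanged):

static float[] toNdc(float touchX, float touchY, float width, float height) {
    float ndcX = 2.0f * touchX / width - 1.0f;
    float ndcY = 1.0f - 2.0f * touchY / height; // window Y grows downwards, NDC Y grows upwards
    return new float[] { ndcX, ndcY };
}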

Define the ray of the line of sight for an orthographic projection in view space

The start point of the ray can be calculated by transforming the point of the viewport, in normalized device coordinates on the near plane (z = 0), with the inverse projection matrix.

R0_view = inverse( projection-matrix ) * (ndc_x, ndc_y, 0.0, 1.0)

The direction of the line of sight is the direction into the viewport, (0, 0, -1).

D_view = (0.0, 0.0, -1.0)
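
For the question's Android code, a hedged sketch of this step could use android.opengl.Matrix (the helper name is hypothetical; projMatrix is the same projection matrix used for drawing):

import android.opengl.Matrix;

// R0_view for an orthographic projection: unproject the viewport point; the
// direction D_view is the constant (0, 0, -1).
static float[] orthoRayStart(float[] projMatrix, float ndcX, float ndcY) {
    float[] invProj = new float[16];
    Matrix.invertM(invProj, 0, projMatrix, 0);
    float[] ndcPoint = { ndcX, ndcY, 0.0f, 1.0f };
    float[] start = new float[4];
    Matrix.multiplyMV(start, 0, invProj, 0, ndcPoint, 0);  // w stays 1 for an orthographic matrix
    return new float[] { start[0], start[1], start[2] };
}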

Define the ray of the line of sight for a perspective projection in view space

The start point of the line of sight is the camera position, which is (0, 0, 0) in view space.

R0_view = (0.0, 0.0, 0.0)

The direction of the line of sight can be calculated by transforming any point on the ray, in normalized device coordinates, with the inverse projection matrix.

D_view = normalize( inverse( projection-matrix ) * (ndc_x, ndc_y, 0.0, 1.0) )
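
For the question's renderer, a minimal Java sketch of both the start point and the direction might look like this (the helper name is hypothetical; projMatrix would be m_fProjMatrix):

import android.opengl.Matrix;

// Returns the normalized view-space direction of the picking ray for a
// perspective projection; the ray starts at the camera position (0, 0, 0).
static float[] perspectiveRayDirection(float[] projMatrix, float ndcX, float ndcY) {
    float[] invProj = new float[16];
    Matrix.invertM(invProj, 0, projMatrix, 0);

    // Transform a point on the ray (z = 0 in NDC) back to view space.
    float[] ndcPoint  = { ndcX, ndcY, 0.0f, 1.0f };
    float[] viewPoint = new float[4];
    Matrix.multiplyMV(viewPoint, 0, invProj, 0, ndcPoint, 0);

    // Perspective divide, then normalize the direction from the origin to that point.
    float x = viewPoint[0] / viewPoint[3];
    float y = viewPoint[1] / viewPoint[3];
    float z = viewPoint[2] / viewPoint[3];
    float len = Matrix.length(x, y, z);
    return new float[] { x / len, y / len, z / len };
}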

Convert from view coordinates to world coordinates

To convert from view space to world space, the view-space coordinates have to be transformed by the inverse view matrix.

R0_world = inverse( view-matrix ) * R0_view
R1_world = inverse( view-matrix ) * (R0_view + D_view)
D_world  = normalize(R1_world - R0_world)
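
A hedged sketch of this conversion for the question's renderer, using Matrix.invertM with m_fViewMatrix (the helper name and the float[][] return value are just for illustration):

import android.opengl.Matrix;

// Transforms the view-space ray (r0View, dView) into world space with the
// inverse view matrix; returns { R0_world, D_world }.
static float[][] rayToWorld(float[] viewMatrix, float[] r0View, float[] dView) {
    float[] invView = new float[16];
    Matrix.invertM(invView, 0, viewMatrix, 0);

    float[] r0 = { r0View[0], r0View[1], r0View[2], 1.0f };
    float[] r1 = { r0View[0] + dView[0], r0View[1] + dView[1], r0View[2] + dView[2], 1.0f };
    float[] r0World = new float[4];
    float[] r1World = new float[4];
    Matrix.multiplyMV(r0World, 0, invView, 0, r0, 0);
    Matrix.multiplyMV(r1World, 0, invView, 0, r1, 0);

    float dx = r1World[0] - r0World[0];
    float dy = r1World[1] - r0World[1];
    float dz = r1World[2] - r0World[2];
    float len = Matrix.length(dx, dy, dz);
    return new float[][] {
        { r0World[0], r0World[1], r0World[2] },  // R0_world
        { dx / len, dy / len, dz / len }         // D_world
    };
}

If the cube is also rotated by a model matrix (the accumulated rotation in the question's onDrawFrame), the same step can be repeated with the inverse model matrix to bring the ray into the space of the untransformed cube vertices, which is what the WebGL example further below does.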

Find the intersection point of a ray with a primitive

To find the surface which is hit by the ray, the distance from the start point of the ray to the intersection point of each surface (primitive) with the ray has to be calculated. The surface with the lowest distance (in the ray direction) is the one that is hit.

To find the distance to the intersection point of a ray with a triangle primitive, the following steps have to be done:

  1. Find the intersection point of the ray with the plane defined by the 3 points of the triangle primitive.
  2. Calculate the distance between the intersection point and the start point of the ray.
  3. Test whether the intersection point lies in the direction of the ray (and not in the opposite direction).
  4. Test whether the intersection point is inside or on the contour of the triangle.

Find the intersection point and the intersection distance:

A plane is defined by a normal vector (NV) and a point on the plane (P0). If a triangle is given by the 3 points A, B and C, the plane can be calculated as follows:

P0 = A
NV = normalize( cross( B-A, C-A ) )

The intersection of the ray with the plane is calculated by substituting the equation of the ray,
P_isect = dist * D + R0, into the equation of the plane, dot( P_isect - P0, NV ) == 0.
It follows:

P_isect    = R0 + D * dist_isect
dist_isect = dot( P0 - R0, NV ) / dot( D, NV ) 
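
As a hedged Java sketch of these two formulas (the small vector helpers sub, dot and cross are hypothetical and only exist for this example):

static float[] sub(float[] a, float[] b) { return new float[] { a[0]-b[0], a[1]-b[1], a[2]-b[2] }; }
static float   dot(float[] a, float[] b) { return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }
static float[] cross(float[] a, float[] b) {
    return new float[] { a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0] };
}

// dist_isect = dot(P0 - R0, NV) / dot(D, NV), with P0 = A and NV the plane normal.
// The normal is left unnormalized here, because the ratio is not affected by its length.
static float intersectTrianglePlane(float[] r0, float[] d, float[] a, float[] b, float[] c) {
    float[] nv = cross(sub(b, a), sub(c, a));
    float denom = dot(d, nv);
    if (Math.abs(denom) < 1e-6f) return Float.NaN;  // ray (nearly) parallel to the plane
    return dot(sub(a, r0), nv) / denom;
}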

Test whether the intersection point is in the direction of the ray:

The intersection point is in the direction of the ray if dist_isect is greater than or equal to 0.0.

Test whether the intersection point is inside or on the contour of the triangle

To find out whether a point is inside the triangle, it has to be tested whether the line from each corner point to the intersection point lies between the two legs that are connected to that corner point:

bool PointInOrOn( P1, P2, A, B )
{
    CP1 = cross( B - A, P1 - A )
    CP2 = cross( B - A, P2 - A )
    return dot( CP1, CP2 ) >= 0
}

bool PointInOrOnTriangle( P, A, B, C )
{
    return PointInOrOn( P, A, B, C ) &&
           PointInOrOn( P, B, C, A ) &&
           PointInOrOn( P, C, A, B )
} 
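
A direct Java transcription of this pseudocode, reusing the sub, dot and cross helpers from the sketch above:

static boolean pointInOrOn(float[] p1, float[] p2, float[] a, float[] b) {
    float[] cp1 = cross(sub(b, a), sub(p1, a));
    float[] cp2 = cross(sub(b, a), sub(p2, a));
    return dot(cp1, cp2) >= 0.0f;
}

static boolean pointInOrOnTriangle(float[] p, float[] a, float[] b, float[] c) {
    return pointInOrOn(p, a, b, c)
        && pointInOrOn(p, b, c, a)
        && pointInOrOn(p, c, a, b);
}

Looping over all 12 triangles of the cube, computing dist_isect for each, skipping negative distances and points outside the triangle, and keeping the hit with the smallest distance identifies the clicked face; this is exactly the loop in drawScene of the WebGL example below.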


The answers to the following questions will also be of interest for solving this issue:

  • How to recover view space position given view space depth value and ndc xy
  • Mouse picking miss
  • How to render depth linearly in modern OpenGL with gl_FragCoord.z in fragment shader?
  • Ray Sphere Intersections in OpenGL

See the WebGL example, which demonstrates the algorithm:

glArrayType = typeof Float32Array !="undefined" ? Float32Array : ( typeof WebGLFloatArray != "undefined" ? WebGLFloatArray : Array );

function IdentityMat44() {
  var m = new glArrayType(16);
  m[0]  = 1; m[1]  = 0; m[2]  = 0; m[3]  = 0;
  m[4]  = 0; m[5]  = 1; m[6]  = 0; m[7]  = 0;
  m[8]  = 0; m[9]  = 0; m[10] = 1; m[11] = 0;
  m[12] = 0; m[13] = 0; m[14] = 0; m[15] = 1;
  return m;
};

function RotateAxis(matA, angRad, axis) {
    var aMap = [ [1, 2], [2, 0], [0, 1] ];
    var a0 = aMap[axis][0], a1 = aMap[axis][1]; 
    var sinAng = Math.sin(angRad), cosAng = Math.cos(angRad);
    var matB = new glArrayType(16);
    for ( var i = 0; i < 16; ++ i ) matB[i] = matA[i];
    for ( var i = 0; i < 3; ++ i ) {
        matB[a0*4+i] = matA[a0*4+i] * cosAng + matA[a1*4+i] * sinAng;
        matB[a1*4+i] = matA[a0*4+i] * -sinAng + matA[a1*4+i] * cosAng;
    }
    return matB;
}

function Cross( a, b ) { return [ a[1] * b[2] - a[2] * b[1], a[2] * b[0] - a[0] * b[2], a[0] * b[1] - a[1] * b[0], 0.0 ]; }
function Dot( a, b ) { return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }
function Normalize( v ) {
    var len = Math.sqrt( v[0] * v[0] + v[1] * v[1] + v[2] * v[2] );
    return [ v[0] / len, v[1] / len, v[2] / len ];
}

function PointInOrOn( P1, P2, A, B )
{
    CP1 = Cross( [ B[0]-A[0], B[1]-A[1], B[2]-A[2] ], [ P1[0]-A[0], P1[1]-A[1], P1[2]-A[2] ] )
    CP2 = Cross( [ B[0]-A[0], B[1]-A[1], B[2]-A[2] ], [ P2[0]-A[0], P2[1]-A[1], P2[2]-A[2] ] )
    return Dot( CP1, CP2 ) >= 0;
}

function PointInOrOnTriangle( P, A, B, C )
{
    var isInA = PointInOrOn( P, A, B, C );
    var isInB = PointInOrOn( P, B, C, A );
    var isInC = PointInOrOn( P, C, A, B );
    return isInA && isInB && isInC;
} 

vec4_add = function( a, b ) { return [ a[0]+b[0], a[1]+b[1], a[2]+b[2], a[3]+b[3] ]; }
vec4_sub = function( a, b ) { return [ a[0]-b[0], a[1]-b[1], a[2]-b[2], a[3]-b[3] ]; }
vec4_mul = function( a, b ) { return [ a[0]*b[0], a[1]*b[1], a[2]*b[2], a[3]*b[3] ]; }
vec4_scale = function( a, s ) { return [ a[0]*s, a[1]*s, a[2]*s, a[3]*s ]; }

mat44_inverse = function( m ) {

    var Coef00 = m[2*4+2] * m[3*4+3] - m[3*4+2] * m[2*4+3];
    var Coef02 = m[1*4+2] * m[3*4+3] - m[3*4+2] * m[1*4+3];
    var Coef03 = m[1*4+2] * m[2*4+3] - m[2*4+2] * m[1*4+3];    
    var Coef04 = m[2*4+1] * m[3*4+3] - m[3*4+1] * m[2*4+3];
    var Coef06 = m[1*4+1] * m[3*4+3] - m[3*4+1] * m[1*4+3];
    var Coef07 = m[1*4+1] * m[2*4+3] - m[2*4+1] * m[1*4+3];   
    var Coef08 = m[2*4+1] * m[3*4+2] - m[3*4+1] * m[2*4+2];
    var Coef10 = m[1*4+1] * m[3*4+2] - m[3*4+1] * m[1*4+2];
    var Coef11 = m[1*4+1] * m[2*4+2] - m[2*4+1] * m[1*4+2];   
    var Coef12 = m[2*4+0] * m[3*4+3] - m[3*4+0] * m[2*4+3];
    var Coef14 = m[1*4+0] * m[3*4+3] - m[3*4+0] * m[1*4+3];
    var Coef15 = m[1*4+0] * m[2*4+3] - m[2*4+0] * m[1*4+3];   
    var Coef16 = m[2*4+0] * m[3*4+2] - m[3*4+0] * m[2*4+2];
    var Coef18 = m[1*4+0] * m[3*4+2] - m[3*4+0] * m[1*4+2];
    var Coef19 = m[1*4+0] * m[2*4+2] - m[2*4+0] * m[1*4+2];   
    var Coef20 = m[2*4+0] * m[3*4+1] - m[3*4+0] * m[2*4+1];
    var Coef22 = m[1*4+0] * m[3*4+1] - m[3*4+0] * m[1*4+1];
    var Coef23 = m[1*4+0] * m[2*4+1] - m[2*4+0] * m[1*4+1];
      
    var Fac0 = [Coef00, Coef00, Coef02, Coef03];
    var Fac1 = [Coef04, Coef04, Coef06, Coef07];
    var Fac2 = [Coef08, Coef08, Coef10, Coef11];
    var Fac3 = [Coef12, Coef12, Coef14, Coef15];
    var Fac4 = [Coef16, Coef16, Coef18, Coef19];
    var Fac5 = [Coef20, Coef20, Coef22, Coef23];
      
    var Vec0 = [ m[1*4+0], m[0*4+0], m[0*4+0], m[0*4+0] ];
    var Vec1 = [ m[1*4+1], m[0*4+1], m[0*4+1], m[0*4+1] ];
    var Vec2 = [ m[1*4+2], m[0*4+2], m[0*4+2], m[0*4+2] ];
    var Vec3 = [ m[1*4+3], m[0*4+3], m[0*4+3], m[0*4+3] ];
      
    var Inv0 = vec4_add( vec4_sub( vec4_mul(Vec1, Fac0), vec4_mul(Vec2, Fac1) ), vec4_mul( Vec3, Fac2 ) );
    var Inv1 = vec4_add( vec4_sub( vec4_mul(Vec0, Fac0), vec4_mul(Vec2, Fac3) ), vec4_mul( Vec3, Fac4 ) );
    var Inv2 = vec4_add( vec4_sub( vec4_mul(Vec0, Fac1), vec4_mul(Vec1, Fac3) ), vec4_mul( Vec3, Fac5 ) );
    var Inv3 = vec4_add( vec4_sub( vec4_mul(Vec0, Fac2), vec4_mul(Vec1, Fac4) ), vec4_mul( Vec2, Fac5 ) );
      
    var SignA = [+1.0, -1.0, +1.0, -1.0];
    var SignB = [-1.0, +1.0, -1.0, +1.0];
    var Inverse = [ vec4_mul(Inv0, SignA), vec4_mul(Inv1, SignB), vec4_mul(Inv2, SignA), vec4_mul(Inv3, SignB) ];
      
    var Row0 = [Inverse[0][0], Inverse[1][0], Inverse[2][0], Inverse[3][0] ];
      
    var Dot0 = [Row0[0], Row0[1], Row0[2], Row0[3] ];
    Dot0 = vec4_mul( Dot0, [ m[0], m[1], m[2], m[3] ] );
    var Dot1 = (Dot0[0] + Dot0[1]) + (Dot0[2] + Dot0[3]);
      
    var OneOverDeterminant = 1 / Dot1;

    var res = IdentityMat44();  
    for ( var inx1 = 0; inx1 < 4; inx1 ++ ) {
        for ( var inx2 = 0; inx2 < 4; inx2 ++ )
            res[inx1*4+inx2] = Inverse[inx1][inx2] * OneOverDeterminant;
    }
    return res;
}


Transform = function(vec, mat) {
    var h = [
        vec[0] * mat[0*4+0] + vec[1] * mat[1*4+0] + vec[2] * mat[2*4+0] + mat[3*4+0],
        vec[0] * mat[0*4+1] + vec[1] * mat[1*4+1] + vec[2] * mat[2*4+1] + mat[3*4+1],
        vec[0] * mat[0*4+2] + vec[1] * mat[1*4+2] + vec[2] * mat[2*4+2] + mat[3*4+2],
        vec[0] * mat[0*4+3] + vec[1] * mat[1*4+3] + vec[2] * mat[2*4+3] + mat[3*4+3] ]
    if ( h[3] == 0.0 )
        return [0, 0, 0]
    return [ h[0]/h[3], h[1]/h[3], h[2]/h[3] ];
}

var Camera = {};
Camera.create = function() {
    this.pos    = [0, 3, 0.0];
    this.target = [0, 0, 0];
    this.up     = [0, 0, 1];
    this.fov_y  = 90;
    this.vp     = [800, 600];
    this.near   = 0.5;
    this.far    = 100.0;
}
Camera.Perspective = function() {
    var fn = this.far + this.near;
    var f_n = this.far - this.near;
    var r = this.vp[0] / this.vp[1];
    var t = 1 / Math.tan( Math.PI * this.fov_y / 360 );
    var m = IdentityMat44();
    m[0]  = t/r; m[1]  = 0; m[2]  =  0;                              m[3]  = 0;
    m[4]  = 0;   m[5]  = t; m[6]  =  0;                              m[7]  = 0;
    m[8]  = 0;   m[9]  = 0; m[10] = -fn / f_n;                       m[11] = -1;
    m[12] = 0;   m[13] = 0; m[14] = -2 * this.far * this.near / f_n; m[15] =  0;
    return m;
}
Camera.LookAt = function() {
    var mz = Normalize( [ this.pos[0]-this.target[0], this.pos[1]-this.target[1], this.pos[2]-this.target[2] ] );
    var mx = Normalize( Cross( this.up, mz ) );
    var my = Normalize( Cross( mz, mx ) );
    var tx = Dot( mx, this.pos );
    var ty = Dot( my, this.pos );
    var tz = Dot( [-mz[0], -mz[1], -mz[2]], this.pos ); 
    var m = IdentityMat44();
    m[0]  = mx[0]; m[1]  = my[0]; m[2]  = mz[0]; m[3]  = 0;
    m[4]  = mx[1]; m[5]  = my[1]; m[6]  = mz[1]; m[7]  = 0;
    m[8]  = mx[2]; m[9]  = my[2]; m[10] = mz[2]; m[11] = 0;
    m[12] = tx;    m[13] = ty;    m[14] = tz;    m[15] = 1; 
    return m;
} 

// shader program object
var ShaderProgram = {};
ShaderProgram.Create = function( shaderList, uniformNames ) {
    var shaderObjs = [];
    for ( var i_sh = 0; i_sh < shaderList.length; ++ i_sh ) {
        var shderObj = this.CompileShader( shaderList[i_sh].source, shaderList[i_sh].stage );
        if ( shderObj == 0 )
            return 0;
        shaderObjs.push( shderObj );
    }
    var progObj = this.LinkProgram( shaderObjs )
    if ( progObj != 0 ) {
        progObj.unifomLocation = {};
        for ( var i_n = 0; i_n < uniformNames.length; ++ i_n ) {
            var name = uniformNames[i_n];
            progObj.unifomLocation[name] = gl.getUniformLocation( progObj, name );
        }
    }
    return progObj;
}
ShaderProgram.Use = function( progObj ) { gl.useProgram( progObj ); } 
ShaderProgram.SetUniformInt = function( progObj, name, val ) { gl.uniform1i( progObj.unifomLocation[name], val ); }
ShaderProgram.SetUniform2i = function( progObj, name, arr ) { gl.uniform2iv( progObj.unifomLocation[name], arr ); }
ShaderProgram.SetUniformFloat = function( progObj, name, val ) { gl.uniform1f( progObj.unifomLocation[name], val ); }
ShaderProgram.SetUniform2f = function( progObj, name, arr ) { gl.uniform2fv( progObj.unifomLocation[name], arr ); }
ShaderProgram.SetUniform3f = function( progObj, name, arr ) { gl.uniform3fv( progObj.unifomLocation[name], arr ); }
ShaderProgram.SetUniformMat44 = function( progObj, name, mat ) { gl.uniformMatrix4fv( progObj.unifomLocation[name], false, mat ); }
ShaderProgram.CompileShader = function( source, shaderStage ) {
    var shaderScript = document.getElementById(source);
    if (shaderScript) {
      source = "";
      var node = shaderScript.firstChild;
      while (node) {
        if (node.nodeType == 3) source += node.textContent;
        node = node.nextSibling;
      }
    }
    var shaderObj = gl.createShader( shaderStage );
    gl.shaderSource( shaderObj, source );
    gl.compileShader( shaderObj );
    var status = gl.getShaderParameter( shaderObj, gl.COMPILE_STATUS );
    if ( !status ) alert(gl.getShaderInfoLog(shaderObj));
    return status ? shaderObj : 0;
} 
ShaderProgram.LinkProgram = function( shaderObjs ) {
    var prog = gl.createProgram();
    for ( var i_sh = 0; i_sh < shaderObjs.length; ++ i_sh )
        gl.attachShader( prog, shaderObjs[i_sh] );
    gl.linkProgram( prog );
    status = gl.getProgramParameter( prog, gl.LINK_STATUS );
    if ( !status ) alert("Could not initialise shaders");
    gl.useProgram( null );
    return status ? prog : 0;
}
        
function drawScene(){

    var canvas = document.getElementById( "ogl-canvas" );
    Camera.create();
    Camera.vp = [canvas.width, canvas.height];
    var currentTime = Date.now();   
    var deltaMS = currentTime - startTime;
        
    gl.viewport( 0, 0, canvas.width, canvas.height );
    gl.enable( gl.DEPTH_TEST );
    gl.clearColor( 0.0, 0.0, 0.0, 1.0 );
    gl.clear( gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT );
    
    var prjMat = Camera.Perspective();
    var viewMat = Camera.LookAt();

    // set up draw shader
    ShaderProgram.Use( progDraw );
    ShaderProgram.SetUniformMat44( progDraw, "u_projectionMat44", prjMat );
    ShaderProgram.SetUniformMat44( progDraw, "u_viewMat44", viewMat );
    var modelMat = IdentityMat44()
    modelMat = RotateAxis( modelMat, CalcAng( currentTime, 13.0 ), 0 );
    modelMat = RotateAxis( modelMat, CalcAng( currentTime, 17.0 ), 1 );
    ShaderProgram.SetUniformMat44( progDraw, "u_modelMat44", modelMat );
    
    // draw scene
    bufObj = bufCube;
    gl.enableVertexAttribArray( progDraw.inPos );
    gl.enableVertexAttribArray( progDraw.inCol );
    gl.bindBuffer( gl.ARRAY_BUFFER, bufObj.pos );
    gl.vertexAttribPointer( progDraw.inPos, 3, gl.FLOAT, false, 0, 0 );
    gl.bindBuffer( gl.ARRAY_BUFFER, bufObj.col );
    gl.vertexAttribPointer( progDraw.inCol, 3, gl.FLOAT, false, 0, 0 );
    gl.bindBuffer( gl.ELEMENT_ARRAY_BUFFER, bufObj.inx );
    gl.drawElements( gl.TRIANGLES, bufObj.inxLen, gl.UNSIGNED_SHORT, 0 );
    gl.disableVertexAttribArray( progDraw.inPos );
    gl.disableVertexAttribArray( progDraw.inCol );

    var newColor = "#000000";
    var pos = [-1, -1];
    if (mousePos[0] > 0 && mousePos[1] > 0 ) {
        var pos = [2.0 * mousePos[0] / canvas.width - 1.0, 1.0 - 2.0 * mousePos[1] / canvas.height];

        var invPrjMat = mat44_inverse( prjMat )
        var invViewMat = mat44_inverse( viewMat )
        var invModelMat = mat44_inverse( modelMat )

        var viewP1 = Transform([pos[0],pos[1],-1.0], invPrjMat);
        
        var R0 = Transform(Transform([0,0,0], invViewMat), invModelMat);
        var R1 = Transform(Transform(viewP1, invViewMat), invModelMat);
        var D = Normalize( [ R1[0]-R0[0], R1[1]-R0[1], R1[2]-R0[2] ] );

        var minDist = 100000;
        for ( it = 0; it < cubeInxData.length; it = it + 3 )
        {
            var trI = [ cubeInxData[it+0], cubeInxData[it+1], cubeInxData[it+2] ]
            var A = [ cubePosData[trI[0]*3+0], cubePosData[trI[0]*3+1], cubePosData[trI[0]*3+2] ];
            var B = [ cubePosData[trI[1]*3+0], cubePosData[trI[1]*3+1], cubePosData[trI[1]*3+2] ];
            var C = [ cubePosData[trI[2]*3+0], cubePosData[trI[2]*3+1], cubePosData[trI[2]*3+2] ];
            
            P0 = A;
            NV = Cross( [ B[0]-A[0], B[1]-A[1], B[2]-A[2] ], [ C[0]-A[0], C[1]-A[1], C[2]-A[2] ] );
            NV = Normalize( NV );

            dist_isect = Dot( [ P0[0]-R0[0], P0[1]-R0[1], P0[2]-R0[2] ], NV ) / Dot( D, NV );
            if ( dist_isect < 0.0 )
                continue;
            P_isect    = [ R0[0] + D[0] * dist_isect, R0[1] + D[1] * dist_isect, R0[2] + D[2] * dist_isect ];

            if ( PointInOrOnTriangle( P_isect, A, B, C ) )
            {
                var col = [ 
                    Math.floor(cubeColData[trI[0]*3+0]*255),
                    Math.floor(cubeColData[trI[0]*3+1]*255),
                    Math.floor(cubeColData[trI[0]*3+2]*255) ];
                h0 = col[0].toString(16); if( h0.length < 2 ) h0 = "0" + h0;
                h1 = col[1].toString(16); if( h1.length < 2 ) h1 = "0" + h1;
                h2 = col[2].toString(16); if( h2.length < 2 ) h2 = "0" + h2;
                if ( dist_isect < minDist ) {
                    minDist = dist_isect;
                    newColor = "#" + h0 + h1 + h2;
                } 
            }
        }
    }
    document.getElementById( "color" ).value = newColor;
    document.getElementById( "mouseX" ).innerHTML = pos[0];
    document.getElementById( "mouseY" ).innerHTML = pos[1];
}

var startTime;
function Fract( val ) { 
    return val - Math.trunc( val );
}
function CalcAng( currentTime, intervall ) {
    return Fract( (currentTime - startTime) / (1000*intervall) ) * 2.0 * Math.PI;
}
function CalcMove( currentTime, intervall, range ) {
    var pos = self.Fract( (currentTime - startTime) / (1000*intervall) ) * 2.0
    var pos = pos < 1.0 ? pos : (2.0-pos)
    return range[0] + (range[1] - range[0]) * pos;
}    
function EllipticalPosition( a, b, angRag ) {
    var a_b = a * a - b * b
    var ea = (a_b <= 0) ? 0 : Math.sqrt( a_b );
    var eb = (a_b >= 0) ? 0 : Math.sqrt( -a_b );
    return [ a * Math.sin( angRag ) - ea, b * Math.cos( angRag ) - eb, 0 ];
}

var mousePos = [-1, -1];
var sliderScale = 100.0
var gl;
var progDraw;
var bufCube = {};
var bufTorus = {};
var cubePosData = [];
var cubeColData = [];
var cubeInxData = [];
function sceneStart() {

    var canvas = document.getElementById( "ogl-canvas");
    var vp = [canvas.width, canvas.height];
    gl = canvas.getContext( "experimental-webgl" );
    if ( !gl )
      return;

    progDraw = ShaderProgram.Create( 
      [ { source : "draw-shader-vs", stage : gl.VERTEX_SHADER },
        { source : "draw-shader-fs", stage : gl.FRAGMENT_SHADER }
      ],
      [ "u_projectionMat44", "u_viewMat44", "u_modelMat44" ] );
    progDraw.inPos = gl.getAttribLocation( progDraw, "inPos" );
    progDraw.inCol = gl.getAttribLocation( progDraw, "inCol" );
    if ( progDraw == 0 )
        return;

    // create cube
    var cubePos = [
        -1.0, -1.0,  1.0,  1.0, -1.0,  1.0,  1.0,  1.0,  1.0, -1.0,  1.0,  1.0,
        -1.0, -1.0, -1.0,  1.0, -1.0, -1.0,  1.0,  1.0, -1.0, -1.0,  1.0, -1.0 ];
    var cubeCol = [ 1.0, 0.0, 0.0, 1.0, 0.5, 0.0, 1.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0 ];
    var cubeHlpInx = [ 0, 1, 2, 3, 1, 5, 6, 2, 5, 4, 7, 6, 4, 0, 3, 7, 3, 2, 6, 7, 1, 0, 4, 5 ]; 
    for ( var i = 0; i < cubeHlpInx.length; ++ i ) {
        cubePosData.push( cubePos[cubeHlpInx[i]*3], cubePos[cubeHlpInx[i]*3+1], cubePos[cubeHlpInx[i]*3+2] );
    }
    for ( var is = 0; is < 6; ++ is ) {
        for ( var ip = 0; ip < 4; ++ ip ) {
           cubeColData.push( cubeCol[is*3], cubeCol[is*3+1], cubeCol[is*3+2] ); 
        }
    }
    for ( var i = 0; i < cubeHlpInx.length; i += 4 ) {
        cubeInxData.push( i, i+1, i+2, i, i+2, i+3 );
    }
    bufCube.pos = gl.createBuffer();
    gl.bindBuffer( gl.ARRAY_BUFFER, bufCube.pos );
    gl.bufferData( gl.ARRAY_BUFFER, new Float32Array( cubePosData ), gl.STATIC_DRAW );
    bufCube.col = gl.createBuffer();
    gl.bindBuffer( gl.ARRAY_BUFFER, bufCube.col );
    gl.bufferData( gl.ARRAY_BUFFER, new Float32Array( cubeColData ), gl.STATIC_DRAW );
    bufCube.inx = gl.createBuffer();
    gl.bindBuffer( gl.ELEMENT_ARRAY_BUFFER, bufCube.inx );
    gl.bufferData( gl.ELEMENT_ARRAY_BUFFER, new Uint16Array( cubeInxData ), gl.STATIC_DRAW );
    bufCube.inxLen = cubeInxData.length;

    startTime = Date.now();
    setInterval(drawScene, 50);
}

(function() {
    document.onmousemove = handleMouseMove;
    function handleMouseMove(event) {
        var dot, eventDoc, doc, body, pageX, pageY;

        event = event || window.event; // IE-ism

        if (event.pageX == null && event.clientX != null) {
            eventDoc = (event.target && event.target.ownerDocument) || document;
            doc = eventDoc.documentElement;
            body = eventDoc.body;

            event.pageX = event.clientX +
              (doc && doc.scrollLeft || body && body.scrollLeft || 0) -
              (doc && doc.clientLeft || body && body.clientLeft || 0);
            event.pageY = event.clientY +
              (doc && doc.scrollTop  || body && body.scrollTop  || 0) -
              (doc && doc.clientTop  || body && body.clientTop  || 0 );
        }

        var canvas = document.getElementById( "ogl-canvas");
        var x = event.pageX - canvas.offsetLeft;
        var y = event.pageY - canvas.offsetTop;
        mousePos = [-1, -1];
        if ( x >= 0 && x < canvas.width && y >= 0 && y < canvas.height ) {
            mousePos = [x, y]; 
        }
    }
})();

<script id="draw-shader-vs" type="x-shader/x-vertex">
precision mediump float;

attribute vec3 inPos;
attribute vec3 inCol;

varying vec3 vertCol;

uniform mat4 u_projectionMat44;
uniform mat4 u_viewMat44;
uniform mat4 u_modelMat44;

void main()
{
    vertCol       = inCol;
    vec4 modelPos = u_modelMat44 * vec4( inPos, 1.0 );
    vec4 viewPos  = u_viewMat44 * modelPos;
    gl_Position   = u_projectionMat44 * viewPos;
}
</script>

<script id="draw-shader-fs" type="x-shader/x-fragment">
precision mediump float;

varying vec3 vertCol;

void main()
{
    gl_FragColor = vec4( vertCol.rgb, 1.0 );
}
</script>

<body onload="sceneStart();">
    <div style="margin-left: 260px;">
        <div style="float: right; width: 100%; background-color: #CCF;">
            <form name="inputs">
                <table>
                    <tr> <td> <input type="color" value="#000000" id="color" disabled></td> </tr> 
                    <tr> <td> <span id="mouseX">0</span> </td> </tr>
                    <tr> <td> <span id="mouseY">0</span> </td> </tr>
                </table>
            </form>
        </div>
        <div style="float: right; width: 260px; margin-left: -260px;">
            <canvas id="ogl-canvas" style="border: none;" width="256" height="256"></canvas>
        </div>
        <div style="clear: both;"></div>
    </div>
</body>
