Diffuse shader for OpenGL ES 2.0: Light changes with camera movement (Vuforia on Android)


Question


As a starting point I use the Vuforia (version 4) sample called MultiTargets which tracks a 3d physical "cube" in the camera feed and augments it with yellow grid lines along the cube edges. What I want to achieve is remove the textures and use diffuse lighting on the cube faces instead, by setting my own light position.

I want to do this on native Android and I do NOT want to use Unity.

It's been a hard journey of several days of work and learning. This is my first time working with OpenGL of any kind, and OpenGL ES 2.0 doesn't exactly make it easy for the beginner.

So I have a light source positioned slightly above the top face of my cube. I found that I can get the diffuse effect right if I compute the Lambert factor in model space: everything stays in place regardless of my camera, and only the top face receives any light.

But when I move to using eye space, it becomes weird: the light seems to follow my camera around, and faces other than the top face receive light. I don't understand why. To test, I verified that the light position is where I expect it by rendering pixel brightness in the fragment shader based only on the distance to the light source. Therefore I'm fairly confident in the correctness of my "lightDirectionEyespace", and my only explanation is that something with the normals must be wrong. But I think I followed the explanations for creating the normal matrix correctly...

Help please!

Then there is of course the question whether those diffuse calculations SHOULD be performed in eye space? Will there be any disadvantages if I just do it in model space? I suspect that probably when I later use more models and lights and add specular and transparency, it will not work anymore, even though I don't see yet why.
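To make the question concrete, the diffuse term my fragment shader computes (shown further below) is the standard attenuated Lambert model. In my own notation, matching the shader's constants:

```latex
I_{\text{diffuse}} \;=\; k_d \, c_{\text{light}} \cdot \frac{\max\!\left(\hat{N}\cdot\hat{L},\,0\right)}{1 + 0.01\, d^{2}}
```

where \(\hat{N}\) is the unit surface normal, \(\hat{L}\) the unit vector from the fragment to the light, \(d\) the distance between them, \(k_d\) the material diffuse color (`uMatDiffuse`) and \(c_{\text{light}}\) the light color (`uLightColor`). Note that the dot product itself does not care which space the vectors are in, as long as both are expressed consistently in the same space and the normals were transformed with a proper normal matrix, so model space and eye space should produce the same image.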

My renderFrame method: (some variable names still contain "bottle", which is the object I want to light next after I get the cube right)

private void renderFrame()
{
  ShaderFactory.checkGLError("Check gl errors prior render Frame");

  // Clear color and depth buffer
  GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);

  // Get the state from Vuforia and mark the beginning of a rendering section
  final State state=Renderer.getInstance().begin();

  // Explicitly render the Video Background
  Renderer.getInstance().drawVideoBackground();

  GLES20.glEnable(GLES20.GL_DEPTH_TEST);
  GLES20.glEnable(GLES20.GL_BLEND);
  GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);

  // Did we find any trackables this frame?
  if(0 != state.getNumTrackableResults())
  {
    // Get the trackable:
    TrackableResult result=null;
    final int numResults=state.getNumTrackableResults();

    // Browse results searching for the MultiTarget
    for(int j=0; j < numResults; j++)
    {
      result=state.getTrackableResult(j);
      if(result.isOfType(MultiTargetResult.getClassType()))
        break;

      result=null;
    }

    // If it was not found exit
    if(null == result)
    {
      // Clean up and leave
      GLES20.glDisable(GLES20.GL_BLEND);
      GLES20.glDisable(GLES20.GL_DEPTH_TEST);

      Renderer.getInstance().end();
      return;
    }

    final Matrix44F modelViewMatrix_Vuforia=Tool.convertPose2GLMatrix(result.getPose());
    final float[] modelViewMatrix=modelViewMatrix_Vuforia.getData();

    final float[] modelViewProjection=new float[16];
    Matrix.scaleM(modelViewMatrix, 0, CUBE_SCALE_X, CUBE_SCALE_Y, CUBE_SCALE_Z); 
    Matrix.multiplyMM(modelViewProjection, 0, vuforiaAppSession
      .getProjectionMatrix().getData(), 0, modelViewMatrix, 0);

    GLES20.glUseProgram(bottleShaderProgramID);

    // Draw the cube:
    GLES20.glEnable(GLES20.GL_CULL_FACE);
    GLES20.glCullFace(GLES20.GL_BACK);

    GLES20.glVertexAttribPointer(vertexHandleBottle, 3, GLES20.GL_FLOAT, false, 0, cubeObject.getVertices());
    GLES20.glVertexAttribPointer(normalHandleBottle, 3, GLES20.GL_FLOAT, false, 0, cubeObject.getNormals());

    GLES20.glEnableVertexAttribArray(vertexHandleBottle);
    GLES20.glEnableVertexAttribArray(normalHandleBottle);

    // add light position and color
    final float[] lightPositionInModelSpace=new float[] {0.0f, 1.1f, 0.0f, 1.0f};
    GLES20.glUniform4f(lightPositionHandleBottle, lightPositionInModelSpace[0], lightPositionInModelSpace[1],
      lightPositionInModelSpace[2], lightPositionInModelSpace[3]);
    GLES20.glUniform3f(lightColorHandleBottle, 0.9f, 0.9f, 0.9f);

    // create the normalMatrix for lighting calculations
    final float[] normalMatrix=new float[16];
    Matrix.invertM(normalMatrix, 0, modelViewMatrix, 0);
    Matrix.transposeM(normalMatrix, 0, normalMatrix, 0);
    // pass the normalMatrix to the shader
    GLES20.glUniformMatrix4fv(normalMatrixHandleBottle, 1, false, normalMatrix, 0);

    // extract the camera position for lighting calculations (last column of matrix)
    // GLES20.glUniform3f(cameraPositionHandleBottle, normalMatrix[12], normalMatrix[13], normalMatrix[14]);

    // set material properties
    GLES20.glUniform3f(matAmbientHandleBottle, 0.0f, 0.0f, 0.0f);
    GLES20.glUniform3f(matDiffuseHandleBottle, 0.1f, 0.9f, 0.1f);

    // pass the model view matrix to the shader 
    GLES20.glUniformMatrix4fv(modelViewMatrixHandleBottle, 1, false, modelViewMatrix, 0);

    // pass the model view projection matrix to the shader
    // the "transpose" parameter must be "false" according to the spec, anything else is an error
    GLES20.glUniformMatrix4fv(mvpMatrixHandleBottle, 1, false, modelViewProjection, 0);

    GLES20.glDrawElements(GLES20.GL_TRIANGLES,
      cubeObject.getNumObjectIndex(), GLES20.GL_UNSIGNED_SHORT, cubeObject.getIndices());

    GLES20.glDisable(GLES20.GL_CULL_FACE);

    // disable the enabled arrays after everything has been rendered
    GLES20.glDisableVertexAttribArray(vertexHandleBottle);
    GLES20.glDisableVertexAttribArray(normalHandleBottle);

    ShaderFactory.checkGLError("MultiTargets renderFrame");
  }

  GLES20.glDisable(GLES20.GL_BLEND);
  GLES20.glDisable(GLES20.GL_DEPTH_TEST);

  Renderer.getInstance().end();
}

My vertex shader:

attribute vec4 vertexPosition;
attribute vec3 vertexNormal;

uniform mat4 modelViewProjectionMatrix;
uniform mat4 modelViewMatrix;
uniform mat4 normalMatrix;

// lighting
uniform vec4 uLightPosition;
uniform vec3 uLightColor;

// material
uniform vec3 uMatAmbient;
uniform vec3 uMatDiffuse;

// pass to fragment shader
varying vec3 vNormalEyespace;
varying vec3 vVertexEyespace;
varying vec4 vLightPositionEyespace;
varying vec3 vNormal;
varying vec4 vVertex;

void main()
{
  // we can just take vec3() of a vec4 and it will take the first 3 entries
  vNormalEyespace = vec3(normalMatrix * vec4(vertexNormal, 1.0));
  vNormal = vertexNormal;
  vVertexEyespace = vec3(modelViewMatrix * vertexPosition);
  vVertex = vertexPosition;

  // light position
  vLightPositionEyespace = modelViewMatrix * uLightPosition;

  gl_Position = modelViewProjectionMatrix * vertexPosition;
}

And my fragment shader:

precision highp float; //apparently necessary to force same precision as in vertex shader

//lighting
uniform vec4 uLightPosition;
uniform vec3 uLightColor;

//material
uniform vec3 uMatAmbient;
uniform vec3 uMatDiffuse;

//from vertex shader
varying vec3 vNormalEyespace;
varying vec3 vVertexEyespace;
varying vec4 vLightPositionEyespace;
varying vec3 vNormal;
varying vec4 vVertex;

void main()
{
 vec3 normalModel = normalize(vNormal);
 vec3 normalEyespace = normalize(vNormalEyespace);
 vec3 lightDirectionModel = normalize(uLightPosition.xyz - vVertex.xyz);
 vec3 lightDirectionEyespace = normalize(vLightPositionEyespace.xyz - vVertexEyespace.xyz);

 vec3 ambientTerm = uMatAmbient;
 vec3 diffuseTerm = uMatDiffuse * uLightColor;
 // calculate the lambert factor via cosine law
 float diffuseLambert = max(dot(normalEyespace, lightDirectionEyespace), 0.0);
 // Attenuate the light based on distance.
 float distance = length(vLightPositionEyespace.xyz - vVertexEyespace.xyz);
 float diffuseLambertAttenuated = diffuseLambert * (1.0 / (1.0 + (0.01 * distance * distance)));

 diffuseTerm = diffuseLambertAttenuated * diffuseTerm;

 gl_FragColor = vec4(ambientTerm + diffuseTerm, 1.0);
}

Solution

I finally solved all problems. There were 2 issues that might be of interest for future readers.

  1. Vuforia CubeObject class from the official sample (current Vuforia version 4) has wrong normals. They do not all correspond with the vertex definition order. If you're using the CubeObject from the sample, make sure that the normal definitions are correctly corresponding with the faces. Vuforia fail...

  2. As suspected, my normalMatrix was wrongly built. We cannot just invert-transpose the 4x4 modelViewMatrix, we need to first extract the top left 3x3 submatrix from it and then invert-transpose that.
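In matrix terms (my notation): with \(M\) the 4×4 model-view matrix and \(M_{3\times 3}\) its upper-left 3×3 block, the normal matrix to build is

```latex
N \;=\; \left( M_{3\times 3}^{-1} \right)^{T}
```

i.e. the transpose of the inverse of the rotation/scale part only. Translation plays no role in transforming direction vectors such as normals.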

Here is the code that works for me:

  final Mat3 normalMatrixCube=new Mat3();
  normalMatrixCube.SetFrom4X4(modelViewMatrix);
  normalMatrixCube.invert();
  normalMatrixCube.transpose();

This code by itself is not that useful though, because it relies on a custom class Mat3 which I imported from this guy, since neither Android nor Vuforia seems to offer a matrix class that can invert/transpose 3x3 matrices. This really makes me question my sanity - the only code that works for such a basic problem has to rely on a custom matrix class? Maybe I'm just doing it wrong, I don't know...
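For readers who would rather avoid the external Mat3 dependency: the 3×3 invert-transpose can be done with a few lines of plain Java. This is only a sketch; the class and method names (`NormalMatrix`, `fromModelView`) are mine, not part of Android or Vuforia. Input and output are column-major `float[]` arrays, matching the convention used by `android.opengl.Matrix`.

```java
// Dependency-free normal-matrix computation (hypothetical helper class).
// Takes a column-major 4x4 model-view matrix (length 16), extracts its
// upper-left 3x3 block and returns the inverse-transpose of that block
// as a column-major float[9].
public final class NormalMatrix {

    public static float[] fromModelView(float[] mv) {
        // Upper-left 3x3 of a column-major 4x4: columns start at 0, 4, 8.
        float m11 = mv[0], m21 = mv[1], m31 = mv[2];
        float m12 = mv[4], m22 = mv[5], m32 = mv[6];
        float m13 = mv[8], m23 = mv[9], m33 = mv[10];

        // Cofactors of the 3x3 block. The cofactor matrix divided by the
        // determinant IS the inverse-transpose, so no separate transpose
        // step is needed afterwards.
        float c11 = m22 * m33 - m23 * m32;
        float c12 = m23 * m31 - m21 * m33;
        float c13 = m21 * m32 - m22 * m31;
        float c21 = m13 * m32 - m12 * m33;
        float c22 = m11 * m33 - m13 * m31;
        float c23 = m12 * m31 - m11 * m32;
        float c31 = m12 * m23 - m13 * m22;
        float c32 = m13 * m21 - m11 * m23;
        float c33 = m11 * m22 - m12 * m21;

        float det = m11 * c11 + m12 * c12 + m13 * c13;
        if (Math.abs(det) < 1e-12f)
            throw new IllegalArgumentException("singular model-view matrix");
        float inv = 1.0f / det;

        // Column-major layout: column j holds entries (1,j), (2,j), (3,j).
        return new float[] {
            c11 * inv, c21 * inv, c31 * inv,
            c12 * inv, c22 * inv, c32 * inv,
            c13 * inv, c23 * inv, c33 * inv
        };
    }
}
```

If you go this route, upload the result with `glUniformMatrix3fv` and declare `normalMatrix` as a `mat3` in the shader, multiplying it directly with the `vec3` normal.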
