Is OpenGL Development GPU Dependent?


Problem Description


I am developing an Android application with OpenGL ES 2.0. In this application I draw multiple lines and circles on a GLSurfaceView in response to touch events.

Since OpenGL behaviour depends on the GPU, results vary by device. Currently it works fine on a Google Nexus 7 (ULP GeForce).

On a Samsung Galaxy Note 2 (Mali-400 MP), when I try to draw more than one line, the previous line is cleared and only the current line is drawn, as if it were new.

On a Sony Xperia Neo V (Adreno 205), when I try to draw a new line, the surface breaks as shown in the image below.

Is it possible to make it work on all devices, or do I need to write code for each individual GPU?


Source code

MainActivity.java

// In the onCreate method of my activity, I set up the GLSurfaceView and renderer

final ActivityManager activityManager =
    ( ActivityManager ) getSystemService( Context.ACTIVITY_SERVICE );
final ConfigurationInfo configurationInfo =
    activityManager.getDeviceConfigurationInfo(  );
final boolean supportsEs2 = ( configurationInfo.reqGlEsVersion >= 0x20000
                  || Build.FINGERPRINT.startsWith( "generic" ) );

if( supportsEs2 ) {
    Log.i( "JO", "configurationInfo.reqGlEsVersion:"
           + configurationInfo.reqGlEsVersion + "supportsEs2:"
           + supportsEs2 );
// Request an OpenGL ES 2.0 compatible context.
    myGlsurfaceView.setEGLContextClientVersion( 2 );

    final DisplayMetrics displayMetrics = new DisplayMetrics(  );
    getWindowManager(  ).getDefaultDisplay(  ).getMetrics( displayMetrics );

// Set the renderer to our demo renderer, defined below.
    myRenderer = new MyRenderer( this, myGlsurfaceView );
    myGlsurfaceView.setRenderer( myRenderer, displayMetrics.density );
    myGlsurfaceView.setRenderMode( GLSurfaceView.RENDERMODE_CONTINUOUSLY );
}

MyGLSurfaceView.java

// In this class I get the coordinates of my touch on the glSurfaceView to draw the line
// and pass those points to the renderer class
        public MyGLsurfaceview( Context context ) {
        super( context );
        Log.i( "JO", "MyGLsurfaceview1" );

    }

    public MyGLsurfaceview(
    Context context,
    AttributeSet attrs )
    {
        super( context, attrs );
        con = context;
        mActivity = new MainActivity(  );
        mActivity.myGlsurfaceView = this;
        Log.i( "JO", "MyGLsurfaceview2" );
    }

    public void setRenderer(
    MyRenderer renderer,
    float density )
    {
        Log.i( "JO", "setRenderer" );
        myRenderer = renderer;
        myDensity = density;
        mGestureDetector = new GestureDetector( con, mGestureListener );
        super.setRenderer( renderer );
        setRenderMode( GLSurfaceView.RENDERMODE_CONTINUOUSLY );

    }
    @Override public boolean onTouchEvent( MotionEvent ev ) {

        boolean retVal = mGestureDetector.onTouchEvent( ev );

        if( myline ) {

            switch ( ev.getAction(  ) ) {

            case MotionEvent.ACTION_DOWN:

                isLUp = false;

                if( count == 1 ) {
                    dx = ev.getX(  );
                    dy = ev.getY(  );
                    dx = ( dx / ( getWidth(  ) / 2 ) ) - 1;
                    dy = 1 - ( dy / ( getHeight(  ) / 2 ) );

                    firstX = dx;
                    firstY = dy;
                } else if( count == 2 ) {

                    ux = ev.getX(  );
                    uy = ev.getY(  );
                    ux = ( ux / ( getWidth(  ) / 2 ) ) - 1;
                    uy = 1 - ( uy / ( getHeight(  ) / 2 ) );

                    secondX = ux;
                    secondY = uy;

                    myRenderer.dx = firstX;
                    myRenderer.dy = firstY;
                    myRenderer.ux = secondX;
                    myRenderer.uy = secondY;

                    midX = ( firstX + secondX ) / 2;
                    midY = ( firstY + secondY ) / 2;
                    Log.e( "JO",
                           "Line:firstX" + firstX +
                           "firstY" + firstY );
                    lp = new LinePoints( firstX, firstY,
                                 secondX, secondY,
                                 midX, midY );
                    lineArray.add( lp );

                    myRenderer.isNewClick = false;
                    myRenderer.isEnteredAngle = false;
                    myRenderer.myline = true;
                    myRenderer.mycircle = false;
                    myRenderer.mydashedline = false;
                    myRenderer.eraseCircle = false;
                    myRenderer.eraseLine = false;
                    myRenderer.eraseSelCir = false;
                    myRenderer.angle = angle;
                    myRenderer.length = length;
                    requestRender(  );
                    count = 0;

                }
                count++;

                break;
            case MotionEvent.ACTION_MOVE:

                isLUp = true;

                break;

            case MotionEvent.ACTION_UP:

                if( isLUp ) {

                    ux = ev.getX(  );
                    uy = ev.getY(  );
                    ux = ( ux / ( getWidth(  ) / 2 ) ) - 1;
                    uy = 1 - ( uy / ( getHeight(  ) / 2 ) );
                    Log.i( "JO", "line2:" + ux + "," + uy );

                    secondX = ux;
                    secondY = uy;
                    myRenderer.dx = firstX;
                    myRenderer.dy = firstY;
                    myRenderer.ux = secondX;
                    myRenderer.uy = secondY;

                    midX = ( firstX + secondX ) / 2;
                    midY = ( firstY + secondY ) / 2;
                    Log.e( "JO",
                           "Line:firstX" + firstX +
                           "firstY" + firstY );
                    lp = new LinePoints( firstX, firstY,
                                 secondX, secondY,
                                 midX, midY );
                    lineArray.add( lp );

                    myRenderer.isNewClick = false;
                    myRenderer.isEnteredAngle = false;
                    myRenderer.myline = true;
                    myRenderer.mycircle = false;
                    myRenderer.mydashedline = false;
                    myRenderer.mysnaptoedge = false;
                    myRenderer.mysnaptoMiddle = false;
                    myRenderer.eraseCircle = false;
                    myRenderer.eraseLine = false;
                    myRenderer.eraseSelCir = false;
                    count = 1;
                    requestRender(  );
                }

                break;

            }
        }
        return retVal;
    }
}

MyRenderer.java

// Renderer class that renders the line to the GLSurfaceView
Lines line;
public MyRenderer(
    MainActivity mainActivity,
    MyGLsurfaceview myGlsurfaceView )
{
    Log.i( "JO", "MyRenderer" );
    this.main = mainActivity;
    myGlsurface = myGlsurfaceView;

}

public void onDrawFrame(
    GL10 gl )
{
    line.draw( dx, dy, ux, uy );
}

@Override public void onSurfaceCreated(
    GL10 gl,
    EGLConfig config )
{
    Log.i( "JO", "onSurfaceCreated" );
// Set the background frame color
    GLES20.glClearColor( 0.0f, 0.0f, 0.0f, 1.0f );
// Create the GLText
    glText = new GLText( main.getAssets(  ) );

// Load the font from file (set size + padding), creates the texture
// NOTE: after a successful call to this the font is ready for
// rendering!
    glText.load( "Roboto-Regular.ttf", 14, 2, 2 );  // Create Font (Height: 14
// Pixels / X+Y Padding
// 2 Pixels)
// enable texture + alpha blending
    GLES20.glEnable( GLES20.GL_BLEND );
    GLES20.glBlendFunc( GLES20.GL_ONE, GLES20.GL_ONE_MINUS_SRC_ALPHA );
}

@Override public void onSurfaceChanged(
    GL10 gl,
    int width,
    int height )
{
// Adjust the viewport based on geometry changes,
// such as screen rotation
    GLES20.glViewport( 0, 0, width, height );

    ratio = ( float ) width / height;

    width_surface = width;
    height_surface = height;

/*
 * // This projection matrix is applied to object coordinates in the
 * // onDrawFrame() method:
 * // Matrix.frustumM(mProjMatrix, 0, -ratio, ratio, -1, 1, 3, 7);
 */
// Take into account device orientation
    if( width > height ) {
        Matrix.frustumM( mProjMatrix, 0, -ratio, ratio, -1, 1, 1, 10 );
    } else {
        Matrix.frustumM( mProjMatrix, 0, -1, 1, -1 / ratio, 1 / ratio,
                 1, 10 );
    }

// Save width and height
    this.width = width; // Save Current Width
    this.height = height;   // Save Current Height

    int useForOrtho = Math.min( width, height );

// TODO: Is this wrong?
    Matrix.orthoM( mVMatrix, 0, -useForOrtho / 2, useForOrtho / 2,
               -useForOrtho / 2, useForOrtho / 2, 0.1f, 100f );
}

Line.java

//Line class to draw line

public class Lines
{

    final String vertexShaderCode = "attribute vec4 vPosition;"
        + "void main() {" + " gl_Position = vPosition;" + "}";

    final String fragmentShaderCode = "precision mediump float;"
        + "uniform vec4 vColor;" + "void main() {"
        + " gl_FragColor = vColor;" + "}";

    final FloatBuffer vertexBuffer;
    final int mProgram;
    int mPositionHandle;
    int mColorHandle;

// number of coordinates per vertex in this array
    final int COORDS_PER_VERTEX = 3;
    float lineCoords[] = new float[6];
    final int vertexCount = lineCoords.length / COORDS_PER_VERTEX;
    final int vertexStride = COORDS_PER_VERTEX * 4; // bytes per vertex
// Set color with red, green, blue and alpha (opacity) values
    float lcolor[] = { 1.0f, 1.0f, 1.0f, 1.0f };

    public Lines(
         )
    {

// initialize vertex byte buffer for shape coordinates
        // (number of coordinate values * 4 bytes per float)
        ByteBuffer bb = ByteBuffer.allocateDirect( lineCoords.length * 4 );
// use the device hardware's native byte order
        bb.order( ByteOrder.nativeOrder(  ) );

// create a floating point buffer from the ByteBuffer
        vertexBuffer = bb.asFloatBuffer(  );

// prepare shaders and OpenGL program
        int vertexShader =
            MyRenderer.loadShader( GLES20.GL_VERTEX_SHADER,
                           vertexShaderCode );
        int fragmentShader =
            MyRenderer.loadShader( GLES20.GL_FRAGMENT_SHADER,
                           fragmentShaderCode );

        mProgram = GLES20.glCreateProgram(  );  // create empty OpenGL Program
        GLES20.glAttachShader( mProgram, vertexShader );    // add the vertex shader
// to program
        GLES20.glAttachShader( mProgram, fragmentShader );  // add the fragment
// shader to program
        GLES20.glLinkProgram( mProgram );   // create OpenGL program executables
    }

    public void draw(
    float dX,
    float dY,
    float uX,
    float uY )
    {

        lineCoords[0] = dX;
        lineCoords[1] = dY;
        lineCoords[2] = 0.0f;
        lineCoords[3] = uX;
        lineCoords[4] = uY;
        lineCoords[5] = 0.0f;
        Log.i( "JO",
               "lineCoords:" + lineCoords[0] + "," + lineCoords[1] +
               "," + lineCoords[3] + "," + lineCoords[4] );

        vertexBuffer.put( lineCoords );
        vertexBuffer.position( 0 );
// Add program to OpenGL environment
        GLES20.glUseProgram( mProgram );

// get handle to vertex shader's vPosition member
        mPositionHandle =
            GLES20.glGetAttribLocation( mProgram, "vPosition" );

// Enable a handle to the triangle vertices
        GLES20.glEnableVertexAttribArray( mPositionHandle );

// Prepare the triangle coordinate data
        GLES20.glVertexAttribPointer( mPositionHandle,
                          COORDS_PER_VERTEX,
                          GLES20.GL_FLOAT, false,
                          vertexStride, vertexBuffer );

// get handle to fragment shader's vColor member
        mColorHandle =
            GLES20.glGetUniformLocation( mProgram, "vColor" );

// Set color for drawing the triangle
        GLES20.glUniform4fv( mColorHandle, 1, lcolor, 0 );
        GLES20.glLineWidth( 3 );
// Draw the triangle
        GLES20.glDrawArrays( GLES20.GL_LINES, 0, vertexCount );

// Disable vertex array
        GLES20.glDisableVertexAttribArray( mPositionHandle );
    }

}

Solution

Okay, here it goes again: [1]

OpenGL is not a scene graph. OpenGL does not maintain a scene, know about objects, or keep track of geometry. OpenGL is a drawing API. You give it a canvas (in the form of a window or a PBuffer) and order it to draw points, lines or triangles, and OpenGL does exactly that. Once a primitive (= point, line, triangle) has been drawn, OpenGL has no recollection of it whatsoever. If something changes, you have to redraw the whole thing.

The proper steps to redraw a scene are (a sketch applying them to your renderer follows the list):

  1. Disable the stencil test, so that the following step operates on the whole window.

  2. Clear the framebuffer using glClear(bits), where bits is a bitmask specifying which parts of the canvas to clear. When rendering a new frame you want to clear everything so bits = GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT;

  3. Set the viewport and build an appropriate projection matrix.

  4. For each object in the scene, load the right modelview matrix, set the uniforms, select the vertex arrays, and make the drawing call.

  5. Finish the rendering by flushing the pipeline: if using a single-buffered window call glFinish(), if using a double-buffered window call SwapBuffers. In the case of higher-level frameworks this may be performed by the framework itself.
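
Applied to the posted renderer, steps 2, 4 and 5 boil down to clearing at the top of onDrawFrame() and then replaying every stored line; GLSurfaceView performs the buffer swap for you. The following is only a sketch: it assumes the renderer is handed a reference to the lineArray list that MyGLsurfaceview fills, and that LinePoints exposes its endpoints as fields (x1, y1, x2, y2 below are placeholders for whatever the class actually names them).

// Sketch only: redraw the whole scene on every frame.
// Assumes this.lineArray is the same List<LinePoints> filled by the touch handler,
// and that LinePoints has endpoint fields x1, y1, x2, y2 (placeholder names).
@Override public void onDrawFrame( GL10 gl )
{
    // Step 2: clear the canvas before drawing anything for this frame
    GLES20.glClear( GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT );

    // Step 4: redraw every primitive recorded so far
    synchronized ( lineArray ) {                 // the list is filled from the UI thread
        for ( LinePoints p : lineArray ) {
            line.draw( p.x1, p.y1, p.x2, p.y2 );
        }
    }
    // Step 5: GLSurfaceView swaps the buffers itself after onDrawFrame() returns
}

Once every frame is rebuilt from the list like this, the result no longer depends on whether a particular GPU happens to preserve the previous back buffer contents, which is exactly the difference you are seeing between your three devices.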

Important: once the drawing has been finished on a double-buffered window, you must not continue to send drawing operations, because after the buffer swap the contents of the back buffer you are drawing to are undefined. Hence you must start the drawing anew, beginning with clearing the framebuffer (steps 1 and 2).

What your code is missing are exactly those two steps. I also have the impression that you are performing OpenGL drawing calls in direct reaction to input events, possibly in the input event handlers themselves. Don't do this! Instead, use the input events to add to a list of primitives (lines, in your case) to draw, then send a redraw event, which makes the framework call the drawing function. In the drawing function, iterate over that list to draw the desired lines, as in the sketch below.
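
On Android this pattern maps directly onto GLSurfaceView: the touch handler only records the line and calls requestRender(), and every GL call stays inside the renderer. A sketch of the handler side, reusing the classes and fields from the posted code:

// Sketch only: the touch handler does bookkeeping and schedules a redraw;
// it issues no GL calls itself.
@Override public boolean onTouchEvent( MotionEvent ev )
{
    if ( ev.getAction(  ) == MotionEvent.ACTION_UP ) {
        LinePoints lp = new LinePoints( firstX, firstY,
                                        secondX, secondY,
                                        midX, midY );
        synchronized ( lineArray ) {
            lineArray.add( lp );        // record the primitive, nothing more
        }
        requestRender(  );              // ask the renderer thread to redraw everything
    }
    return true;
}

Pairing this with setRenderMode( GLSurfaceView.RENDERMODE_WHEN_DIRTY ) instead of RENDERMODE_CONTINUOUSLY also avoids redrawing frames in which nothing has changed.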

Redrawing the whole scene is canonical in OpenGL!


[1] (geesh, I'm getting tired of having to write this every 3rd question or so…)
