iOS. OpenGL. Several views simultaneously

Problem Description

A couple of days ago I asked about filling the space between two curved lines with a gradient. Since then I've changed a lot in my code; now I'm using shaders. The result is almost what I need. BUT! Again I have a problem with using several OpenGL views simultaneously in one controller, while they must be absolutely independent of each other. So, thanks to the unbelievable tutorial from raywenderlich.com and the code example from bradley, I have the following situation:

  • two custom objects of the SoundWaveView.swift class (adapted from bradley's OpenGLView.swift);
  • vertex/fragment shaders for the OpenGL view objects;
  • a controller in which the two SoundWaveView objects are created and added (with the same context!!!):

override func viewDidLoad() {
    super.viewDidLoad()

    // One EAGLContext is created here and passed to BOTH views
    let context = EAGLContext(API: EAGLRenderingAPI.OpenGLES2)

    // First waveform view (grey background)
    let viewPos1: CGRect = CGRectMake(0, 150, UIScreen.mainScreen().bounds.size.width, 150)
    let view1 = SoundWaveView(frame: viewPos1, fileName: "Personal Jesus", fileExtention: "mp3", c: context)
    view1.backgroundColor = UIColor.grayColor()
    self.view.addSubview(view1)

    // Second waveform view (black background), sharing the same context
    let viewPos2: CGRect = CGRectMake(0, 350, UIScreen.mainScreen().bounds.size.width, 150)
    let view2 = SoundWaveView(frame: viewPos2, fileName: "Imagine", fileExtention: "mp3", c: context)
    view2.backgroundColor = UIColor.blackColor()
    self.view.addSubview(view2)
}

Every SoundWaveView object compiles its shaders as in the tutorials above:

    func compileShaders() {

        let vertexShader = compileShader("SimpleVertex", shaderType: GLenum(GL_VERTEX_SHADER))
        let fragmentShader = compileShader("SimpleFragment", shaderType: GLenum(GL_FRAGMENT_SHADER))

        let programHandle: GLuint = glCreateProgram()
        glAttachShader(programHandle, vertexShader)
        glAttachShader(programHandle, fragmentShader)
        glLinkProgram(programHandle)

        var linkSuccess: GLint = GLint()
        glGetProgramiv(programHandle, GLenum(GL_LINK_STATUS), &linkSuccess)
        if (linkSuccess == GL_FALSE) {
            var message = [CChar](count: 256, repeatedValue: CChar(0))
            var length = GLsizei(0)
            glGetProgramInfoLog(programHandle, 256, &length, &message)
            print("Failed to create shader program! : \( String(UTF8String: message) ) ")
            exit(1)
        }

        glUseProgram(programHandle)

        .  .  .  .  .
        shaderPeakId = glGetUniformLocation(programHandle, "peakId")
        shaderWaveAmp = glGetUniformLocation(programHandle, "waveAmp")

        glEnableVertexAttribArray(positionSlot)
        glEnableVertexAttribArray(colorSlot)

        glUniform1f(shaderSceneWidth!,  Float(self.frame.size.width))
        glUniform1f(shaderSceneHeight!, Float(self.frame.size.height))
        glUniform1i(shaderPeaksTotal!,  GLint(total))
        glUniform1i(shaderPeakId!,      GLint(position))
    }

To render the audio waves I need to pass values to the shader: A) the amplitude (I get it from AVAudioPlayer; that works perfectly, and for the two objects I create in the controller I get different amplitudes for the two different tracks) and B) the wave number (aka peakId/position).

So for the first created object (grey) the position equals 1, and this wave must be red and placed in the left part of the screen. For the second object (black) the position is 3, the wave is blue and placed in the right part. (Position 2 is for the center of the screen, not used for now.)

Every audio update triggers an OpenGL update:

var waveAmp :Float = 0

.    .    .    .    . 

func updateMeters(){

    player!.updateMeters()
    // Normalize the average channel power (in dB) to a 0...1 amplitude
    let normalizedValue = normalizedPowerLevelFromDecibels(player!.averagePowerForChannel(0))

    waveAmp = normalizedValue!
}
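
The normalizedPowerLevelFromDecibels helper isn't shown in the question. A minimal sketch of such a helper, assuming the usual mapping of AVAudioPlayer's metering range of roughly -60...0 dB onto 0...1 (this is an assumption, not the original implementation), could look like this:

func normalizedPowerLevelFromDecibels(decibels: Float) -> Float? {
    // Hypothetical helper: clamp anything below the floor to silence,
    // anything at or above 0 dB to full amplitude, and interpolate in between.
    let minDb: Float = -60.0
    if decibels.isNaN || decibels < minDb {
        return 0.0
    }
    if decibels >= 0.0 {
        return 1.0
    }
    return (decibels - minDb) / (0.0 - minDb)
}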

func render(displayLink: CADisplayLink?) {

    glBlendFunc(GLenum(GL_ONE), GLenum(GL_ONE_MINUS_SRC_ALPHA))
    glEnable(GLenum(GL_BLEND))

    //  glClearColor(0, 104.0/255.0, 55.0/255.0, 1.0)
    glClearColor(0, 0, 0, 0)
    glClear(GLenum(GL_COLOR_BUFFER_BIT) | GLenum(GL_DEPTH_BUFFER_BIT))
    glEnable(GLenum(GL_DEPTH_TEST))

    glViewport(0, 0, GLsizei(self.frame.size.width), GLsizei(self.frame.size.height))

    glBindBuffer(GLenum(GL_ARRAY_BUFFER), vertexBuffer)
    glBindBuffer(GLenum(GL_ELEMENT_ARRAY_BUFFER), indexBuffer)

    glVertexAttribPointer(positionSlot, 3, GLenum(GL_FLOAT), GLboolean(UInt8(GL_FALSE)), GLsizei(sizeof(Vertex)), nil)
    glVertexAttribPointer(colorSlot, 4, GLenum(GL_FLOAT), GLboolean(UInt8(GL_FALSE)), GLsizei(sizeof(Vertex)), UnsafePointer<Void>(bitPattern: (sizeof(Float) * 3)))

    glDrawElements(GLenum(GL_TRIANGLE_FAN), GLsizei(Indices.size()), GLenum(GL_UNSIGNED_BYTE), nil)

    glVertexAttribPointer(positionSlot, 3, GLenum(GL_FLOAT), GLboolean(UInt8(GL_FALSE)), GLsizei(sizeof(Vertex)), nil)
    glVertexAttribPointer(colorSlot, 4, GLenum(GL_FLOAT), GLboolean(UInt8(GL_FALSE)), GLsizei(sizeof(Vertex)), UnsafePointer<Void>(bitPattern: (sizeof(Float) * 3)))

    context.presentRenderbuffer(Int(GL_RENDERBUFFER))

    print("position=\(position)  waveAmp=\(waveAmp)")
    glUniform1i(shaderPeakId!,  GLint(position))
    glUniform1f(shaderWaveAmp!, waveAmp)
}

Log output from the print() call:

position=1  waveAmp=0.209313
position=3  waveAmp=0.47332
position=1  waveAmp=0.207556
position=3  waveAmp=0.446235
position=1  waveAmp=0.20769
position=3  waveAmp=0.446235
position=1  waveAmp=0.246206
position=3  waveAmp=0.430118

I expect to get two rectangles: one grey with a red wave on the left, and one black with a blue line on the right. BUT I have the following trouble! Rendering takes place in only one (the last) view and alternates between the two states:

Please, anyone! How do I use two GL views simultaneously? Or do I maybe need to use two shaders in one view?

I want to finish this audio wave task and share it with everyone. But I still don't understand this problem.

Answer

Looking at the code for the SoundWaveView class you posted, it looks like that view handles initializing the context, the framebuffer for that context, and more for you. Since you pass the same EAGLContext to both views, the same context gets initialized and then most likely re-initialized by the second view. That doesn't seem right to me. It might actually work fine if you simply create a separate context for the other view; the GPU will just context-switch when rendering between the two views and treat them as if they were separate apps.
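
As a minimal sketch of that first suggestion (assuming the SoundWaveView initializer shown in the question), the controller would simply hand each view its own context:

override func viewDidLoad() {
    super.viewDidLoad()

    // One context per view, so each SoundWaveView owns its own framebuffer setup
    let context1 = EAGLContext(API: EAGLRenderingAPI.OpenGLES2)
    let context2 = EAGLContext(API: EAGLRenderingAPI.OpenGLES2)

    let viewPos1 = CGRectMake(0, 150, UIScreen.mainScreen().bounds.size.width, 150)
    let view1 = SoundWaveView(frame: viewPos1, fileName: "Personal Jesus", fileExtention: "mp3", c: context1)
    view1.backgroundColor = UIColor.grayColor()
    self.view.addSubview(view1)

    let viewPos2 = CGRectMake(0, 350, UIScreen.mainScreen().bounds.size.width, 150)
    let view2 = SoundWaveView(frame: viewPos2, fileName: "Imagine", fileExtention: "mp3", c: context2)
    view2.backgroundColor = UIColor.blackColor()
    self.view.addSubview(view2)
}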

But that's not how an application this simple should be designed. If you want two objects drawn on the screen at the same time, that certainly doesn't mean you need two separate views. You would instead build a system that takes the information from each waveform and converts it into the appropriate draw calls automatically; this would act as a mini draw engine for your application. With this approach you could add support for even cooler techniques and drawings... maybe more waveforms, different textures/colors for each of them, cool effects, etc.
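
A minimal sketch of that single-view idea, assuming one SoundWaveView-like view that keeps per-waveform state in a small struct and reuses the uniforms and buffers from the question (Waveform and renderWaveforms are illustrative names, not part of the original project):

// Hypothetical per-waveform state.
struct Waveform {
    var position: Int    // 1 = left, 2 = center, 3 = right (as in the question)
    var amplitude: Float // updated from AVAudioPlayer metering
}

// Inside a single OpenGL view: one context, one shader program, several waveforms.
func renderWaveforms(waveforms: [Waveform]) {
    glClearColor(0, 0, 0, 0)
    glClear(GLenum(GL_COLOR_BUFFER_BIT) | GLenum(GL_DEPTH_BUFFER_BIT))
    glViewport(0, 0, GLsizei(self.frame.size.width), GLsizei(self.frame.size.height))

    glBindBuffer(GLenum(GL_ARRAY_BUFFER), vertexBuffer)
    glBindBuffer(GLenum(GL_ELEMENT_ARRAY_BUFFER), indexBuffer)

    // One draw call per waveform: update the uniforms first, then draw.
    for wave in waveforms {
        glUniform1i(shaderPeakId!,  GLint(wave.position))
        glUniform1f(shaderWaveAmp!, wave.amplitude)
        glDrawElements(GLenum(GL_TRIANGLE_FAN), GLsizei(Indices.size()), GLenum(GL_UNSIGNED_BYTE), nil)
    }

    // Present once, after all waveforms have been drawn into the same renderbuffer.
    context.presentRenderbuffer(Int(GL_RENDERBUFFER))
}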
