Strange drawing issue in 3.3 OpenGL fixed

Started by kodia, September 21, 2011, 04:28:55

Previous topic - Next topic

kodia

 ??? I have a few questions about the newer OpenGL. First: are the raw screen coordinates unchanged? Meaning (1, 1) is top right, (-1, 1) is top left, etc.
Also, can you output coordinates WITHOUT a transform, to test out drawing primitives and to have a reference for testing transforms? Meaning, can I just send coordinates straight through the vertex shader and have it draw them, e.g. a triangle with vertices (-1, -1), (0, 0), (1, -1) to make it go from bottom left to middle to bottom right?

I can post the code I am having issues with if needed, but I mainly just need to know if what I am trying is possible. I have read many guides that don't do transforms and draw just fine, but mine doesn't always draw, and sometimes the shape changes even without me changing anything. I use a VAO with a VBO attached that holds the coordinates for 3 vertices, all between -1 and 1, and I draw with glDrawArrays. I have a basic vertex shader that just passes the coordinates into a vec4 without any transform, and a basic fragment shader that outputs a single solid color.

Also, why does it ALWAYS draw when I set the vertex attribute type to GL_INT (while still uploading float data), but with GL_FLOAT I get the inconsistent behavior? The triangle drawn is very tiny, so my best guess is that my viewport changes each run, and without a transform there isn't a way to make it consistent; but I need plain screen coordinates to experiment with transforms and understand them better. I have read tutorials, but those are mostly for versions 2.0 or earlier, while now the fixed pipeline is removed and you do all the transforms etc. yourself.
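For reference, in core GL the coordinates you write to gl_Position (after the perspective divide) are normalized device coordinates: (-1, -1) is the bottom-left of the viewport and (1, 1) the top-right, and glViewport maps them to window pixels. A quick sketch of that fixed mapping in plain Java (method and class names are mine, not GL's):

```java
public class NdcToWindow {
    // Maps normalized device coordinates (x, y in [-1, 1]) to window
    // coordinates for a viewport set with glViewport(vx, vy, width, height).
    static double[] toWindow(double xNdc, double yNdc,
                             int vx, int vy, int width, int height) {
        double xw = vx + (xNdc + 1.0) / 2.0 * width;
        double yw = vy + (yNdc + 1.0) / 2.0 * height;
        return new double[] { xw, yw };
    }

    public static void main(String[] args) {
        // The center of NDC lands in the center of an 800x600 viewport...
        double[] center = toWindow(0.0, 0.0, 0, 0, 800, 600);
        System.out.println(center[0] + " " + center[1]); // 400.0 300.0
        // ...and (-1, -1) lands at the viewport origin.
        double[] corner = toWindow(-1.0, -1.0, 0, 0, 800, 600);
        System.out.println(corner[0] + " " + corner[1]); // 0.0 0.0
    }
}
```

So untransformed coordinates do draw on screen; they are just interpreted relative to the current viewport, not in pixels.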

Sorry for the wall of text, and thanks ahead of time for any help.

kodia

Success!!!

Here is my entire code for others to reference. It creates a VAO, binds a VBO to it, and then draws a single triangle. It also creates a 3.3 OpenGL context and runs a fixed-timestep loop; I am not great at making variable ones, but that is something I will work on later. Happy to finally get OpenGL working ^_^

try 
    {
    int alpha = 8;
    int depth = 8;
    int stencil = 0;
    
    int glversion1 = 3;
    int glversion2 = 3;
    
    ContextAttribs contextatt = new ContextAttribs(glversion1, glversion2);
    // (for a strict 3.3 core profile you can append .withProfileCore(true))
    // PixelFormat(int alpha, int depth, int stencil)
    PixelFormat pixelform = new PixelFormat(alpha, depth, stencil);
    Display.setDisplayMode(new DisplayMode(800,600));
    Display.setTitle("OpenglTest");
    Display.create(pixelform, contextatt);
    glViewport(0,0,800,600);
    } 
    catch (LWJGLException e) 
    {
        System.exit(0);
    }
    glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);
    glDisable(GL_CULL_FACE);

    int programid;
    int vertshader;
    int fragshader;
    
    programid = glCreateProgram();
    vertshader = glCreateShader(GL_VERTEX_SHADER);
    fragshader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(vertshader, "#version 330 core \n in vec3 inPosition; \n void main() { gl_Position = vec4(inPosition, 1.0); }");
    glShaderSource(fragshader, "#version 330 core \n out vec4 fragcolor; \n void main() { fragcolor = vec4(0.0, 1.0, 0.0, 1.0); }");
    glCompileShader(vertshader);
    glCompileShader(fragshader);

    glAttachShader(programid, vertshader);
    glAttachShader(programid, fragshader);
    glBindAttribLocation(programid, 0, "inPosition");
    glLinkProgram(programid);
    glUseProgram(programid);

    int[] idVBO = new int[1];
    int[] idVAO = new int[1];
    float[] testvertex = new float[9];

    testvertex[0] = 0.3f;
    testvertex[1] = 0.5f;
    testvertex[2] = 0.0f;
    
    testvertex[3] = 0.8f;
    testvertex[4] = -0.5f;
    testvertex[5] = 0.0f;
    
    testvertex[6] = 0.2f;
    testvertex[7] = 0.5f;
    testvertex[8] = 1.0f;
    
    ByteBuffer bytevertex = ByteBuffer.allocateDirect(36);
    bytevertex.order(ByteOrder.nativeOrder());
    FloatBuffer vertexbuff = bytevertex.asFloatBuffer();
    
    vertexbuff.put(testvertex);
    vertexbuff.rewind();
    
    idVAO[0] = glGenVertexArrays();
    glBindVertexArray(idVAO[0]);
    
    idVBO[0] = glGenBuffers();
    glBindBuffer(GL_ARRAY_BUFFER, idVBO[0]);
    glBufferData(GL_ARRAY_BUFFER, vertexbuff, GL_STATIC_DRAW);
    glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);
    glEnableVertexAttribArray(0);

    int frames = 0;
    
    while (!Display.isCloseRequested()) 
    {
        glClear(GL_COLOR_BUFFER_BIT); // clear the previous frame before drawing
        glDrawArrays(GL_TRIANGLES, 0, 3);
        checkGLErrorRender();
        
        Display.update();
        frames++;
        System.out.println("Frame " + frames);
        try
        {
            // Object.wait() outside a synchronized block throws
            // IllegalMonitorStateException; Thread.sleep() is what was meant
            Thread.sleep(30, 333333);
        }
        catch (InterruptedException e)
        {
            System.exit(200);
        }
    }
    }
    
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindVertexArray(0);
    glUseProgram(0);
    glDetachShader(programid, vertshader);
    glDetachShader(programid, fragshader);
    glDeleteShader(vertshader);
    glDeleteShader(fragshader);
    glDeleteProgram(programid);
    glDeleteBuffers(idVBO[0]);
    glDeleteVertexArrays(idVAO[0]);
    Display.destroy();


My problem was that, for one, I was sometimes picking vertex coordinates that were collinear, which makes a degenerate (zero-area) triangle that obviously never appears; but also OpenGL expects vertex data in native byte order, and a direct ByteBuffer defaults to big-endian unless you call order(ByteOrder.nativeOrder()).
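To see why the order(ByteOrder.nativeOrder()) call matters: the JDK specifies that a newly allocated ByteBuffer is always big-endian regardless of platform, so on a little-endian machine the float bytes come out swapped unless you switch the order. A small stand-alone check (plain java.nio, no LWJGL needed):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class ByteOrderCheck {
    public static void main(String[] args) {
        // A fresh buffer is ALWAYS big-endian per the ByteBuffer spec,
        // even on little-endian hardware.
        ByteBuffer raw = ByteBuffer.allocateDirect(4);
        System.out.println(raw.order()); // BIG_ENDIAN

        // Write 1.0f big-endian, then reread the same 4 bytes little-endian:
        raw.putFloat(0, 1.0f);
        raw.order(ByteOrder.LITTLE_ENDIAN);
        System.out.println(raw.getFloat(0)); // a tiny denormal garbage value, not 1.0
    }
}
```

That byte-swapped garbage is exactly why the triangle came out tiny or misshapen: the GPU was reading reversed float bytes.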

Hope someone can benefit from my silly mistake ^_^

Chuck

You can replace this:
    ByteBuffer bytevertex = ByteBuffer.allocateDirect(36);
    bytevertex.order(ByteOrder.nativeOrder());
    FloatBuffer vertexbuff = bytevertex.asFloatBuffer();


With this:
    FloatBuffer vertexbuff = BufferUtils.createFloatBuffer(9);


Comes in awful handy.
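For anyone without LWJGL's BufferUtils on the classpath: it is (roughly) just those three lines rolled into one call. A minimal stand-in in plain java.nio (the class and method names here are my own, not LWJGL's):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class MyBufferUtils {
    // Roughly what LWJGL's BufferUtils.createFloatBuffer(size) does:
    // a direct, native-ordered buffer sized in floats (4 bytes each).
    public static FloatBuffer createFloatBuffer(int size) {
        return ByteBuffer.allocateDirect(size * 4)
                         .order(ByteOrder.nativeOrder())
                         .asFloatBuffer();
    }

    public static void main(String[] args) {
        FloatBuffer buf = createFloatBuffer(9);
        System.out.println(buf.capacity()); // 9
        System.out.println(buf.isDirect()); // true
    }
}
```

Because the byte order is baked in at creation, you can never forget the nativeOrder() step again.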