
The 3D Wavefront OBJ model I'm loading, to which I'm trying to apply some shading (fixed pipeline combined with shaders, for simplicity), is being displayed wrong.

Now there are some problems I think I can spot. The light issue: the light doesn't seem static. It appears to rotate as I rotate the object with transformations. I am using gluLookAt + gluPerspective to set my view matrix, and I've read that if you set the light position in OpenGL before applying gluLookAt, your lights should in theory remain static. That doesn't seem to be the case here.

Issue 2 is the way the model is being shaded. It's choppy instead of smooth, and I'm not even sure why that ring effect appears. I've loaded my mesh in ShaderMaker and applied my vertex + fragment shaders there as well, and everything draws perfectly. It seems like I'm doing something wrong in my drawing routine.

(screenshot: the model rendered with choppy, ring-like shading)

My Vertex shader:

#version 120

varying vec3 N;
varying vec3 v;
void main(void)  
{    
   v = vec3(gl_ModelViewMatrix * gl_Vertex);      
   N = normalize(gl_NormalMatrix * gl_Normal);
   gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;  
}

My Fragment shader:

#version 120

varying vec3 N;
varying vec3 v;
#define MAX_LIGHTS 2
void main (void)
{
    vec4 finalColour = vec4(0.0);   // must be initialised before +=

    vec4 amb  = vec4(0.2, 0.2, 0.2, 1.0);
    vec4 diff = vec4(1.0, 1.0, 1.0, 1.0);

    vec3 L = normalize(gl_LightSource[0].position.xyz - v);
    vec3 E = normalize(v);
    vec3 R = normalize(reflect(L, -N));

    //vec4 Iamb = gl_FrontLightProduct[i].ambient;
    vec4 Iamb = amb;

    //vec4 Idiff = gl_FrontLightProduct[i].diffuse * max(dot(N,L), 0.0);
    vec4 Idiff = diff * max(dot(N, L), 0.0);
    Idiff = clamp(Idiff, 0.0, 1.0);

    vec4 Ispec = gl_FrontLightProduct[0].specular
                 * pow(max(dot(R, E), 0.0), 0.0 * gl_FrontMaterial.shininess);
    Ispec = clamp(Ispec, 0.0, 1.0);

    finalColour += Iamb + Idiff;

    gl_FragColor = gl_FrontLightModelProduct.sceneColor + finalColour;
}

EDIT: Added my drawing routine.

[[self openGLContext] makeCurrentContext];
[sM useProgram];

glClearColor(0.0f, 0.0f, 0.0f, 1.0f);   // set the clear colour before clearing
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glEnable(GL_DEPTH_TEST);
glEnable(GL_LIGHTING);

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
float aspect = 1024.0f / 576.0f;
gluPerspective(15.0, aspect, 1.0, 15.0);

glMatrixMode(GL_MODELVIEW);
glLoadIdentity();

// Specified while the modelview matrix is identity, so the position is
// interpreted directly in eye space (i.e. fixed relative to the camera).
float lpos[4] = { 0.0f, 2.0f, 0.0f, 1.0f };
glLightfv(GL_LIGHT0, GL_POSITION, lpos);

gluLookAt(0.0, 0.0, 10.0,  0.0, 0.0, 0.0,  0.0, 1.0, 0.0);

glTranslatef(0.0f, 0.0f, 0.0f);
glScalef(0.5f, 0.5f, 0.5f);
glRotatef(self.rotValue, 1.0f, 1.0f, 0.0f);

glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_NORMAL_ARRAY);
NSLog(@"%i", self.W);

// wireframe toggle
glPolygonMode(GL_FRONT_AND_BACK, self.W ? GL_LINE : GL_FILL);

glColor3f(1.0f, 0.0f, 0.0f);

glBindBuffer(GL_ARRAY_BUFFER, vBo[0]);
glVertexPointer(3, GL_FLOAT, 0, 0);

glBindBuffer(GL_ARRAY_BUFFER, vBo[1]);
glNormalPointer(GL_FLOAT, 0, 0);

glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vBo[2]);
glDrawElements(GL_TRIANGLES, object.indicesCount, GL_UNSIGNED_INT, 0);

glBindBuffer(GL_ARRAY_BUFFER, 0);
glDisableClientState(GL_NORMAL_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);

GLenum err = glGetError();
if (err != GL_NO_ERROR)
    NSLog(@"glGetError(): %i", (int)err);

[[self openGLContext] flushBuffer];

My shaders are actually borrowed from an online source. I have written my own, but I wanted to make sure I was using a solid one from someone experienced. I don't intend to keep them once I can fix this problem.

  • I don't think this particular shader is "a solid one". You need to normalize the normals in the fragment shader, not in the vertex shader. Vertex normals are vector attributes of the vertices, so they're linearly interpolated across the triangles. And the result of this interpolation is not normal anymore. The shader code is messy, the variables have terrible names. I'm not sure what to do here. I'm guessing you want _your_ shader to work, why are you posting someone else's code? (Upvote for the gif btw) – Andreas Haferburg Jul 14 '13 at 12:15
  • The first thing I'd do is verify that the normals are correct. Do `N = gl_Normal;` in the vertex and `gl_FragColor = normalize(N)*0.5+0.5;` in the fragment shader to visualize the object space normals. Or use `gl_FragColor = normalize(+-N)`, since the mapping to RGB is more readable. – Andreas Haferburg Jul 14 '13 at 12:27
  • The thing is (as I mention above) that I tried the shaders in ShaderMaker and OpenGL Shader Builder (OS X), and they both seem to work fine. They produce perfectly smooth shading, so I went on with the assumption that the shaders were working correctly. I'll implement your suggestions when I get back to my workstation and report the results. About the messy part of the shaders: oh well, that's the result of me experimenting a bit with values. Apologies. – apoiat Jul 14 '13 at 12:42
  • Aha! Now I understand, my bad. Well I think your drawing code looks ok. The light's position is in eye space, so yes, it should not be affected by the view matrix or the object's rotation. – Andreas Haferburg Jul 14 '13 at 14:01
  • OK, so this is helpful to know. It's actually a GLSL debugging mechanism of which I was not aware. This is the result of vec4(normalize(N),1.0)*0.5+0.5 (http://oi42.tinypic.com/ouuse0.jpg). vec4(normalize(+N),1.0) / vec4(normalize(-N),1.0) output the same pattern in different colors. I'm not sure how to interpret the result, though. – apoiat Jul 14 '13 at 16:39
  • `normalize(N)` is easier to interpret, since (1,0,0) is simply red, (0,1,0) is green, (0,0,1) is blue. But you can only see half the normal range. You could use abs(), but then you can't see the sign anymore. So I usually use `normalize(N)` then `normalize(-N)`. But anyways, looks like the normals are the issue. It's probably going to look different in ShaderMaker. Where do you get them from? Could you post the code where you initialize `vBo[1]`? – Andreas Haferburg Jul 14 '13 at 17:04
  • http://pastebin.com/BdM0kMcv <- this is the part where I 'upload' the VBO in my code. I know for sure all the values in object.normalArray are correct, because I print them before this step and everything matches the data inside the OBJ model file. The array is a one-dimensional, tightly packed array: [0].x [1].y [2].z / [3].x [4].y [5].z etc. My OBJ file is pretty huge, so I can't upload it as text anywhere. But to be honest, if ShaderMaker can render it normally, the core data shouldn't be the issue here. – apoiat Jul 14 '13 at 18:08
  • After taking some time to capture the behavior in ShaderMaker, I am pretty sure it's a normals issue as well. Take a look here and notice how different the color axes are (http://www151.lunapic.com/do-not-link-here-use-hosting-instead/137382722632489?7346535898). It has to be something wrong with the normals. – apoiat Jul 14 '13 at 18:46

1 Answer


I think I've found the issue. I haven't implemented a solution yet, but I am pretty sure the problem lies there. This question is also very relevant to my problem:

Understanding normals indices with Wavefront Obj

The problem was in the way I was handling my core data (from the OBJ file). I was naive enough to believe that passing it directly to OpenGL (through VBOs) as I parsed it was going to work. Here's a dose of humour to accompany my mistake. I hope my link does not violate guideline rules. Special thanks to Andreas for helping me trace the problem. link

Update: A solution has been implemented and my models now render with correct shading.

I used the vertex and normal faces to rearrange the Wavefront data (vertices + normals) into new vectors. I also switched from glDrawElements to glDrawArrays, because glDrawElements does not support separate indices for the normal and UV vectors. Although this solution increases loading time and memory use (since duplicated vertices are not removed from the buffer), it works and meets my needs.

  • Oh, I have to remember that meme, classical misunderstanding. ;) – Christian Rau Jul 15 '13 at 11:45
  • The short answer is, as rioki mentions in the question I linked, that the OBJ format is optimized for storage, not for rendering. – apoiat Jul 15 '13 at 12:13
  • 1
    Ah, so OBJ uses a different set of indices for the vertex normals, but OpenGL doesn't. In OBJ, a face is defined as "f i0/j0/k0 i1/j1/k1 ... in/jn/kn" where i is the vertex index, j the texture coordinate index, k the normal index. Good to know. – Andreas Haferburg Jul 16 '13 at 07:06