
Thanks for taking the time to read this. I am modifying my OpenGL application, which visualizes 3D atomistic models, to use GLSL shaders; it draws tetrahedra (and other polyhedra) with GL_TRIANGLES.

Note: in the examples below, the colors and alpha channels are identical.

Before I had:

void triangle (GLfloat * va, GLfloat * vb, GLfloat * vc)
{
   glVertex3fv (va);
   glVertex3fv (vb);
   glVertex3fv (vc);
}
void tetra (GLfloat ** xyz)
{
  // xyz contains the coordinates of the triangles,
  // i.e. the 4 vertices of the tetrahedron;
  // below, 'get_normal' computes the face normal
  glBegin (GL_TRIANGLES);
  glNormal3fv (get_normal(xyz[0], xyz[1], xyz[2]));
  triangle (xyz[0], xyz[1], xyz[2]);
  glNormal3fv (get_normal(xyz[1], xyz[2], xyz[3]));
  triangle (xyz[1], xyz[2], xyz[3]);
  glNormal3fv (get_normal(xyz[2], xyz[3], xyz[0]));
  triangle (xyz[2], xyz[3], xyz[0]);
  glNormal3fv (get_normal(xyz[0], xyz[3], xyz[1]));
  triangle (xyz[0], xyz[3], xyz[1]);
  glEnd ();
}

That displayed (the proper result):

[Screenshot: first version of the code using glNormal3fv + glVertex3fv]

Then I modified the code to use the following shaders:

#define GLSL(src) "#version 130\n" #src

const GLchar * vertex = GLSL(
  uniform mat4 viewMatrix;
  uniform mat4 projMatrix;
  in vec3 position;
  in float size;
  in vec4 color;
  out vec4 vert_color;
  void main()
  {
    vert_color = color;
    gl_Position = projMatrix * viewMatrix * vec4(position, 1.0);
  }
);

const GLchar * colors = GLSL(
  in vec4 vert_color;
  out vec4 vertex_color;
  void main()
  {
    vertex_color = vert_color;
  }
);
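With the fixed pipeline, `glNormal3fv` fed normals into OpenGL's built-in lighting; the shaders above never read a normal, so no shading is computed at all. A minimal sketch of a shader pair that adds a per-vertex `normal` attribute and a Lambertian diffuse term — `normalMatrix` (the inverse-transpose of the upper 3×3 of `viewMatrix`) and the headlight direction are assumptions, not part of the original code. If pasted into the `GLSL()` macro, drop the `#version` lines, since the macro prepends one:

```glsl
// Vertex shader -- sketch, not the original code.
#version 130
uniform mat4 viewMatrix;
uniform mat4 projMatrix;
uniform mat3 normalMatrix;   // assumed uniform: inverse-transpose of viewMatrix's upper 3x3
in vec3 position;
in vec3 normal;              // new attribute, fed per vertex like position/color
in vec4 color;
out vec4 vert_color;
out vec3 vert_normal;
void main()
{
  vert_color  = color;
  vert_normal = normalize(normalMatrix * normal);
  gl_Position = projMatrix * viewMatrix * vec4(position, 1.0);
}
```

```glsl
// Fragment shader -- simple Lambert diffuse with an assumed headlight.
#version 130
in vec4 vert_color;
in vec3 vert_normal;
out vec4 vertex_color;
void main()
{
  vec3 light_dir = normalize(vec3(0.0, 0.0, 1.0)); // assumed light along the view axis
  float diff = max(dot(normalize(vert_normal), light_dir), 0.0);
  vertex_color = vec4(vert_color.rgb * (0.2 + 0.8 * diff), vert_color.a);
}
```

This is only the shading side: the application still has to upload the normals as a vertex attribute and set the `normalMatrix` uniform each frame.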

And now I got:

[Screenshot: new version of the code using vertex and fragment shaders]

So obviously something is wrong with the colors/lighting, and I wish I could understand how to correct this. I think I may have to use geometry shaders, but I do not know where to find information on how/what to do. Thanks in advance for your help.

  • Maybe a trivial thing, but it seems You do not use the normals in Your shaders. You should define it (just like color or size) as an input for them. – Dori Feb 09 '17 at 12:47
  • Thanks for the answer. I tried to, but I do not know how to use it in the shader. Note that I do not know how to use the 'size' variable either; I wanted to use it to define point size or line width, but so far I cannot figure out how :-P – Sébastien Le Roux Feb 09 '17 at 12:56
  • And how do You actually do the draw call with the shaders? Generally one should bind the shaders to be used instead of fixed functionality as far as I remember. But still the vertex data have to be passed to GL (like in the glBegin/glEnd part). – Dori Feb 09 '17 at 13:03
  • "*`#define GLSL(src) "#version 130\n" #src`*" Really? In 2017? Has ***nobody*** in the OpenGL community heard of *raw string literals*? What is it with OpenGL users acting like C++ has never gotten any features since 2003? – Nicol Bolas Feb 09 '17 at 16:03
  • @NicolBolas are you? Because according to your profile you're an OpenGL user aswell :p – LJᛃ Feb 09 '17 at 16:47
  • As to the question: when you use shaders you have to do shading, duh? ;) – LJᛃ Feb 09 '17 at 16:49
  • see [OpenGL - vertex normals in OBJ](http://stackoverflow.com/a/31913542/2521214) for the new style OpenGl normal use and [How can I render an 'atmosphere' over a rendering of the Earth in Three.js?](http://stackoverflow.com/a/19659648/2521214) for the old style ... Also read this: [Explenation of working principle of openGL](http://stackoverflow.com/a/32047055/2521214) btw computing normals during rendering will be slow ... precompute once instead ... – Spektre Feb 09 '17 at 17:36
  • @Spektre thank you so much, this is exactly what I was looking for, really appreciate that you point me to right direction ! – Sébastien Le Roux Feb 10 '17 at 23:38

0 Answers