
I have the following GLSL shader that works on a Mac with an NVIDIA GeForce GT 330M, on a second Mac with an ATI Radeon HD 5750, and on an Ubuntu VM inside that second Mac, but not on an Ubuntu VM inside a Windows machine with a GeForce GTX 780 (all drivers up to date). The shader is pretty basic, so I'm looking for help figuring out what might be wrong! Here is the vertex shader (I'm using the cocos2d-x game engine, which is where all of the CC_* variables are defined):

// a_position and u_size are not CC_ built-ins, so they are declared here
attribute vec4 a_position;
uniform float u_size;

varying vec4 v_fragmentColor;
void main() {
    gl_Position = CC_PMatrix * CC_MVMatrix * a_position;
    // note: the "f" suffix on float literals is not valid in GLSL 1.20
    gl_PointSize = CC_MVMatrix[0][0] * u_size * 1.5;
    v_fragmentColor = vec4(1.0, 1.0, 1.0, 1.0);
}

And the fragment shader:

// CC_Texture0 is a sampler2D uniform injected by cocos2d-x
varying vec4 v_fragmentColor;

void main() {
    gl_FragColor = texture2D(CC_Texture0, gl_PointCoord) * v_fragmentColor; // Can't see anything
    // gl_FragColor = texture2D(CC_Texture0, gl_PointCoord); // Produces the texture as expected, no problems!
    // gl_FragColor = v_fragmentColor; // Produces a white box as expected, no problems!
}

As you can see, I'm getting very strange behavior: both the sampler, CC_Texture0, and the varying vec4, v_fragmentColor, seem to work properly on their own, but multiplying them causes problems. I'm reasonably confident everything else is set up correctly, since it works on the other systems, so it seems to be related to the graphics card or some undefined behavior I'm not aware of. Also, I'm using #version 120 (which was needed for gl_PointCoord). Thanks for any help!
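
For what it's worth, here is a minimal debug variant of the fragment shader I can use to narrow things down (assuming the same cocos2d-x-injected CC_Texture0). If the first assignment renders black, the texture's alpha is zero at those texels, so the multiply would produce a fully transparent color that the blend mode could be discarding:

varying vec4 v_fragmentColor;

void main() {
    vec4 tex = texture2D(CC_Texture0, gl_PointCoord);
    // Visualize the texture's alpha as grayscale; black here means the
    // product tex * v_fragmentColor would also have zero alpha
    gl_FragColor = vec4(vec3(tex.a), 1.0);
    // Alternatively, force alpha to 1.0 after the multiply:
    // gl_FragColor = vec4((tex * v_fragmentColor).rgb, 1.0);
}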

maxnelso
  • I would try computing the color into a temporary local variable first and then copying it to `gl_FragColor` (see the sketch below these comments). Some drivers have problems if you set `gl_FragColor` more than once. Of course you don't do that directly, but maybe the multiplication is handled as a separate operation for some reason (just a guess... with ATI/AMD you never know). Also, what is the info log returning (maybe some warning)? Have you tried the compatibility profile? (Some drivers have problems with old GL stuff.) Another cause may be a different default precision setting in the driver... Also, do you use the alpha channel? – Spektre Oct 30 '15 at 07:55
  • Hm, I tried your suggestions to no avail. Thanks for the help though! – maxnelso Oct 31 '15 at 00:58
  • What do you get from `glGetShaderInfoLog` after compiling each shader and after linking the whole program? (See [simple and complete GL+GLSL+VAO/VBO example in C++/VCL](http://stackoverflow.com/a/31913542/2521214) if you do not know how to use it.) – Spektre Oct 31 '15 at 07:49
  • When I looked into this, it turned out the framework I'm using already does that on shader compilation (and I have seen errors in the past when I've made mistakes). I also manually called `glGetShaderInfoLog` and didn't see anything. – maxnelso Oct 31 '15 at 21:05
  • If the result is empty, that means no errors or warnings... Well, the only thing I can think of is a driver issue; try installing a different version (even an older one can help sometimes). – Spektre Nov 01 '15 at 07:23
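
For reference, a sketch of the temp-variable workaround Spektre describes above: the same fragment shader, restructured so `gl_FragColor` is assigned exactly once from a local (no behavioral change intended):

varying vec4 v_fragmentColor;

void main() {
    // Build the color in a local temporary, then write gl_FragColor once
    vec4 color = texture2D(CC_Texture0, gl_PointCoord);
    color *= v_fragmentColor;
    gl_FragColor = color;
}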

0 Answers