
I've been writing shaders like this:

```glsl
#version 330 core
in vec2 tex_position;

uniform sampler2D texture_in;

void main(){
  gl_FragColor = texture(texture_in, vec2(tex_position.x, 1.0 - tex_position.y));
}
```

It compiles and runs fine on my machine, but my friend (using the same OpenGL/GLSL version on a different computer) tells me it fails because `gl_FragColor` has been removed. How can I force these shaders to use only standard GLSL 330 and nothing nonstandard/deprecated?
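For context, the stricter compiler is right: the core profile removed the `gl_FragColor` built-in, and a portable `#version 330 core` fragment shader declares its own output variable instead. A minimal sketch of the equivalent core-profile shader (the `frag_color` output name is my own choice; any name works):

```glsl
#version 330 core
in vec2 tex_position;

uniform sampler2D texture_in;

// User-declared output replaces the removed gl_FragColor built-in.
out vec4 frag_color;

void main(){
  frag_color = texture(texture_in, vec2(tex_position.x, 1.0 - tex_position.y));
}
```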

genpfault
bfops
  • I don't know any way to force the GLSL compiler to reject deprecated things, but it's probably worth checking the log (`glGetShaderInfoLog`) for any warnings. – GuyRT Jul 24 '14 at 19:04
  • You're doing the right thing. My understanding is that including `core` in the version directive should do exactly what you're looking for. It looks like the graphics vendor that provided the shader compiler for your machine is somewhat liberal with its error checking. Certainly try what @GuyRT suggested, and see if it at least gives you warnings. – Reto Koradi Jul 24 '14 at 19:14
  • If your driver actually doesn't give you any diagnostics at all, you might be better off developing on a GPU from another vendor... – fintelia Jul 24 '14 at 19:54
  • Are you creating a core or compatibility context? The OpenGL 4.3 (Compatibility) Specification mentions writing to `gl_FragColor` or user-defined outputs (with no mention of having to use `#version xxx compatibility`), whereas the OpenGL 4.3 (Core) Specification does not mention `gl_FragColor` at all. That said, I agree that the obvious reading of the GLSL spec suggests that including `core` should force errors. – GuyRT Jul 25 '14 at 00:06
  • AFAIK Mesa and OS X have the strictest core-profile restrictions, as the compatibility profile is not implemented there at all; you can try those. But, as said above, you are doing the right thing; it's a driver bug. Which GPU and driver is it? Also, as said above, you can enforce this more strictly by switching the whole GL context to the core profile. – sacygan Jul 30 '14 at 23:17
  • While there is no guaranteed solution, I use a combination of `cgc`, `glslang` (the reference compiler provided by Khronos) and Mesa's shader compiler. If at least one of them finds something, it is already suspicious. – keltar Feb 04 '15 at 03:32
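The log check GuyRT suggests can be sketched like this (assuming an existing GL context and an already-compiled shader object; the GL calls are standard, the helper name is my own):

```c
#include <stdio.h>
#include <stdlib.h>
#include <GL/gl.h>  /* or your loader's header, e.g. glad/glad.h */

/* Print any compile diagnostics for `shader`, even when compilation
 * nominally succeeded: lenient drivers often demote errors to warnings. */
void print_shader_log(GLuint shader)
{
    GLint status = GL_FALSE, log_len = 0;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
    glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &log_len);
    if (log_len > 1) {
        char *log = malloc((size_t)log_len);
        glGetShaderInfoLog(shader, log_len, NULL, log);
        fprintf(stderr, "shader %s:\n%s\n",
                status == GL_TRUE ? "compiled with warnings" : "failed to compile",
                log);
        free(log);
    }
}
```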

1 Answer


I think it is exactly what the others wrote in the comments: it is probably the wrong context attribute. Natively created contexts are usually in the Legacy (Compatibility) Profile. You must switch your context to the Core Profile (minimum version 3.2). Then you get a different version of GLSL (minimum version 1.50) and your compiler will reject these deprecated things.

If you are using OS X, then read my answer: OpenGL - ARB extension
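As a sketch of what "switching to the Core Profile" looks like with GLFW (my choice of windowing library for the example; any context-creation API with profile hints works the same way):

```c
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    /* Request an OpenGL 3.3 Core Profile context; a driver backing a
     * core context must reject gl_FragColor in #version 330 shaders. */
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    /* Required on OS X for contexts of version 3.2 and above. */
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);

    GLFWwindow *window = glfwCreateWindow(640, 480, "core profile", NULL, NULL);
    if (!window) {
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(window);

    /* ... load GL function pointers, compile shaders, render ... */

    glfwDestroyWindow(window);
    glfwTerminate();
    return 0;
}
```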

eSeverus