I'm trying to provide access to a 3-dimensional array of scalar data to a fragment shader, from a Python program using PyOpenGL.

In the fragment shader, I declare a 3D sampler uniform

uniform sampler3D vol;

and in the Python program I have the following code to set up a scalar 3D texture:

vol = np.random.rand(3, 3, 3).astype(np.float32)  # random scalars in [0, 1]
texture = glGenTextures(1)
glUniform1i(glGetUniformLocation(program, "vol"), 0)  # sampler reads from unit 0
glActiveTexture(GL_TEXTURE0 + 0)
glBindTexture(GL_TEXTURE_3D, texture)
glTexImage3D(GL_TEXTURE_3D, 0, GL_RED, 3, 3, 3, 0, GL_RED, GL_FLOAT, vol)
glEnable(GL_TEXTURE_3D)

However, no matter where I sample the texture, e.g.

color = texture(vol, vec3(0, 0, 0));

it appears that I always obtain black (0, 0, 0).

What am I doing wrong?

I know that the basic setup of my fragment shader works, i.e. if I write color = vec3(1, 0, 0) I get red pixels.

I also know that there are no OpenGL errors, because I'm running the program with the -glerror option, which is processed by glutInit() and leads to OpenGL errors being translated into Python exceptions.

A. Donda

2 Answers

That is because your GL_RED texture format is clamped to the range [0, 1]!

To remedy this, you need to use a non-clamped texture format or disable the clamping. Here are the glTexImage3D calls from two examples that work on my GL implementation:

glTexImage3D(GL_TEXTURE_3D, 0, GL_R16F, xs, ys, zs, 0, GL_RED, GL_FLOAT, dat);

glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA8, size, size, size, 0, GL_RGBA, GL_UNSIGNED_BYTE, pdata);

For scalar data I would use the first option. There are more formats that are not clamped; just try and see...
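
In the asker's PyOpenGL code, the floating-point variant would look roughly like this (a sketch reusing the variables from the question; GL_R32F would work the same way):

import numpy as np
from OpenGL.GL import *

vol = np.random.rand(3, 3, 3).astype(np.float32)  # scalar volume data
texture = glGenTextures(1)
glBindTexture(GL_TEXTURE_3D, texture)
# GL_R16F is a floating-point internal format, so the stored values
# are not normalized and not clamped to [0, 1]
glTexImage3D(GL_TEXTURE_3D, 0, GL_R16F, 3, 3, 3, 0, GL_RED, GL_FLOAT, vol)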

I have never used the clamp-disabling feature myself, but I saw this code somewhere while researching similar issues (I am not sure whether it works):

glClampColorARB(GL_CLAMP_VERTEX_COLOR_ARB, GL_FALSE);
glClampColorARB(GL_CLAMP_READ_COLOR_ARB, GL_FALSE);
glClampColorARB(GL_CLAMP_FRAGMENT_COLOR_ARB, GL_FALSE);

With that, you could theoretically use any texture format...

Also, I do not see you setting any texture parameters. I would expect something like this:

glBindTexture(GL_TEXTURE_3D, txrvol);
glPixelStorei(GL_UNPACK_ALIGNMENT, 4);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_NEAREST); // no interpolation between texels
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_NEAREST); // no mipmaps required
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);       // fixed-function only, irrelevant for shaders

to avoid interpolation messing up your data for non-exact texture coordinates...
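
The same setup carries over almost verbatim to the asker's PyOpenGL code (a sketch; texture is the texture id from the question, and the glTexEnvf line is fixed-function state that a shader-based pipeline can drop):

from OpenGL.GL import *

glBindTexture(GL_TEXTURE_3D, texture)
glPixelStorei(GL_UNPACK_ALIGNMENT, 4)
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE)
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE)
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE)
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_NEAREST)  # no interpolation
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_NEAREST)  # no mipmaps required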

Spektre
  • The possibility of disabling clamping is certainly interesting, but I don't see how that can be the problem: The data I pass are random values in the interval [0, 1], and the color components are specified in the same range... – A. Donda May 10 '19 at 15:47
  • @A.Donda have you verified that the generated texture is not just zeros? Also, did you check the GLSL compilation and link logs for errors? Did you try `GL_NEAREST` filtering on all 3 axes? All of these can cause wrong output ... – Spektre May 10 '19 at 16:06
  • I figured it out, see my own answer. The solution suggests to me that a texture is not the best way to get non-image data into a fragment shader. Can you suggest a better way? I've seen that multidimensional array uniforms are supported from OpenGL 4.3 on... – A. Donda May 11 '19 at 01:28

I figured out the problem: GL_TEXTURE_3D is mipmapped by default (the default minification filter uses mipmaps), but I only provided level 0, so the texture was incomplete, and sampling an incomplete texture returns black. The problem is solved by glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAX_LEVEL, 0), which tells OpenGL that no levels beyond 0 exist.
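In the setup code from the question, the fix is one extra call while the texture is bound (setting GL_TEXTURE_MIN_FILTER to a non-mipmap filter such as GL_NEAREST would also make the texture complete):

from OpenGL.GL import *

glBindTexture(GL_TEXTURE_3D, texture)
# Only mipmap level 0 is uploaded, so restrict sampling to that level;
# with the default mipmapping minification filter the texture would be
# incomplete and sample as black.
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAX_LEVEL, 0)
glTexImage3D(GL_TEXTURE_3D, 0, GL_RED, 3, 3, 3, 0, GL_RED, GL_FLOAT, vol)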

A. Donda
  • That is weird, as I also pass non-texture data this way and had no problems without `glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAX_LEVEL, 0)`. Maybe it's driver specific (I am using **nVidia** gfx cards), or even environment specific, as you are using a different language and additional libs (not just GL/GLSL), so they might change some settings on their own too. Btw, does the `-glerror` option handle just GL errors, or GLSL errors too? I am just curious, I do not use GLUT – Spektre May 11 '19 at 07:32
  • @Spektre, I have a GeForce GTX 1060 with official drivers. – Yes, it also reports GLSL compilation errors. Thanks for your help! – A. Donda May 11 '19 at 18:56