I want to move the initialization of my 3D texture from the CPU to the GPU. As a test, I wrote a compute shader that sets all voxels to a constant value, but the texture is not modified at all. How do I make this work?

Compute shader:

#version 430

layout(local_size_x=1, local_size_y=1, local_size_z=1) in;
layout(r8, location = 0) uniform image3D volume;

void main()
{
  imageStore(volume, ivec3(gl_WorkGroupID), vec4(0));
}

Invocation:

glEnable(GL_TEXTURE_3D);
glActiveTexture(GL_TEXTURE0);
glGenTextures(1, &volume_tid);
glBindTexture(GL_TEXTURE_3D, volume_tid);
glTexImage3D(GL_TEXTURE_3D, 0, GL_R8, volume_dims[0], volume_dims[1], volume_dims[2], 0, GL_RED, GL_UNSIGNED_BYTE, voxels);

ShaderProgram computeVolumeShader;
computeVolumeShader.loadShader(GL_COMPUTE_SHADER, "compute_volume.glsl");
computeVolumeShader.link();
computeVolumeShader.use();
computeVolumeShader.uniform("volume", 0);
glBindImageTexture(0, volume_tid, 0, GL_FALSE, 0, GL_READ_WRITE, GL_R8);
glDispatchCompute(volume_dims[0], volume_dims[1], volume_dims[2]);
glBindImageTexture(0, 0, 0, GL_FALSE, 0, GL_READ_WRITE, GL_R8);
computeVolumeShader.unUse();
glMemoryBarrier(GL_ALL_BARRIER_BITS);

Note: voxels fed into glTexImage3D contains the CPU initialized data.

Andreas Haferburg
  • Have you tried using a real image format instead of `GL_RED`? – Nicol Bolas Jun 09 '13 at 23:32
  • @Nicol When I use `GL_R8UI`, I don't see anything anymore, even without the compute shader. I use `sampler3D` for rendering, does that make a difference? Since `GL_RED` is listed [here](http://www.opengl.org/sdk/docs/man/xhtml/glTexImage3D.xml) in table 1, I thought it's okay to use it? You're talking about the `internalFormat` parameter, right? – Andreas Haferburg Jun 09 '13 at 23:49
  • But you don't know what `GL_RED` will actually mean. It might give you `GL_R8`. It might give you `GL_R16`. When you don't pick a *specific* format, you give up the right to choose a format. And it stopped working when you used `GL_R8UI` because you probably didn't set your pixel transfer to [upload integral data instead of normalized floats](http://www.opengl.org/wiki/Pixel_Transfer#Integer_format). – Nicol Bolas Jun 10 '13 at 00:22
  • @Nicol Thank you for explaining. I changed it to unsigned normalized integers (updated code in the question), as I like having floats in the other shaders. Unfortunately, it still doesn't work. – Andreas Haferburg Jun 10 '13 at 19:21

1 Answer

Ugh. So apparently a 3D texture has to be bound as layered (`layered = GL_TRUE`). With `GL_FALSE`, the binding refers to only a single layer of the texture, so the shader can't modify any voxel at a uvw coordinate with w > 0.

glBindImageTexture(0, volume_tid, 0, /*layered=*/GL_TRUE, 0, GL_READ_WRITE, GL_R8);

does the trick.

Andreas Haferburg
  • I lost half a day to this. Thank you!!! I don't think I ever would have figured it out. – Timothy Wright May 28 '15 at 19:43
  • Just went on a crazy excursion through this myself. It was super confusing because it worked on an Nvidia GM107 but not on a GK110b (both on the latest driver). :-/ – FHoenig Aug 27 '15 at 04:01