
I can send a color to the shader as 4 floats with no problem. However, I want to send it as an integer (or unsigned integer; it doesn't really matter, what matters is that it's 32 bits) and have it decomposed into a vec4 in the shader.

I'm using OpenTK as a C# wrapper for OpenGL (although it should be pretty much a direct wrapper).

Let's consider one of the simplest shaders, with a vertex containing position (xyz) and color (rgba).

Vertex shader:

#version 150 core

in vec3 in_position;
in vec4 in_color;
out vec4 pass_color;
uniform mat4 u_WorldViewProj;

void main()
{
    gl_Position = vec4(in_position, 1.0f) * u_WorldViewProj;
    pass_color = in_color;
}

Fragment shader:

#version 150 core

in vec4 pass_color;
out vec4 out_color;

void main()
{
    out_color = pass_color;
}

Let's create the vertex buffer:

public static int CreateVertexBufferColor(int attributeIndex, int[] rawData)
{
    var bufferIndex = GL.GenBuffer();
    GL.BindBuffer(BufferTarget.ArrayBuffer, bufferIndex);
    GL.BufferData(BufferTarget.ArrayBuffer, sizeof(int) * rawData.Length, rawData, BufferUsageHint.StaticDraw);
    GL.VertexAttribIPointer(attributeIndex, 4, VertexAttribIntegerType.UnsignedByte, 0, rawData);
    GL.EnableVertexAttribArray(attributeIndex);
    GL.BindBuffer(BufferTarget.ArrayBuffer, 0);
    return bufferIndex;
}

And I'm getting all zeros for the vec4 'in_color' in the vertex shader. Not sure what's wrong.

The closest thing I found is this thread: https://www.opengl.org/discussion_boards/showthread.php/198690-I-cannot-send-RGBA-color-as-unsigned-int

Also, in VertexAttribIPointer I'm passing 0 as the stride, because I keep each attribute in its own buffer within the vertex array, so the colors come tightly packed, 32 bits (one color) per vertex.
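
For clarity, the packed array is built roughly like this (a sketch; the channel order 0xAABBGGRR, i.e. R in the lowest byte, and the PackColor helper are assumptions for illustration):

// Hypothetical helper: packs one RGBA color into 32 bits,
// R in the lowest byte, A in the highest (0xAABBGGRR).
static int PackColor(byte r, byte g, byte b, byte a)
{
    return r | (g << 8) | (b << 16) | (a << 24);
}

// One packed color per vertex, tightly packed.
int[] rawData =
{
    PackColor(255, 0, 0, 255), // red
    PackColor(0, 255, 0, 255), // green
    PackColor(0, 0, 255, 255), // blue
};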

chainerlt

• This is GLSL and not HLSL. The type has to be `ivec4` for an integral datatype (see the sketch below these comments). – Rabbid76 Nov 05 '18 at 22:12
• I don't think that integer attributes are what you want. You want just UNORM, hence you use `in vec4` in conjunction with `glVertexAttribPointer`, not `glVertexAttribIPointer`. – derhass Nov 05 '18 at 22:31
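
For reference, here is a sketch of what Rabbid76's suggestion would look like: keep VertexAttribIPointer, but declare the attribute with an integral vector type in the shader (uvec4 rather than ivec4 is assumed here, since the data is unsigned; normalization then has to be done by hand):

#version 150 core

in vec3 in_position;
in uvec4 in_color; // integral attribute: four unsigned bytes arrive as integers, not floats
out vec4 pass_color;
uniform mat4 u_WorldViewProj;

void main()
{
    gl_Position = vec4(in_position, 1.0) * u_WorldViewProj;
    pass_color = vec4(in_color) / 255.0; // manual UNORM conversion
}

with the C# side passing an offset into the bound buffer rather than the managed array:

GL.VertexAttribIPointer(attributeIndex, 4, VertexAttribIntegerType.UnsignedByte, 0, IntPtr.Zero);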

2 Answers


You have to use VertexAttribPointer, not VertexAttribIPointer, when the input in your shader is a float type (vec4).

Set the normalized parameter to GL_TRUE.

Spec says:

For glVertexAttribPointer, if normalized is set to GL_TRUE, it indicates that values stored in an integer format are to be mapped to the range [-1,1] (for signed values) or [0,1] (for unsigned values) when they are accessed and converted to floating point. Otherwise, values will be converted to floats directly without normalization.
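
Applied to the question's code, that change might look like this (a sketch; note that with a buffer bound, the last parameter is an offset into that buffer, so IntPtr.Zero is assumed instead of the managed array):

// Four normalized unsigned bytes per vertex arrive as floats in [0,1]
// in the shader's 'in vec4 in_color'.
GL.VertexAttribPointer(attributeIndex, 4, VertexAttribPointerType.UnsignedByte, true, 0, IntPtr.Zero);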

Rhu Mage
• It looks logical to do that; however, this does not work: `GL.VertexAttribPointer(attributeIndex, 4, VertexAttribPointerType.UnsignedByte, true, 0, rawData);` (`rawData` is `int[]` where the integer contains all 32 bits for the 4 channels, RGBA or ARGB or whichever). I still get vec4(0,0,0,0) in the vertex shader, any ideas? – chainerlt Nov 07 '18 at 18:17
• If you are trying to pack all 4 channels into a single int, you need to set `GL.VertexAttribIPointer(attributeIndex, 1, VertexAttribIntegerType.UnsignedByte, 0, rawData);` and in your shader receive just a single uint instead of vec4. Then you can unpack from uint to vec4 with `vec4 unpackUnorm4x8(uint p);`. Also make sure it's uint and not int (a full sketch of this approach appears after these comments). – Rhu Mage Nov 07 '18 at 18:52
• Aight, so my shader looks like this: `#version 400 core in vec3 in_position; in uint in_color; out vec4 pass_color; uniform mat4 u_WorldViewProj; void main() { gl_Position = vec4(in_position, 1.0f) * u_WorldViewProj; pass_color = unpackUnorm4x8(in_color); }` and the vertex attributes: `GL.VertexAttribIPointer(attributeIndex, 1, VertexAttribIntegerType.UnsignedInt, 0, rawData);` You mentioned `VertexAttribIntegerType.UnsignedByte` where I suppose it should be `UnsignedInt`; however, neither of them worked anyway, I still get zeros in the shader. – chainerlt Nov 07 '18 at 21:32
• Yes, it should be UnsignedInt. Did you check if rawData actually contains any values? Are you sure attributeIndex is correct? You can try to set it explicitly with `GL.VertexAttribIPointer(1, 1, VertexAttribIntegerType.UnsignedInt, 0, rawData);` and in the shader `layout(location = 1) in uint in_color;`. And is rawData int or uint? It should be uint. – Rhu Mage Nov 08 '18 at 07:14
• Well, I'm still stuck; I simply cannot get it to work. However, if I construct only one vertex buffer where I place all attributes in a sequence with a stride (instead of building a separate buffer per attribute), that works just fine, e.g. having position as 3 floats and color as an int: `GL.VertexAttribPointer(Shader.in_position, 3, VertexAttribPointerType.Float, false, vertexSize, IntPtr.Zero);` + `GL.VertexAttribPointer(Shader.in_color, 4, VertexAttribPointerType.UnsignedByte, true, vertexSize, new IntPtr(12));` – chainerlt Nov 16 '18 at 00:54
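
For reference, a sketch of the single-uint approach discussed in these comments (it assumes the color data is uploaded as one uint per vertex; unpackUnorm4x8 requires GLSL 4.00 or GL_ARB_shading_language_packing):

#version 400 core

in vec3 in_position;
in uint in_color; // one packed 0xAABBGGRR value per vertex
out vec4 pass_color;
uniform mat4 u_WorldViewProj;

void main()
{
    gl_Position = vec4(in_position, 1.0) * u_WorldViewProj;
    pass_color = unpackUnorm4x8(in_color); // lowest byte -> x, so the result is (r, g, b, a)
}

with the attribute declared as a single unsigned integer on the C# side, using an offset into the bound buffer:

GL.VertexAttribIPointer(attributeIndex, 1, VertexAttribIntegerType.UnsignedInt, 0, IntPtr.Zero);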

Aight, so this did the job for me:

Have the managed data as an int[] that is a tightly packed array of only colors (the int format is RGBA, meaning 0xAABBGGRR, i.e. R in the lowest byte). Define the vertex attribute as: GL.VertexAttribPointer(index, 4, VertexAttribPointerType.UnsignedByte, true, sizeof(int), IntPtr.Zero). Then use it in the shader as: in vec4 in_color;.
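
Put together, a sketch of the full working buffer setup under that description (the shader keeps in vec4 in_color exactly as in the question):

public static int CreateVertexBufferColor(int attributeIndex, int[] rawData)
{
    var bufferIndex = GL.GenBuffer();
    GL.BindBuffer(BufferTarget.ArrayBuffer, bufferIndex);
    GL.BufferData(BufferTarget.ArrayBuffer, sizeof(int) * rawData.Length, rawData, BufferUsageHint.StaticDraw);
    // Four unsigned bytes per vertex, normalized to [0,1] floats.
    // IntPtr.Zero is an offset into the bound buffer, not a pointer to the managed array.
    GL.VertexAttribPointer(attributeIndex, 4, VertexAttribPointerType.UnsignedByte, true, sizeof(int), IntPtr.Zero);
    GL.EnableVertexAttribArray(attributeIndex);
    GL.BindBuffer(BufferTarget.ArrayBuffer, 0);
    return bufferIndex;
}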

chainerlt