
I am using OpenGL ES 2.0 and I am trying to pass an integer value to a vertex shader. My client code looks like this:

glEnableVertexAttribArray(3); // Bones
glVertexAttribPointer(3, 4, GL_UNSIGNED_SHORT, GL_FALSE, object->m_mesh->GetVertexSize(), (const void*)offset);

And the vertex shader code is:

struct Bone { mat4 transform; mat4 bindPose; }; // struct layout implied by the usage below

attribute vec4 vBones;
uniform Bone bones[64];

// inside main():
gl_Position = bones[int(vBones.x)].transform * bones[int(vBones.x)].bindPose * vec4(vPosition, 1.0) * vWeights.x;

If I compile the code as it is, all of vBones.xyzw become 0 and I get an unskinned mesh, because index 0 refers to an identity matrix.

If I change the client code to this:

glVertexAttribPointer(3, 4, GL_INT, GL_FALSE, object->m_mesh->GetVertexSize(), (const void*)offset);

The code runs without any error on Windows. However, when I compile it to WebGL via Emscripten, I get GL error 1282 (GL_INVALID_OPERATION).

So, briefly, can you give me an example of passing an int vertex attribute to GLSL?

Cihan
  • `GL_INT` is not in the list of valid parameters. Source: https://www.khronos.org/opengles/sdk/docs/man/xhtml/glVertexAttribPointer.xml – Richard Critten Aug 26 '16 at 18:30
  • Yes, but it works just fine on Windows :) – Cihan Aug 26 '16 at 18:34
  • Because you are most likely not running OpenGL ES on Windows. – Peter K Aug 26 '16 at 18:42
  • I am using SDL to set the OpenGL version. I don't know what SDL does, but I assure you I properly set it to OpenGL ES ver 2.0, because otherwise Emscripten fails. – Cihan Aug 26 '16 at 18:47
  • Even if it does work on Windows, it does not have to work anywhere else. `GL_INT` is not in the spec for this function, and anyone implementing OpenGL ES 2.0 drivers does not have to handle it. You need to pass an array of `GL_UNSIGNED_SHORT` or `GL_SHORT` into the function. – Richard Critten Aug 26 '16 at 18:54
  • I can assure you that setting SDL to ES 2.0 on Windows does not actually enforce true ES 2.0 restrictions. WebGL, on the other hand, DOES enforce ES 2.0 restrictions. If you want to run ES 2.0 on Windows, use ANGLE. Although honestly I don't know if it still enforces ES 2.0 restrictions, since it now supports up to ES 3.0. – gman Aug 26 '16 at 19:25

1 Answer


WebGL 1.0 and GLSL ES 1.00 do not support passing GL_INT attributes to vertex shaders.

From the OpenGL ES 2.0 spec, section 2.8:

Table 2.4 indicates the allowable values for size and type. For type the values BYTE, UNSIGNED_BYTE, SHORT, UNSIGNED_SHORT, FIXED, and FLOAT, indicate types byte, ubyte, short, ushort, fixed, and float, respectively.

Note that FIXED is not supported in WebGL.

Floats and ints are the same size (32 bits), and a float can represent integers up to 2^24 (about 16 million) with no loss of precision, so unless you need values larger than that you can just use floats.
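
As a minimal sketch of the float approach (the attribute location, stride, and offset mirror your snippet; the layout is an assumption about your mesh), store the bone indices as floats in the vertex buffer and declare the attribute as GL_FLOAT, so no integer conversion is involved at all:

// Hypothetical layout: 4 bone indices per vertex, already stored as 32-bit floats.
glEnableVertexAttribArray(3); // Bones
glVertexAttribPointer(3, 4, GL_FLOAT, GL_FALSE, object->m_mesh->GetVertexSize(), (const void*)offset);

The shader side stays unchanged: the attribute is a vec4 and you cast with int(vBones.x) when indexing.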

Otherwise: in your example you're indexing into bones, and you only have 64 bones. Why not just use UNSIGNED_BYTE or UNSIGNED_SHORT?
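
For example, a sketch using unsigned bytes (again, stride and offset are whatever your mesh actually uses); with normalized set to GL_FALSE the shader receives the indices as 0.0, 1.0, 2.0, ..., and one byte per index covers 0..255, plenty for a 64-entry array:

// 4 bone indices per vertex, stored as unsigned bytes.
glEnableVertexAttribArray(3); // Bones
glVertexAttribPointer(3, 4, GL_UNSIGNED_BYTE, GL_FALSE, object->m_mesh->GetVertexSize(), (const void*)offset);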

You might also want to consider storing your bone matrices in textures so you don't run out of uniforms, since many devices have far fewer vertex uniforms available than your PC.
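
Here is a rough sketch of that technique. It assumes float textures (OES_texture_float) and vertex texture fetch are available, and that each bone matrix is uploaded as 4 RGBA float texels in its own row of a 4-pixel-wide texture; the names boneMatrixTexture and numBones are made up for the example:

attribute vec4 vBones;
uniform sampler2D boneMatrixTexture; // 4 texels per row, one row per bone
uniform float numBones;              // texture height in rows

mat4 getBoneMatrix(float boneIndex) {
  float v = (boneIndex + 0.5) / numBones; // v coordinate of this bone's row center
  // The row's 4 texels are the matrix columns; 0.125, 0.375, 0.625, 0.875
  // are the texel centers of a 4-pixel-wide texture.
  return mat4(
      texture2D(boneMatrixTexture, vec2(0.125, v)),
      texture2D(boneMatrixTexture, vec2(0.375, v)),
      texture2D(boneMatrixTexture, vec2(0.625, v)),
      texture2D(boneMatrixTexture, vec2(0.875, v)));
}

You would then call getBoneMatrix(vBones.x) in place of bones[int(vBones.x)].transform and drop the uniform array (you could premultiply transform * bindPose on the CPU, or use two rows per bone).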

See: How do you do skinning in WebGL

gman