I am trying to implement palette-indexed textures in my game (while learning OpenGL) and have been somewhat successful; however, I noticed that there seems to be a rounding issue when calculating the color map row.
Essentially I have 3 parameters (how I bind them is sketched after this list):
- texture (contains palette indexes)
- color map (this is where my problem lies; there are 1700+ rows in the texture, with each row representing a different color map)
- palette
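For reference, this is roughly how I bind the three textures before drawing (a simplified sketch; the texture-unit numbers and variable names are placeholders):
batch.setShader(shader); // the custom shader shown below
batch.begin();
// SpriteBatch samples MyIndexTexture through unit 0; the lookup textures go on units 1 and 2
colorMap.bind(1);
colorTable.bind(2);
shader.setUniformi("ColorMap", 1);
shader.setUniformi("ColorTable", 2);
shader.setUniformi("MyIndexTexture", 0);
Gdx.gl.glActiveTexture(GL20.GL_TEXTURE0); // hand texture unit 0 back to the batch
// ... setColor(...) and draw(...) calls go here ...
batch.end();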
Vertex Shader
attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord0;
uniform mat4 u_projTrans;
varying vec2 v_texCoords;
varying float colormapIndex;
void main() {
    // Color map row, passed in through the batch color's red channel
    colormapIndex = a_color.r;
    v_texCoords = a_texCoord0;
    gl_Position = u_projTrans * a_position;
}
Fragment Shader
uniform sampler2D ColorTable; // 256 x 1 pixels
uniform sampler2D ColorMap;   // 256 x 1727 pixels (one row per color map)
uniform sampler2D MyIndexTexture;
varying vec2 v_texCoords;
varying float colormapIndex;
void main() {
    // Palette index for this fragment, read from the red channel of the index texture
    vec4 color = texture2D(MyIndexTexture, v_texCoords);
    // x = palette index, y = which color map row to use
    vec2 mappedIndex = vec2(color.r, colormapIndex);
    vec4 mapping = texture2D(ColorMap, mappedIndex);
    // The mapped value finally selects the actual color from the palette
    vec4 texel = texture2D(ColorTable, mapping.xy);
    gl_FragColor = texel;
}
I am calculating the color map row using (in my Java code):
float colormapIndex = (32 + 0.5f) / colorMap.getHeight();
which is then set as the batch color's red component (a_color.r in the vertex shader). In the example above, 32 is the color map row I need. However, I noticed that about 5 in 6 values don't show up, i.e., it takes an increment of at least that many before the colormap row actually changes.
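For context, this is roughly how the value ends up on the batch (a simplified sketch of my code, assuming a LibGDX SpriteBatch; variable names and coordinates are placeholders):
int row = 32; // the color map row I want for this sprite
float colormapIndex = (row + 0.5f) / colorMap.getHeight(); // 32.5 / 1727 ≈ 0.0188
batch.setColor(colormapIndex, 0f, 0f, 1f); // arrives in the shader as a_color.r
batch.draw(indexTexture, x, y);
If the batch quantizes vertex colors to 8 bits per channel, that would leave only 256 distinct values for 1727 rows, i.e., roughly 1727 / 256 ≈ 6.7 rows per representable step, which would match the granularity I'm seeing.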
Does anyone know of a better way to handle this? I think that either an increase in precision is needed, or I need some way to pass the color map row index itself rather than what I'm doing right now.
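For example, one idea I've been considering (an untested sketch; u_colormapRow is a hypothetical uniform I would add to the fragment shader) is to hand the shader the raw row number and let it do the division itself at full float precision:
// with the custom shader bound (batch.setShader(shader) and inside batch.begin()/end())
batch.flush(); // so sprites already batched keep their previous row
shader.setUniformf("u_colormapRow", 32f); // raw row index, not pre-divided
In the fragment shader, colormapIndex would then come from (u_colormapRow + 0.5) / 1727.0 instead of from a_color.r, although that means flushing the batch whenever the row changes between sprites.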