I'm writing a fragment shader that renders a 1D texture containing an arbitrary byte array as a kind of barcode. My idea is to encode each byte into a square divided diagonally, so each of the 4 triangles represents 2 bits, like so:
_____
|\ A /| each byte encoded as binary is DDCCBBAA,
| \ / | the colors are: Red if 11
|D X B| Green if 10
| / \ | Blue if 01
|/ C \| Black if 00
¯¯¯¯¯ so color can be calculated as: [(H & L), (H & !L), (!H & L)]
so for example: 198 == 11 00 01 10 would be:
_____ DD CC BB AA
|\ G /|
| \ / | A=10=Green
|R X B| B=01=Blue
| / \ | C=00=Black
|/ b \| D=11=Red
¯¯¯¯¯ (B=Blue, b=Black)
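In other words, the 2-bit pair for corner k (A=0 … D=3) is just (byte >> 2k) & 3. A quick sanity check against the example above (cornerBits is only a name I made up for illustration, it's not in my shader):
int cornerBits(int data, int corner){ // corner: A=0, B=1, C=2, D=3
    return (data >> (2*corner)) & 3;
}
// cornerBits(198,0) == 2 (10, Green)   cornerBits(198,1) == 1 (01, Blue)
// cornerBits(198,2) == 0 (00, Black)   cornerBits(198,3) == 3 (11, Red)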
What I have so far is a function that encodes two bools (H, L in the notation above) into a vec3 color, and a function that encodes a byte plus a "corner index" (A/B/C/D in the example) into that color:
#version 400
out vec4 fragColor; // the output fragment (gl_FragColor can't be redeclared in #version 400)
in vec2 vf_texcoord; // normalized texture coords, 0/0=top/left
uniform isampler1D uf_texture; // the input data
uniform int uf_texLen; // the input data's byte count
vec3 encodeColor(bool H, bool L){
    // [(H&&L), (H&&!L), (!H&&L)] -> Red=11, Green=10, Blue=01, Black=00
    return vec3(H&&L, H&&!L, !H&&L);
}
vec3 encodeByte(int data, int corner){
    // corner: A=0, B=1, C=2, D=3; pair k lives at bits 2k+1 (H) and 2k (L)
    int shiftL = 2*corner;
    int shiftH = shiftL+1;
    bool H = bool((data>>shiftH)&1);
    bool L = bool((data>>shiftL)&1);
    return encodeColor(H,L);
}
void main(void) {
    // the part I can't figure out
    fragColor.rgb = encodeByte(/* some stuff calculated by the part above */);
    fragColor.a = 1.0;
}
The problem is that I can't figure out how to calculate which byte to encode and which "corner" the current fragment is in.
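The closest I've got is the untested sketch below. It assumes the bytes are laid out left-to-right as uf_texLen equal squares (so vf_texcoord.x spans the whole strip), and that each fragment's triangle can be found by comparing |x| and |y| relative to the square's center; I'm not at all sure this is right:
void main(void) {
    float fx = vf_texcoord.x * float(uf_texLen);   // position measured in squares
    int byteIndex = min(int(fx), uf_texLen - 1);   // which byte's square we're in (clamped at x==1.0)
    vec2 p = vec2(fract(fx), vf_texcoord.y) - 0.5; // offset from the square's center

    // the two diagonals split the square into 4 triangles; y grows downward,
    // so left/right wins when |x| > |y|, otherwise it's top/bottom
    int corner;
    if (abs(p.x) > abs(p.y))
        corner = (p.x > 0.0) ? 1 : 3; // B (right) or D (left)
    else
        corner = (p.y > 0.0) ? 2 : 0; // C (bottom) or A (top)

    int data = texelFetch(uf_texture, byteIndex, 0).r; // fetch the raw byte
    fragColor.rgb = encodeByte(data, corner);
    fragColor.a = 1.0;
}
Is comparing abs(p.x) against abs(p.y) actually a valid test for the four triangles, and is texelFetch the right way to read the byte out of an isampler1D?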