I have a set of data that is 256x256x18 bytes (256 in width, 256 in height, and every "pixel" is 18 bytes), and I want to render it as a normal 256x256 RGBA picture.
I know how to do this on the CPU. I have also learned how to use texture2D
to do some per-pixel work with normal RGBA pictures.
I wonder if this data can be used as a special "texture". If so, how do I store and sample it with OpenGL/GLSL?
Edit 2019-04-26: some details about how every fragment is rendered.

Here is the structure of the 18-byte "pixel":
struct Element {
    struct Layer layer1;
    struct Layer layer2;
    struct Layer layer3;
    struct Layer layer4;
    uint16 id2;
};

struct Layer {
    uint8  weight;
    uint16 id1;
    uint8  light; // this will not be used
};
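For reference, this is roughly how I decode one 18-byte element on the CPU today. This is only a sketch: I am assuming the fields are tightly packed in the order shown above and stored little-endian (4 bytes per layer, then the trailing uint16 id2).

```python
import struct

def decode_element(buf):
    """Decode one 18-byte element into (layers, id2).

    Assumes little-endian, tightly packed fields:
    4 x (uint8 weight, uint16 id1, uint8 light), then uint16 id2.
    """
    layers = []
    for i in range(4):
        weight, id1, light = struct.unpack_from("<BHB", buf, i * 4)
        layers.append({"weight": weight, "id1": id1, "light": light})
    (id2,) = struct.unpack_from("<H", buf, 16)
    return layers, id2
```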
Meanwhile, there is a color table of about 5,000 colors for id1,
and a table of about 30 colors for id2.
The render algorithm is something like this:
RGBATuple renderElement(Element e) {
    RGBATuple c = colorTable2[e.id2];
    c = mix(colorTable1[e.layer4.id1], c, e.layer4.weight);
    c = mix(colorTable1[e.layer3.id1], c, e.layer3.weight);
    c = mix(colorTable1[e.layer2.id1], c, e.layer2.weight);
    c = mix(colorTable1[e.layer1.id1], c, e.layer1.weight);
    return c;
}
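For clarity, here is a CPU reference of that algorithm in Python. Note two assumptions on my part: mix follows GLSL semantics (mix(x, y, a) = x*(1-a) + y*a), and the uint8 weight is normalized to [0, 1] by dividing by 255; the color tables here are just hypothetical stand-ins.

```python
def mix(x, y, a):
    # GLSL-style linear blend: x*(1-a) + y*a, componentwise
    return tuple(xi * (1.0 - a) + yi * a for xi, yi in zip(x, y))

def render_element(layers, id2, color_table1, color_table2):
    """CPU reference for renderElement.

    layers is [layer1, layer2, layer3, layer4]; each layer is a dict
    with "weight" (uint8) and "id1". Weight is assumed normalized
    by dividing by 255.
    """
    c = color_table2[id2]
    # Same order as the pseudocode: layer4 first, layer1 last.
    for layer in reversed(layers):
        c = mix(color_table1[layer["id1"]], c, layer["weight"] / 255.0)
    return c
```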
The data is read from a file or received over the network. All the Element values
form a picture (a 2D matrix) in row-major order: the first row holds elements 0 to 255, the second row holds elements 256 to 511, and so on.
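With that row-major layout, the byte offset of the element at (x, y) can be sketched as:

```python
def element_offset(x, y, width=256, element_size=18):
    # Row-major layout: row y starts at y*width elements,
    # and each element occupies 18 bytes.
    return (y * width + x) * element_size
```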