I want to calculate the per-row minimum of a matrix of floats in GLSL in the browser, for a matrix of about 1,000 rows and 4,000 columns.
Building on previous answers (see this) I used a for loop. However, I would like to use a uniform for the loop's upper bound, which is not possible in WebGL GLSL ES 1.0: the row length is only known after the fragment shader is compiled, and I'd like to avoid messing with `#define`s.
So I found that this workaround - a fixed loop length combined with an if/break controlled by a uniform - works OK:
#define MAX_INT 65536
void main(void) {
    // getPoint, values and dimensions come from the linked answer
    int r = 40; // row length; hardcoded here for testing, a uniform in the real shader
    float ndx0 = floor(gl_FragCoord.y) * float(r);
    float m = getPoint(values, dimensions, ndx0).x; // seed with the row's first element
    for (int i = 1; i < MAX_INT; ++i) {
        if (i >= r) { break; } // test before the body so exactly r elements are read
        float a = getPoint(values, dimensions, ndx0 + float(i)).x;
        m = min(m, a); // running minimum (min, not max, per the goal above)
    }
    gl_FragColor = vec4(m);
}
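For completeness, this is a sketch of the same loop with the bound actually supplied as a uniform, which is what I am aiming for (the uniform name `r` and the `getPoint`/`values`/`dimensions` helpers are taken from the linked answer; only the loop's compile-time bound must stay a constant expression, the break condition may read a uniform):

```glsl
#define MAX_INT 65536

uniform int r; // row length, set from JavaScript with gl.uniform1i

void main(void) {
    float ndx0 = floor(gl_FragCoord.y) * float(r);
    float m = getPoint(values, dimensions, ndx0).x; // seed with the row's first element
    for (int i = 1; i < MAX_INT; ++i) {
        if (i >= r) { break; } // uniform-controlled exit inside a constant-bounded loop
        float a = getPoint(values, dimensions, ndx0 + float(i)).x;
        m = min(m, a);
    }
    gl_FragColor = vec4(m);
}
```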
Now the question: does this approach have significant drawbacks? Am I doing something weird here, or missing something?