I'm probably missing something obvious, but I'm experimenting with gpu.js and getting some strange results. I just want to make sure I'm not doing something obviously stupid (which is likely).
I'm not sure whether this is an issue with what I'm doing, or with the way calculations are performed by gpu.js when it runs through WebGL.
I create a new GPU instance and a new kernel:
const gpu = new GPU();
const test = gpu.createKernel(function () {
  // 255 + 255*256 + 255*256^2 + 255*256^3 = 4294967295 (i.e. 2^32 - 1)
  return 255 +
    (255 * 256) +
    (255 * 256 * 256) +
    (255 * 256 * 256 * 256);
}).setOutput([1]);
const res = test();
This gives me a result of 4294967296 (contained in a Float32Array).
If I run the same calculation in the console, I get 4294967295.
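For what it's worth, rounding the expected value to single precision in plain JavaScript reproduces the GPU result. This is just a minimal sketch using Math.fround, on the assumption that the kernel's output passes through a 32-bit float somewhere:

// Expected value, computed in ordinary double precision (what the console shows).
const expected = 255 + (255 * 256) + (255 * 256 * 256) + (255 * 256 * 256 * 256);
console.log(expected);              // 4294967295 (2^32 - 1)

// The same value rounded to the nearest representable float32.
// Near 2^32, consecutive float32 values are 256 apart, so 2^32 - 1
// rounds up to 2^32.
console.log(Math.fround(expected)); // 4294967296

So the number I'm getting back looks exactly like 4294967295 after a round trip through single precision, which makes me suspect the WebGL float path rather than my arithmetic.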