I want to use a base64-encoded PNG that I retrieve from a server as a texture in WebGL. To do this, I load the encoded PNG into an HTML Image object. For my application, the PNG data must be absolutely lossless, but the pixel values the shader reads differ between browsers (if I draw the Image into a canvas and use getImageData, the retrieved pixel values differ across browsers as well). There must be some filtering or compression of the pixel values happening, but I can't figure out how or why. Is anyone familiar with this problem?
Loading the image from the server:
var htmlImage = new Image();
htmlImage.src = BASE64_STRING_FROM_SERVER; // a data: URL containing the base64-encoded PNG
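The image is only used once it has finished decoding; the gating is roughly this (a sketch, with run() standing in as a placeholder for the code in the snippets below):

htmlImage.onload = function () {
    // Only touch the pixel data once the image has fully decoded
    run(); // placeholder for the texImage2D / canvas code below
};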
Uploading the image into the texture that the shader samples (ctx is the WebGL rendering context):
// Upload the decoded image into the currently bound texture as RGB bytes
ctx.texImage2D(ctx.TEXTURE_2D, 0, ctx.RGB, ctx.RGB, ctx.UNSIGNED_BYTE,
    htmlImage);
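For reference, here is a sketch of the texture setup around that call; the parameter choices shown (NEAREST filtering, no mipmaps) are assumptions, but they should rule out sampling interpolation as a source of changed values:

var texture = ctx.createTexture();
ctx.bindTexture(ctx.TEXTURE_2D, texture);
// NEAREST filtering, edge clamping, no mipmaps, so the shader
// should sample the raw texel values unchanged
ctx.texParameteri(ctx.TEXTURE_2D, ctx.TEXTURE_MIN_FILTER, ctx.NEAREST);
ctx.texParameteri(ctx.TEXTURE_2D, ctx.TEXTURE_MAG_FILTER, ctx.NEAREST);
ctx.texParameteri(ctx.TEXTURE_2D, ctx.TEXTURE_WRAP_S, ctx.CLAMP_TO_EDGE);
ctx.texParameteri(ctx.TEXTURE_2D, ctx.TEXTURE_WRAP_T, ctx.CLAMP_TO_EDGE);
// ...followed by the texImage2D call above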
Trying to read the pixel values using a canvas (different values across browsers):
var canvas = document.createElement('canvas');
canvas.width = htmlImage.width;
canvas.height = htmlImage.height;
var context2d = canvas.getContext('2d');
context2d.drawImage(htmlImage, 0, 0, htmlImage.width, htmlImage.height);
// This data is different in, for example, the latest versions of Chrome and Firefox
var pixelData = context2d.getImageData(0, 0, htmlImage.width, htmlImage.height).data;
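And this is roughly how the values are read back on the WebGL side for comparison; a minimal sketch, assuming texture is the texture object the image was uploaded to:

// Attach the texture to a framebuffer so its texels can be read directly
var fb = ctx.createFramebuffer();
ctx.bindFramebuffer(ctx.FRAMEBUFFER, fb);
ctx.framebufferTexture2D(ctx.FRAMEBUFFER, ctx.COLOR_ATTACHMENT0,
    ctx.TEXTURE_2D, texture, 0);
if (ctx.checkFramebufferStatus(ctx.FRAMEBUFFER) === ctx.FRAMEBUFFER_COMPLETE) {
    var glPixels = new Uint8Array(htmlImage.width * htmlImage.height * 4);
    // RGBA/UNSIGNED_BYTE is the readPixels combination every implementation supports
    ctx.readPixels(0, 0, htmlImage.width, htmlImage.height,
        ctx.RGBA, ctx.UNSIGNED_BYTE, glPixels);
    // glPixels can now be compared against pixelData from getImageData above
}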