Hello everyone,
Coming from the web dev world, I am currently writing some C code that converts an RGB value to an XYZ value, which can then be used from Node.js through N-API. The issue I am facing is related to floating-point calculation. Below is an explanation of my issue:
The C code below converts an RGB value to an XYZ value:
char * colorType = getStringValue(env, funcParams[2], SPACELen);
// m is either srgb or adobeRgb
Matrix m = getEnumFromStr(colorType);
Rgb * rgb = getRGBFromJSObj(env, funcParams[0]);
xyz = generateXyzFromRgb(rgb, m);
And I am using this JS snippet to call my library:
const rgb = {
r: 255,
g: 255,
b: 255
};
const xyz = lib.getXyzFromRgb(rgb, "srgb", 10000);
expect(xyz).to.be.deep.equal({
x: 0.9504,
y: 1,
z: 1.0888
});
If everything is right, the output should look like the one below:
{
x: 0.9504,
y: 1,
z: 1.0888
}
However, the output that I actually get is this one:
{
x: 0.9502,
y: 0.9997,
z: 1.0886
}
As you can see, the output is off in the last decimal places. However, this wrong output only happens on my local machine (macOS), and only when I run the conversion through the JS snippet.
Indeed, when I run the conversion directly through Xcode with the piece of code below, the output is correct
// the rgb and m variables hold the same values as in the C code above
xyz = generateXyzFromRgb(rgb, m);
Moreover, when I call the JS code on Travis, which also runs macOS, and on Ubuntu through Docker, the JS code outputs the right values as well.
Could this be related to the hardware, to the way I am compiling my libraries, or to something else?
Thank you in advance.