
After computing the XYZ gamut bounding mesh below from spectral samples/color matching functions, how does one scale the resulting volume for compatibility with popular color spaces such as sRGB? More specifically, the size and scale of the volume depend on the number of samples and the integral approximation method used to compute it. How, then, can one determine the right values to scale such volumes to match known color spaces like sRGB, Display P3, NTSC, PAL, etc.?

[Image: oddly scaled CIE XYZ color gamut bounding volume]

It seemed like fitting the whole volume so that Y ranges from [0, 1] would work, but it had several problems:

  1. When compared to a sub-volume generated by converting the sRGB color cube to XYZ space, the result protruded outside of the 'full gamut'.
     [Image: sRGB->XYZ volume protruding beyond the XYZ full gamut]
  2. Converting random XYZ values from the full gamut volume to sRGB and back, the final XYZ doesn't match the initial one.

Most (all?) standardized color spaces derive from CIE XYZ, so each must have some kind of function or transformation to and from the full XYZ gamut, or at least each must have some unique parameters for a general function.

How does one determine the correct function and its parameters?

Ben McKenneby
  • Maybe you can rephrase your question to make it clear what you are trying to achieve? In particular, I am puzzled by what you mean by "normalization". It seems you are referring to gamut mapping. In general, you cannot expect spectral samples to fit in sRGB. This might explain your observations. – Simon Thum May 17 '22 at 20:17
  • Thanks! You're right. Spectral samples don't fit within sRGB. Rather, sRGB should fit within the full gamut generated by spectral samples. By normalize, I mean, find the correct maximum values for x, y, and z so that the spectral -> XYZ gamut volume accurately encloses the sRGB gamut volume. – Ben McKenneby May 17 '22 at 20:22
  • @SimonThum, You're right. Normalize wasn't the right word. Maybe 'scale' captures the concept more accurately. – Ben McKenneby May 17 '22 at 20:30
  • @SimonThum I've added another image depicting one of the issues. – Ben McKenneby May 17 '22 at 22:55

2 Answers


Short answer

If I understand your question, what you are trying to accomplish is determining the sRGB gamut limits (boundary) relative to the XYZ space you have constructed.

Longer answer

I am assuming you are NOT trying to accomplish gamut mapping. This is non-trivial, and there are multiple methods (perceptual, absolute, relative, etc.). I'm going to set gamut mapping aside, and instead focus on determining how some arbitrary color space fits inside your XYZ volume.

First, to answer your granular questions:

After computing the XYZ gamut bounding mesh below from spectral samples, how does one scale the volume for compatibility with popular color spaces such as sRGB?

What spectral samples? From a spectrophotometer reading a test print under a given standard illuminant? Or where did they come from? A color matching experiment?

The math is a matter of integrating the spectral data to form the XYZ space, which you apparently have done. What illuminant (white point)?
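For reference, that integration step can be sketched as a plain Riemann sum. This is not your code, just a minimal illustration; `spectrum_to_xyz` and its argument names are hypothetical, and uniform wavelength spacing is assumed:

```python
import numpy as np

def spectrum_to_xyz(spd, cmf, wavelengths):
    """Approximate X = integral of S(lambda) * xbar(lambda) (and Y, Z likewise)
    with a Riemann sum.

    spd:         stimulus/illuminant power at each sampled wavelength
    cmf:         (N, 3) array of the xbar, ybar, zbar color matching functions
    wavelengths: uniformly spaced sample wavelengths in nm
    """
    dl = wavelengths[1] - wavelengths[0]            # uniform spacing assumed
    xyz = (spd[:, None] * cmf).sum(axis=0) * dl
    # Normalize so Y == 1 for the stimulus itself; without this step, the
    # scale of the result depends on the sample count, as the question notes.
    k = 1.0 / ((spd * cmf[:, 1]).sum() * dl)
    return k * xyz
```

With the normalization constant `k`, doubling the number of samples no longer changes the overall scale of the resulting volume.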

It seemed like fitting the whole volume so that Y ranges from [0, 1] would work, but it had several problems:

Whole volume of what? The sRGB space? How did you convert the sRGB data to XYZ? Or is this really the question you are asking?

What are the proper scaling constants?

They depend on the spectral data and the adapted white point for the spectral data. sRGB is D65. Most printing is done using D50.

Does each color space have its own ranges for x, y, and z values? How can I determine them?

YES.

Every color space has a different transformation matrix depending on the coordinates of the R G and B primaries. The primaries can be imaginary, such as in ProPhoto.
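As a sketch of how such a matrix falls out of the primaries and white point (following the construction documented at brucelindbloom.com; the function name is my own, and the sRGB chromaticities below are the published IEC 61966-2-1 values):

```python
import numpy as np

def rgb_to_xyz_matrix(xy_r, xy_g, xy_b, xy_white):
    """Build an RGB->XYZ matrix from primary and white chromaticities."""
    def xy_to_XYZ(xy):
        x, y = xy
        return np.array([x / y, 1.0, (1.0 - x - y) / y])  # Y normalized to 1
    M = np.column_stack([xy_to_XYZ(p) for p in (xy_r, xy_g, xy_b)])
    # Scale the columns so that RGB = (1, 1, 1) maps to the white point.
    S = np.linalg.solve(M, xy_to_XYZ(xy_white))
    return M * S

# sRGB primaries and D65 white point, per IEC 61966-2-1
M_srgb = rgb_to_xyz_matrix((0.64, 0.33), (0.30, 0.60),
                           (0.15, 0.06), (0.3127, 0.3290))
```

Running this reproduces the familiar sRGB matrix (second row approximately 0.2126, 0.7152, 0.0722), and swapping in different primaries/white gives the matrix for any other RGB space.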

Some Things

The math you are looking for can be found at brucelindbloom.com, and you might also want to check out Thomas Mansencal's Colour Science, a Python library that's the Swiss-army knife of color.

sRGB

XYZ is a linear light space, wherein Y = 0.2 to Y = 0.4 is a doubling of luminance.

sRGB is not a linear space; there is a gamma curve (tone response curve) on sRGB data, such that rgb(20,20,20) to rgb(40,40,40) is NOT a doubling of luminance.

The first thing that needs to be done is to linearize the sRGB color data.
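That linearization is the standard piecewise decoding curve from IEC 61966-2-1; a minimal sketch:

```python
import numpy as np

def srgb_to_linear(v):
    """Decode sRGB-encoded values in [0, 1] to linear light."""
    v = np.asarray(v, dtype=float)
    # Below the threshold the curve is linear; above it, a 2.4 power curve.
    return np.where(v <= 0.04045, v / 12.92,
                    ((v + 0.055) / 1.055) ** 2.4)
```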

Then take the linear RGB and run it through the appropriate matrix. If the XYZ data is relative to a different adapting white point, then you need to apply something like a Bradford transform to convert to the appropriate one for your XYZ space.
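The Bradford transform mentioned above can be sketched as follows (the cone-response matrix is the standard Bradford one as tabulated at brucelindbloom.com; the function name is my own):

```python
import numpy as np

# Bradford cone-response matrix
M_BRD = np.array([[ 0.8951,  0.2664, -0.1614],
                  [-0.7502,  1.7135,  0.0367],
                  [ 0.0389, -0.0685,  1.0296]])

def bradford_adapt(xyz, white_src, white_dst):
    """Chromatically adapt XYZ values from one white point to another."""
    rho_s = M_BRD @ np.asarray(white_src)   # source white in cone space
    rho_d = M_BRD @ np.asarray(white_dst)   # destination white in cone space
    M = np.linalg.inv(M_BRD) @ np.diag(rho_d / rho_s) @ M_BRD
    return M @ np.asarray(xyz)

D65 = np.array([0.95047, 1.0, 1.08883])
D50 = np.array([0.96422, 1.0, 0.82521])
```

By construction, adapting the source white itself lands exactly on the destination white, which is a handy sanity check.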

The Bruce Lindbloom site has some ready-to-go matrices for a couple of common situations.

The problem you are describing can be caused by either (or both) failing to linearize the sRGB data and/or not adapting the white point. And... possibly other factors.

If you can answer my questions regarding the source of the spectral data I can better assist.

Myndex
  • "... are [you] trying to accomplish is determining the sRGB gamut limits (boundary) relative to the XYZ space you have constructed?" More like trying to unify the gamut limits of various standard color spaces with an 'objective one' constructed with Wavelength -> XYZ samples. For example, given the illuminants and transfer functions associated with sRGB, Adobe RGB 1998, Display P3, etc, I can determine their limits by simply transforming a mesh representation of the RGB color cube into XYZ space point by point. However, these XYZ volumes clearly assume a differently scaled XYZ volume. – Ben McKenneby May 18 '22 at 01:10
  • "I am assuming you are NOT trying to accomplish gamut mapping." Sound assumption. – Ben McKenneby May 18 '22 at 01:11
  • "What spectral samples? From a spectrophotometer reading a test print under a given standard illuminant? Or where did they come from? A color matching experiment?" Excellent question. I appreciate your familiarity with the subject! I curated a blend of these: http://www.cvrl.org/database/data/cienewxyz/lin2012xyz2e_5_7sf.htm which I believe were derived from CIE 2006 color matching experiments. My curation involved swapping some of the values in that table with the extreme values from higher resolution tables from the same source. – Ben McKenneby May 18 '22 at 01:19
  • "The math is a matter of integrating the spectral data to form the XYZ space, which you apparently have done. What illuminant?" I used the square pulse method described at brucelindbloom.com, other stack overflow posts, and demonstrated here: https://github.com/mjhorvath/Datumizer-Wikipedia-Illustrations/tree/master/Color/cie_color_solids. My math didn't use an illuminant until I added a parameter for the version that produced the image in the self-answer below. Does that mean that I used the "equal energy white" described here: https://isometricland.net/blog/pov-ray-projects/ ? – Ben McKenneby May 18 '22 at 01:27
  • "Whole volume of what?" After computing the XYZ volume from λ -> XYZ samples, the scale of the resulting volume depends largely on the number of samples used to produce it; more samples means larger volume. Initially, I just normalized each x, y, and z component to [0, 1] but that's definitely not what real color spaces actually use. I initially thought that fitting the spectrally generated space so that its y values range from [0, 1] sounded sensible, but that wasn't correct, as demonstrated by the second visualization in the question. – Ben McKenneby May 18 '22 at 01:33
  • "Every color space has a different transformation matrix depending on the coordinates of the R G and B primaries. The primaries can be imaginary, such as in ProPhoto." It sounds like I should use the chromatic primaries as well as the illuminants to derive the scaling constants? Does that mean summing the three primaries together to find the maximum values of each X, Y, and Z? Or is it more subtle? – Ben McKenneby May 18 '22 at 01:39
  • @BenMcKenneby WOOF! This is not the usual Stack question, thank you for the answers. I need to review some of this before I continue, this is esoteric and literally not a question I've encountered before. I want to see if I can create the transform matrix ... or I should say, what you want to end up with is a 3x3 matrix to transit from RGB to XYZ for each colorspace. Since you made your own (non standard) XYZ from the 2006 data (cool BTW) I'm not certain you can use the primary coordinates of color spaces as plotted relative to the CIE 1931 space. I'll be back. – Myndex May 20 '22 at 01:32
  • I think you're right about trying to mix non-standard spectral samples. Surely CIE 2006 can't perfectly correspond to sRGB which predates it, however, without knowing the exact spectral sample data used to define each color space, how much better can I do? Are most color spaces based on CIE 1931? – Ben McKenneby May 20 '22 at 12:57
  • As @Myndex rightly points out, you cannot use any non-spectrally defined RGB colourspace with any observer other than the CIE 1931 2 Degree Standard Observer, which most of them are defined against. A notable exception is ITU-R BT.2020, which has pure laser-like primaries. – Kel Solaar May 20 '22 at 23:47
  • I think it's mostly working: https://colortree.net/gamut/viz.html?ws=sRGB&space=XYZ Any thoughts? – Ben McKenneby Jun 05 '22 at 05:06
  • @BenMcKenneby Dude, that's rad! Nice work... my one question or thought is, regarding normalizing the spaces such that the apparent size of each is roughly equivalent for some (arbitrary? perhaps user selected) coordinate. Probably non-trivial as some of the spaces are more non-linear than others, like LUV which multiplies the uv coords with 13L*.... but would be interesting to then compare meshes of one against another (just thinking out loud)... Very cool and responsive tool... – Myndex Jun 05 '22 at 16:54
  • Do you mean show the color spaces next to each other to emphasize their differences in scale? e.g. CYM next to L*u*v*? While debugging in Blender, I found that in order to fit the L* spaces onto the screen, the others only take up a couple of pixels; however, that definitely makes the point about their differences in scale. – Ben McKenneby Jun 05 '22 at 17:29
  • @BenMcKenneby I was thinking similar to the image in your post, with the mesh of one space superimposed on the colored contour of the other. And yes, the CIE spaces are big... but, if a specific color was chosen as a reference point in addition to white, then using that to scale the mesh, instead of scaling relative to white and black per se.... – Myndex Jun 05 '22 at 19:58
  • @Myndex sorry for the slow reply, I had to scratch my head a long time to realize that your suggestion went over it. Do you mean that you'd like to compare color space volumes by aligning them to each other through affine transformations defined by 2 or three specific reference colors? I can imagine forming a triangle out of 3 colors common to each space, then computing an affine transformation that maps one triangle to the other, then applying that transformation to the entire mesh in order to visualize how the spaces diverge from that perspective? Or did you mean something else? – Ben McKenneby Jun 09 '22 at 18:31
  • Hi @BenMcKenneby Yes, I meant exactly that, mostly... it seems like the best "red apples green apples" comparison. For instance, in LUV, the chroma magnitude makes the volume appear huge, but if you correct such that red #f00 occupies the same point in space, and then the vertical extent is constrained by black and white, we'd have a more reasonable comparison. Or, use white, red, blue... or black, green and magenta... etc... The complicated question is which axis to constrain absolutely, which axis is constrained in a range, and which axis has no constraints (if any, or in some combo...) – Myndex Jun 09 '22 at 21:41
  • @Myndex I've thought about this a bit, but all of my ideas suffer from occlusion issues. It's hard to imagine a way to visualize a comparison between two overlapping 3D volumes, but maybe by making both wireframes except for the triangle of intersection could tell us something. The triangle itself could use alpha transparency to communicate the similarity. I guess the corners would render more opaquely and internal areas that don't match could render more transparently. Would that be useful? – Ben McKenneby Jun 16 '22 at 14:47
  • I also edited the question. If you'd like to tack on a short answer about scaling to the illuminant, I'll make yours the accepted one. – Ben McKenneby Jun 16 '22 at 14:48

Further research and experimentation implied that the XYZ volume should be scaled such that { max(X), max(Y), max(Z) } equals the XYZ coordinates of the working space's illuminant. In the case of sRGB, that illuminant (also called the white point) is D65.
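Under that hypothesis, the rescaling is a simple per-axis scale. A sketch, where `volume_xyz` is a hypothetical (N, 3) point cloud of the spectrally generated gamut:

```python
import numpy as np

def scale_to_white(volume_xyz, white_xyz):
    """Rescale an XYZ point cloud so its per-axis maxima coincide with the
    working space's white point (e.g. D65 for sRGB)."""
    volume_xyz = np.asarray(volume_xyz, dtype=float)
    scale = np.asarray(white_xyz) / volume_xyz.max(axis=0)
    return volume_xyz * scale

# D65 white point in XYZ (Y normalized to 1)
D65 = [0.95047, 1.0, 1.08883]
```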

Results look convincing, but expert confirmation would still be appreciated.


Ben McKenneby
  • Heh, I started writing my answer before you posted this, and just saw this as I posted, LOL... still my first question is what is the source of the spectral data, and adapting white point for said data? – Myndex May 18 '22 at 00:39
  • @Myndex Thanks for engaging! I'll answer your questions in comments posted to your thoughtful reply, but some notes here: brucelindbloom.com inspired this whole undertaking. Although the volume comparison depicted in this answer looks plausible for sRGB D65 with the usual ICC V2 transfer function and Apple RGB with Gamma(1.8), volumes for other spaces like Adobe RGB still protrude through the full gamut. Direct illuminant scaling might not have solved the problem unless we expect Adobe and others to include colors outside of human perceptible ranges. – Ben McKenneby May 18 '22 at 00:59
  • Considering: http://brucelindbloom.com/index.html?WorkingSpaceInfo.html#GamutProjections It looks like the RGB -> XYZ gamuts do indeed protrude outside of the full gamut volumes, but that doesn't imply that I've scaled the spectrally constructed XYZ volume correctly. Any thoughts? – Ben McKenneby May 18 '22 at 01:56
  • Hi Ben, so you'll notice on that page you linked, the gamut projection for sRGB is well within the confines... but some like ProPhoto are not... and this is because ProPhoto uses imaginary primaries - primaries that are outside the spectral locus and therefore are impossible to create in the real world. – Myndex May 20 '22 at 00:57
  • Sorry for being late, but this sounds correct. Scaling to the target white point should be close enough to gamut mapping with a relative colorimetric intent, and is likely to yield sRGB-compatible output. However, X, Y and Z _are_ imaginary primaries, thus it's common to work with imaginary primaries. This was chosen by CIE because, back in the day, it was considered computationally advantageous to work with positive values only, which is not possible with real-world primaries. Depending on your intent, CIE RGB (i.e. real primaries) may be preferable. – Simon Thum May 20 '22 at 08:14