
I'm looking for an algorithm that compares two RGB colors and generates a value of their similarity (where similarity means "similar with respect to average human perception").

Any ideas?

EDIT:

Since I cannot answer anymore, I decided to put my "solution" as an edit to the question.

I decided to go with a (very) small subset of true color in my app, so that I can handle the comparison of colors on my own. I work with about 30 colors and use hard-coded distances between them.

Since it was an iPhone app, I worked in Objective-C, and the implementation is more or less a matrix representing the table below, which shows the distances between the colors.

[Image: table of hard-coded distances between the colors]
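As a sketch of that approach in C: the palette entries and distance values below are invented for illustration, not the actual ~30-color palette or distances from the app.

```c
/* Illustrative subset of a fixed palette; indices into the distance table. */
enum { COL_BLACK, COL_WHITE, COL_RED, COL_LIGHT_GREY, NUM_COLORS };

/* Symmetric matrix of hand-tuned perceptual distances (0 = identical).
   All values here are made up for illustration. */
static const int distance_table[NUM_COLORS][NUM_COLORS] = {
    /* black  white  red  lgrey */
    {    0,    100,   60,   80 },  /* black */
    {  100,      0,   70,   20 },  /* white */
    {   60,     70,    0,   65 },  /* red   */
    {   80,     20,   65,    0 },  /* lgrey */
};

/* Comparing two palette colors is then a constant-time lookup. */
int color_distance(int a, int b)
{
    return distance_table[a][b];
}
```

The point of the table is that the distances can encode perceptual judgments directly, e.g. light grey is deliberately placed much closer to white than red is.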

Peter O.
Kai Huppmann
  • @Kai: I am trying to implement the same thing. Did you go with the YUV approach or did you choose another kind of color space and space distance? – Thariama Sep 18 '12 at 14:18
  • @Thariama I decided to go with a (very) small subset of true color in my app, so that I can handle the comparison of colors on my own. I work with about 50 colors and use hard-coded distances between them. However, from all I read, tried, and tested, YUV did the best job when using 2^24 colors. – Kai Huppmann Sep 18 '12 at 15:15
  • @Kai: thanks very much for letting me know about your decision and its reasons. That means you are using RGB, creating a histogram with 50 colors, and speeding up your algorithm using predefined distances, correct? What language did you use to implement your algorithm? – Thariama Sep 19 '12 at 06:43
  • @Thariama Put it as an edit in the original question. – Kai Huppmann Sep 19 '12 at 08:45
  • @Kai: thanks very much, too bad there is not a second "+1"-button! – Thariama Sep 19 '12 at 09:11
  • @Kai: to make your matrix work you will need to transform each of the colors of the source image to one of the 27 colors you use. How do you do that? – Thariama Sep 19 '12 at 12:35
  • How do you know I deal with images? The question's just about comparing colors. However, yes, I somehow compare images to what the user drew with the colors... But I'm not able to compare arbitrary images. I have a finite set of images in a database, and all of those also have representations in the DB which are drawn with the app; the comparison algorithm does not compare the image itself to the user's input, but its simple representation. – Kai Huppmann Sep 19 '12 at 12:48
  • Similarity of colors is in most cases used to compare colors or an image with another image; that's why I guessed you were dealing with images. So you are not taking an image and adjusting its colors to your color palette of 27 colors? – Thariama Sep 20 '12 at 07:56
  • No, I don't. And I think, if you want to, you end up with the same problem you started with: which color of the subset is the best match for a 24-bit color used in an image?... – Kai Huppmann Sep 20 '12 at 08:50
  • There is another related question that has a really nice answer [here](http://stackoverflow.com/a/7507452/340947) – Steven Lu Aug 05 '14 at 10:25
  • To correct my comment above: 90/90/90 is closer to black than 155/0/0. – Kai Huppmann Dec 15 '14 at 10:28
  • A tip here: http://hanzratech.in/2015/01/16/color-difference-between-2-colors-using-python.html . The code relies on delta_e_cie2000 from colormath.color_diff. The calculation is quite slow in my tests, but it is probably the best approach. – Larytet Sep 16 '19 at 13:18
  • @Larytet Thx! Looks interesting (even though I don't think very much about this topic anymore ;)) – Kai Huppmann Sep 16 '19 at 13:48

6 Answers


RGB distance in Euclidean space does not correspond very well to "average human perception".

You can use the YUV color space, which takes this factor into account:

 |  Y' |     |  0.299     0.587    0.114   | | R |
 |  U  |  =  | -0.14713  -0.28886  0.436   | | G |
 |  V  |     |  0.615    -0.51499 -0.10001 | | B |

You can also use the CIE color space for this purpose.

EDIT:

I should mention that the YUV color space is an inexpensive approximation that can be computed via simple formulas. However, it is not perceptually uniform. "Perceptually uniform" means that a change of the same amount in a color value should produce a change of about the same visual importance. If you need a more precise and rigorous metric, you should definitely consider the CIELAB color space or another perceptually uniform space (even if there are no simple formulas for conversion).

ashleedawg
Ghassen Hamrouni

I would recommend using CIE94 (DeltaE-1994); it's said to be a decent representation of human color perception. I've used it quite a bit in my computer-vision-related applications, and I am rather happy with the results.

It is, however, rather computationally expensive to perform such a comparison:

  1. RGB to XYZ for both colors
  2. XYZ to LAB for both colors
  3. Diff = DeltaE94(LABColor1,LABColor2)

Formulas (pseudocode): [images of the formulas, not reproduced here]

b0bz

There's an excellent write up on the subject of colour distances here: http://www.compuphase.com/cmetric.htm

In case that resource disappears, the author's conclusion is that the best low-cost approximation to the distance between two RGB colours can be achieved using this formula (in C code).

#include <math.h>   /* for sqrt */

typedef struct {
   unsigned char r, g, b;
} RGB;

double ColourDistance(RGB e1, RGB e2)
{
  long rmean = ( (long)e1.r + (long)e2.r ) / 2;
  long r = (long)e1.r - (long)e2.r;
  long g = (long)e1.g - (long)e2.g;
  long b = (long)e1.b - (long)e2.b;
  /* "redmean": the red and blue weights vary with the mean red level */
  return sqrt((((512+rmean)*r*r)>>8) + 4*g*g + (((767-rmean)*b*b)>>8));
}
Stephan
strttn

Human perception is weaker in chroma than in intensity.

For example, in commercial video, the YCbCr/YPbPr color spaces (also called Y'UV) reduce the resolution of the chroma information but preserve the luma (Y). Chroma-subsampling schemes in digital video compression, such as 4:2:0 and 4:2:2, reduce the chroma bitrate because of this relatively weaker perception.

I believe that you can calculate a distance function that gives higher priority to luma (Y) and lower priority to chroma.

Also, under low intensity, human vision is practically black-and-white. Therefore, the priority function is non-linear: for low luma (Y), you put less and less weight on chroma.

More scientific formulas: http://en.wikipedia.org/wiki/Color_difference

Stephen Chung

Color perception is not Euclidean. Any distance formula will be both good enough and terrible at the same time. Any measure based on Euclidean distance (RGB, HSV, Luv, Lab, ...) will be good enough for similar colors, showing that aqua is close to teal. But for non-close values it becomes arbitrary. For instance, is red closer to green or to blue?

From Charles Poynton's Color FAQ:

The XYZ and RGB systems are far from exhibiting perceptual uniformity. Finding a transformation of XYZ into a reasonably perceptually-uniform space consumed a decade or more at the CIE and in the end no single system could be agreed.

xan
  • Thank you. And it's a great, interesting link. For my purpose it's not that important to tell whether red is closer to green or blue, but that a light grey is closer to white than a light red, and I hope (but am not sure yet) that YUV will manage it. – Kai Huppmann Mar 23 '11 at 06:43

Color similarity in the RGB cube is measured by the Euclidean distance (use Pythagoras' formula).

EDIT: On a second thought, this should be true for most other color spaces too.

Damon
  • No, the Euclidean distance in the RGB space does not correspond to the way the human eye perceives differences between colors. This is the entire reason that color spaces like Lab were created. – Bill Oct 11 '11 at 02:13
  • No. Euclidean distance is one way to measure distance in any Cartesian space; it measures distance, not similarity! Now you can either pick a different vector space (like CIE or YUV) where Euclidean distance and similarity sort of coincide, or you use a different measure. But RGB + Euclidean doesn't give satisfactory results. – kritzikratzi Nov 13 '13 at 14:48
  • I thought this too, but then open up a drawing program with a limited number of colors (Adobe Flash, for example, with its 216-color default palette), put that formula to the test, and you get disappointed very quickly, getting yellows when you clearly need brown, etc. – Dmytro Jun 21 '18 at 04:00