I'm trying to figure out a good method for comparing two images in terms of their color. One idea I had was to take the average color of each image and subtract one from the other to get a "color distance." Whichever two images have the smallest color distance would be considered a match. Does this seem like a viable option for identifying an image from a database of images?
Ideally I would like to use this to identify playing cards put through an image scanner.
For example, if I were to scan a real version of this card onto my computer, I would want to be able to compare that scan with all the images in my database and find the closest one.
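Roughly, the "average color" step I have in mind looks something like this (a minimal sketch in Python, assuming Pillow and NumPy are available; the function name is just for illustration):

```python
from PIL import Image
import numpy as np

def average_color(path):
    """Return the mean (R, G, B) of an image as a tuple of floats."""
    img = Image.open(path).convert("RGB")        # normalize to 3-channel RGB
    pixels = np.asarray(img, dtype=np.float64)   # shape: (height, width, 3)
    return tuple(pixels.reshape(-1, 3).mean(axis=0))
```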
Update: I forgot to mention the challenges involved in my specific problem:
- The scanned image of the card and the original image of the card are most likely going to be different sizes (in terms of width and height).
- I need to make this as efficient as possible. I plan on using it to scan and identify hundreds of cards at a time. I figured that finding (and storing) a single average color value for each image would be far more efficient than comparing the individual pixels of every image in the database (well over 10,000 images) against each scanned card that needs to be identified; see the sketch below. The reason I'm asking is to find out whether anyone has tried comparing average color values as a means of image recognition before. I have a feeling it might not work as I envision due to issues with both color precision and accuracy.
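To keep the per-card cost low, I was picturing precomputing one average per database image up front, something like this (reusing the average_color helper from the sketch above; the folder layout and file extensions are just assumptions):

```python
import os

def build_average_index(folder):
    """Precompute one average color per database image, keyed by filename."""
    index = {}
    for name in os.listdir(folder):
        if name.lower().endswith((".png", ".jpg", ".jpeg")):
            index[name] = average_color(os.path.join(folder, name))
    return index

# Done once for the ~10,000 database images; each scanned card then only
# needs one average-color computation plus a pass over the stored values.
```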
Update 2: Here's an example of what I was envisioning.
Image to be identified = A
Images in database = { D1, D2 }
average color of image A = avg(A) = #8ba489
average colors of images in database = { avg(D1), avg(D2) } = { #58727a, #8ba489 }
D2 matches image A because the distance between #8ba489 and #8ba489 (zero) is smaller than the distance between #8ba489 and #58727a.
Of course the test image would not be an exact match with any of those images because it would be scanned in; however, I'm trying to find the closest match.
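In code, the comparison from Update 2 would look something like this (treating each hex value as an (R, G, B) triple and using plain Euclidean distance; the helper name is my own, just for illustration):

```python
def color_distance(c1, c2):
    """Euclidean distance between two (R, G, B) triples."""
    return sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5

scanned = (0x8B, 0xA4, 0x89)                  # avg(A) = #8ba489
database = {"D1": (0x58, 0x72, 0x7A),         # #58727a
            "D2": (0x8B, 0xA4, 0x89)}         # #8ba489

best = min(database, key=lambda name: color_distance(scanned, database[name]))
print(best)  # "D2" -- distance 0, versus roughly 73 for D1
```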