On a job for a customer, I am locating items within a grayscale scene that has nonuniform background illumination. Once the items are located, I need to do another search within each one for details. The items are easy enough to locate by masking with the output of a variance filter, and within the items, if the threshold is correct, the details are easy to locate as well. But the mean and contrast of these items vary substantially.
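
For concreteness, here is a minimal sketch of one common way to build such a variance filter, using SciPy; the window size and the `scene` / `var_threshold` names are illustrative, not the values from the actual job:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_variance(img, size=7):
    """Local variance over a size x size window, via E[x^2] - E[x]^2."""
    f = img.astype(np.float64)
    mean = uniform_filter(f, size=size)
    mean_of_sq = uniform_filter(f * f, size=size)
    return mean_of_sq - mean * mean

# Items are textured relative to the background, so high local variance
# flags them even when the illumination drifts:
# item_mask = local_variance(scene) > var_threshold
```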

I played around with threshold calculation for a while, and none of the techniques I implemented is perfect; but the one that turns out to be the simplest, as accurate as any other, and quite cheap, is to take the mean pixel value and add one standard deviation.
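
In NumPy terms the rule is just this (a sketch; `item_pixels` is assumed to hold the masked grayscale pixels of one located item):

```python
import numpy as np

def detail_mask(item_pixels):
    """Threshold an item's pixels at mean + one standard deviation."""
    p = np.asarray(item_pixels, dtype=np.float64)
    t = p.mean() + p.std()  # the mean + 1 sigma rule
    return p > t            # candidate detail pixels

# Because t comes from each item's own statistics, it tracks the item's
# mean and contrast automatically.
```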

My question is: is there some analytical way to defend this calculation other than "it works well"? I did sort of stumble onto this technique accidentally (only later did I find this answer), and using it seems arbitrary.

  • Before we go into this, have you investigated **adaptive thresholding** techniques (there is a sketch after these comments)? Essentially, each pixel's local neighbourhood is collected and thresholded individually, which sidesteps poor contrast and nonuniform illumination. However, if you want to defend what you have come up with... image processing is more or less trial and error. You can justify using the mean by saying that the majority of the pixels that belong to the object, and that matter, should fall within a tolerance around the mean; that tolerance is defined by the standard deviation. – rayryeng Apr 06 '15 at 19:59
  • Adding one standard deviation means that you believe roughly 68% of the pixels in your image fall within one standard deviation of the mean, so a mean + 1σ threshold cuts off all but roughly the brightest 16% (assuming approximately normal intensities; see the numerical check after these comments). If your objects' intensities fall into this category, then you're fine. – rayryeng Apr 06 '15 at 20:02
  • @rayryeng, thanks... the 68% concept in fact maps nicely onto what I'm doing. – Mike C Apr 07 '15 at 00:13
  • @rayryeng: as for adaptive thresholding, what I've written is replacing an earlier implementation that does use some sort of adaptive thresholding. My code scores about as well for accuracy but runs in 30% of the time, and execution time is important to the customer. – Mike C Apr 07 '15 at 00:15
  • Ah I understand. OK! Well if you're fine with the definition I gave above with the 68% stuff, then that's all you need. – rayryeng Apr 07 '15 at 01:02
  • Here's a Wikipedia page to back up what I said about the 68%: http://en.m.wikipedia.org/wiki/68%E2%80%9395%E2%80%9399.7_rule – rayryeng Apr 07 '15 at 01:05
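
For completeness, a minimal sketch of the adaptive thresholding rayryeng mentions, using OpenCV; the file name, block size, and offset constant are illustrative assumptions, not tuned values:

```python
import cv2

# "scene.png" is a hypothetical path; any single-channel 8-bit image works.
scene = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)

# Each pixel is compared against the mean of its own blockSize x blockSize
# neighbourhood, minus a constant C, so slowly varying illumination drops out.
binary = cv2.adaptiveThreshold(
    scene, 255,
    cv2.ADAPTIVE_THRESH_MEAN_C,  # local mean; ADAPTIVE_THRESH_GAUSSIAN_C is the alternative
    cv2.THRESH_BINARY,
    blockSize=31,                # odd neighbourhood size (illustrative)
    C=5,                         # offset subtracted from the local mean (illustrative)
)
```

The per-pixel neighbourhood work is also where the extra runtime goes, which is consistent with the timing difference reported in the comments above.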
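
And a quick numerical check of the 68% figure, and of what a mean + 1σ cut keeps, for an ideal normal distribution (using SciPy):

```python
from scipy.stats import norm

within_one_sigma = norm.cdf(1) - norm.cdf(-1)  # ~0.6827, the "68%" of the rule
below_mean_plus_sigma = norm.cdf(1)            # ~0.8413

print(f"within +/- 1 sigma of the mean: {within_one_sigma:.4f}")
print(f"below mean + 1 sigma:           {below_mean_plus_sigma:.4f}")
# For roughly normal intensities, a mean + 1 sigma threshold therefore
# keeps only the brightest ~16% of pixels.
```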

0 Answers