I've been doing my due diligence on contrast stretching, but have a special case.
Suppose an image has pixel values from 3 to 248, but you only want to stretch, say, the range 105 to 135 onto 10 to 255, zeroing out anything that would fall below 10. What would be the most efficient way to do this?
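To make the numbers concrete: the scale factor is (255 - 10) / (135 - 105) = 245/30 ≈ 8.17, so a pixel of 117 should map to (117 - 105) * 245/30 + 10 = 108, while anything below 105 should end up at 0.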
My current method, which works, is below. Is there a better way or a built-in function that I'm missing, in either OpenCV or NumPy?
I know OpenCV has a slightly different way of handling addition/subtraction/multiplication:
OpenCV image subtraction vs Numpy subtraction
... and was wondering if that would be more efficient.
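As I understand it, the practical difference there is that OpenCV saturates on uint8 data while NumPy wraps around; a quick illustration of what I mean:

import cv2
import numpy as np

x = np.uint8([10])
y = np.uint8([20])
print(cv2.subtract(x, y))  # OpenCV saturates: 10 - 20 -> 0
print(x - y)               # NumPy wraps around: 10 - 20 -> 246

My current code: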
import numpy as np

# raw_image is the uint8 input; its intensities span this range
raw_min = 3
raw_max = 248

# scale so that 105 --> 10 and 135 --> 255;
# anything that maps below 10 becomes 0, anything above 255 saturates at 255
min_x_zoom = 105
max_x_zoom = 135

# work in float so values below min_x_zoom go negative instead of
# wrapping around in uint8, and shift min_x_zoom down to zero
image_being_processed = np.subtract(raw_image.astype(np.float32), min_x_zoom)

# scale by the desired slope
image_being_processed = np.multiply(image_being_processed, (255.0 - 10.0) / (max_x_zoom - min_x_zoom))

# add 10, so the kept range becomes 10 to 255 (pixels below 105 are still under 10)
image_being_processed = np.add(image_being_processed, 10.0)

# zero out anything that landed below 10
image_being_processed = np.where(image_being_processed < 10, 0, image_being_processed)

# clip so nothing falls outside the range 0 to 255
image_being_processed = np.clip(image_being_processed, 0, 255)

# convert back to uint8 for display
image_being_processed = image_being_processed.astype(np.uint8)
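One alternative I've been wondering about (just a sketch, assuming raw_image really is uint8): since the input is 8-bit, the whole mapping could be precomputed once as a 256-entry table and applied with cv2.LUT, which would avoid the full-size intermediate float arrays.

import cv2
import numpy as np

min_x_zoom = 105
max_x_zoom = 135

# build the table once: below min_x_zoom -> 0,
# min_x_zoom..max_x_zoom -> 10..255 linearly, above max_x_zoom saturates at 255
levels = np.arange(256, dtype=np.float32)
table = (levels - min_x_zoom) * ((255.0 - 10.0) / (max_x_zoom - min_x_zoom)) + 10.0
table[levels < min_x_zoom] = 0
lut = np.clip(table, 0, 255).astype(np.uint8)

# apply the table to the 8-bit image in a single call
image_being_processed = cv2.LUT(raw_image, lut)

The appeal would be that the mapping arithmetic runs 256 times regardless of image size, but I don't know whether it's actually faster in practice.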