
I have a depth image from an ifm 3D camera, which uses a time-of-flight technique to capture depth images. The camera comes with software that displays the image as shown below: Depth image of 12 randomly arranged tills

I was able to extract the depth data from the camera and have been trying to recreate their representation, but I've been unsuccessful. No matter how I normalize the data range or change the data type, I always end up with an image that is "darker" in the center and gets lighter toward the edges, and the color range doesn't match either. Here's the main code I tried:

import cv2
import numpy as np

dist = cv2.imread(dist_path, cv2.IMREAD_ANYDEPTH)

# cv2.normalize(dist, dist, 0, 65535, cv2.NORM_MINMAX)
# cv2.normalize(dist, dist, 0, 255, cv2.NORM_MINMAX)
cv2.normalize(dist, dist, 1440, 1580, cv2.NORM_MINMAX)

dist = dist.astype(np.uint8)
dist = cv2.applyColorMap(dist, cv2.COLORMAP_HSV)
# dist = cv2.cvtColor(dist, cv2.COLOR_HSV2BGR)

cv2.imshow("out", dist)
cv2.waitKey(0)

Which gets me the following image:

HSV color map

I've tried other combinations and also written my own normalization and colorize functions, but I get the same result. At this point I'm not sure whether I'm doing something wrong, whether it's a limitation of the OpenCV window viewer, or something else.

I've also uploaded the depth image file in case it's helpful: depth_image

Any help with this would be greatly appreciated.

Grudy
  • Take a look at [this](https://stackoverflow.com/questions/13840013/opencv-how-to-visualize-a-depth-image) post. – Rotem May 24 '21 at 20:26
  • In the showcase application it looks like only the value range of about 1.44 – 1.58 is used (see the macro/micro sliders at the bottom of the window), so your normalization should be: convert to 64-bit float, subtract 1.44, divide by 0.14, and apply a floating-point colormap (0 to 1 input range) afterwards. – Micka May 24 '21 at 21:08
  • The colormap to apply could be RAINBOW or JET (inverted): https://docs.opencv.org/master/d3/d50/group__imgproc__colormap.html – Micka May 24 '21 at 21:10
  • The image shown in the software and the sample image you provided are not the same. Does the darkness in the center show up when you view the sample image in the software? – fakedad May 25 '21 at 03:00
  • Hey, sorry about the late reply. Yes, it's a different image, but only the arrangement of tills under the camera is different; the image acquisition is the same. The camera is set up at a height of exactly 1.5 meters above the table. – Grudy May 25 '21 at 05:01

1 Answer


Here is one way in Python/OpenCV. It is not exact, but you can modify the colormap or change the stretch. Basically, I create a 6-color LUT with the colors red, orange, yellow, cyan, blue and violet. Note that using HSV would produce red at both ends, while the colormap in your image only goes from red to violet. I also do not see much in the way of green, so I left that out.

Input:


import cv2
import numpy as np
import skimage.exposure

# load image as grayscale
img = cv2.imread("dist_img.png", cv2.IMREAD_ANYDEPTH)

# stretch to full dynamic range
stretch = skimage.exposure.rescale_intensity(img, in_range='image', out_range=(0,255)).astype(np.uint8)

# convert to 3 channels
stretch = cv2.merge([stretch,stretch,stretch])

# define colors
color1 = (0, 0, 255)     #red
color2 = (0, 165, 255)   #orange
color3 = (0, 255, 255)   #yellow
color4 = (255, 255, 0)   #cyan
color5 = (255, 0, 0)     #blue
color6 = (128, 64, 64)   #violet
colorArr = np.array([[color1, color2, color3, color4, color5, color6]], dtype=np.uint8)

# resize lut to 256 (or more) values
lut = cv2.resize(colorArr, (256,1), interpolation = cv2.INTER_LINEAR)

# apply lut
result = cv2.LUT(stretch, lut)

# create gradient image
grad = np.linspace(0, 255, 512, dtype=np.uint8)
grad = np.tile(grad, (20,1))
grad = cv2.merge([grad,grad,grad])

# apply lut to gradient for viewing
grad_colored = cv2.LUT(grad, lut)

# save result
cv2.imwrite('dist_img_colorized.png', result)
cv2.imwrite('dist_img_lut.png', grad_colored)

# display result
cv2.imshow('RESULT', result)
cv2.imshow('LUT', grad_colored)
cv2.waitKey(0)
cv2.destroyAllWindows()

Stretched Image:


Colorized Image:


LUT:


fmw42
  • Your colorized image is about as close as I was able to get to the colorized image in the question. I'm wondering, though—is it not necessary to take into consideration the 1.44 and 1.58 range that is selected? – fakedad May 25 '21 at 02:56
  • What do those numbers represent? What scale? Is that height? To do it properly, you would need to know how that tool set up the scale and also the exact color map that was used. – fmw42 May 25 '21 at 04:52
  • These are the min (1426) and max (1791) heights, so 1.426 to 1.791 thousand units (feet or meters?). Their scale goes from 1.44 to 1.58 for red to violet, so yes, an adjustment would be needed to stretch and clip at those values. That can be done with the skimage.exposure.rescale_intensity() in_range values. – fmw42 May 25 '21 at 04:59
  • Thanks a lot for your answer, fmw42. It's an approach I haven't seen before. One thing is still unclear to me, though: why does your image (and others I've tried) have a darker center that gets lighter toward the edges? That isn't visible in the software representation, even though I'm acquiring the same depth data directly from the camera. – Grudy May 25 '21 at 05:03
  • I cannot answer that. I do not know what other processing they might be doing before applying the color map, nor do I know their color map colors. – fmw42 May 25 '21 at 05:08
  • That's fair. Thanks again @fmw42 – Grudy May 25 '21 at 05:10
  • After shifting the range to match the colors from about 1440 to 1580, it looks worse. So something else is going on in addition to the exact colormap. – fmw42 May 25 '21 at 05:11
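The in_range clipping mentioned in the comments can be sketched as follows. This is a minimal sketch under stated assumptions: the raw values are treated as millimeters, 1440–1580 is the window to match, and the synthetic array stands in for the real depth image.

```python
import numpy as np
import skimage.exposure

# Deterministic synthetic 16-bit depth values standing in for the real file
img = np.tile(np.arange(1300, 1900, dtype=np.uint16), (16, 1))

# Everything at or below 1440 maps to 0, at or above 1580 maps to 255,
# values in between are stretched linearly across the full 8-bit range
stretch = skimage.exposure.rescale_intensity(
    img, in_range=(1440, 1580), out_range=(0, 255)).astype(np.uint8)
```

The resulting 8-bit image can then be fed to the LUT step from the answer unchanged.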