
I wrote a small Python script using DIPlib to measure the center and the rotation angle of our product. The output needs to be as precise as possible, since I rely on it to drive an industrial PNP machine that precisely places a panel on the product. Here is my code:


import diplib as dip
import numpy as np
import matplotlib.pyplot as pp

# Load the image and keep only channel 1
img = dip.ImageRead('back6.png')
img = img(1)

# Segment with a watershed of the gradient magnitude, then discard small regions
gm = dip.Norm(dip.GradientMagnitude(img))
water = dip.Watershed(gm, connectivity=1, maxDepth=6, flags={'correct', 'labels'})
water = dip.SmallObjectsRemove(water, 8000)

# Paint each label with its 'Mass' measurement and suppress low-mass objects
blocks = dip.MeasurementTool.Measure(water, img, features=['Mass'])
blocks = dip.ObjectToMeasurement(water, blocks['Mass'])
blocks[blocks < 31000000] = 0

result = dip.MeasurementTool.Measure(water, img, features=['Center'])
print(result)


# Threshold to get the board, fill holes, and round it off with an opening
board = (blocks > 3100000)
board = dip.FillHoles(board)
rect = dip.Opening(board, 35)

dip.viewer.Show(board)
dip.viewer.Show(rect)


result = dip.MeasurementTool.Measure(dip.Label(board), img, features=['Center', 'Feret'])
print(result)


result = dip.MeasurementTool.Measure(dip.Label(rect), img, features=['Center', 'Feret'])
print(result)


# Overlay the one-pixel outline of 'rect' on the image
target = dip.Overlay(img, rect - dip.BinaryErosion(rect, 1, 1))
dip.viewer.Show(target)


dip.viewer.Show(dip.Overlay(img, board-dip.BinaryErosion(board, 1, 1)))


# Draw a band-limited ball at the measured center as a marker
circle = dip.Image(target.Sizes(), 1, 'SFLOAT')
circle.Fill(0)
dip.DrawBandlimitedBall(circle, diameter=10, origin=result[1]['Center'])

circle /= dip.Maximum(circle)

target *= 1 - circle
target += circle * dip.Create0D([255,10,0])

dip.viewer.Show(target)

And here is my test image: [image]

The contour traced by this code looks good, and the center measurement is stable: [image]

Then I use 'Feret.MinAng' to measure the rotation angle. But according to an issue on GitHub, the Feret width is found from the two pixels on either side of the object that are farthest apart, rather than from the average distance between opposite sides; you said there that Feret is biased. This makes me extremely worried about the repeatability of my code. So I apply a morphological opening to the target (a rounded-corner rectangle), and 'Feret.MinAng' still shows about 0.05 degrees of error. See the following screenshot: [screenshot]

How can I make the rotation angle measurement stable and consistent under slight variations in the light source?

  • There are different ways you could approach this. One way would be to extract the boundary as a polygon, simplify it, then compute the Feret properties from that (DIPlib has all of these implemented). A more precise approach likely would involve averaging the gradient orientation near the edges of the rectangle. Maybe a weighted histogram of the angles (weight would be the gradient magnitude, stronger gradient has a more reliable orientation), where you would find the two modes (90 degrees apart) and average them. – Cris Luengo Jun 10 '23 at 01:32
  • I can see dip::Polygon, dip::ConvexHull, and dip::ChainCode in the DIPlib documentation, but I cannot figure out how to use them. Could you give an example? – HotCat Jun 10 '23 at 05:32
  • I also found a similar scenario [link](https://stackoverflow.com/questions/70560162/measuring-the-distance-between-two-lines-using-diplib-pydip); how closely is it related to my problem? – HotCat Jun 10 '23 at 05:36
  • I don’t know if I can find time today to write an example. I’ll post an answer eventually, but look at [`GetImageChainCodes`](https://diplib.org/diplib-docs/measurement.html#dip-GetImageChainCodes-dip-Image-CL-dip-UnsignedArray-CL-dip-uint-) to get the chain code. The `Polygon` method of the chain code converts it to a polygon. The polygon object has a `Simplify` method. You then do `ConvexHull().Feret()`. – Cris Luengo Jun 10 '23 at 13:43
  • Very thorough explanation of how to use ChainCode; I'll try it myself. Enjoy your weekend. I think method 1 is enough for my application, because I heard the openPNP project only uses OpenCV's approxPolyDP and boundingRect, and those functions are not designed for subpixel precision. But I'm still curious about your more precise approach so that I can trade off between speed and precision. – HotCat Jun 10 '23 at 19:39

1 Answer


In comments I suggested the following quick and easy solution:

cc, = dip.GetImageChainCodes(+board)  # there is exactly one object; unpack its chain code
pol = cc.Polygon()
pol.Simplify(2)  # remove vertices most likely caused by noise
angle = pol.ConvexHull().Feret().minAngle
print(angle)

Here we manually do what dip.MeasurementTool.Measure() does for the Feret feature, but we add a call to the polygon's Simplify() method, which applies the Douglas-Peucker algorithm to remove vertices that are most likely caused by noise.
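For reference, a small sketch (not part of the code above) that compares this angle against the plain 'Feret' measurement without simplification. It assumes both angles are in radians and that 'Feret.MinAng' is the last of the five values the 'Feret' feature returns; msr and raw_angle are just illustrative names:

msr = dip.MeasurementTool.Measure(dip.Label(board), img, features=['Feret'])
raw_angle = msr[1]['Feret'][4]   # assuming the order [Max, Min, PerpMin, MaxAng, MinAng]
print(raw_angle - angle)         # difference caused by noisy boundary pixels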


A more elaborate method is as follows: we use the structure tensor to estimate the local orientation of the gradient. For pixels close to the edge of the detected rectangle, we sample the estimated orientations and build a histogram. We expect this histogram to have two peaks: one for the horizontal edges of the rectangle and one for the vertical ones. We then determine the locations of these peaks, advance one of them by 90 degrees, and average them together. This way we use the orientation estimates of all pixels around the shape to estimate the orientation of the shape.

from operator import attrgetter

# Estimate the local gradient orientation, then histogram it at pixels on the boundary of 'board'
orientation, = dip.StructureTensorAnalysis(dip.StructureTensor(img), ["orientation"])
edge = dip.MorphologicalGradientMagnitude(board)
hist = dip.Histogram(orientation, edge)
hist.Show()

# Find the two strongest peaks of the histogram with subpixel precision
maxima = dip.SubpixelMaxima(hist.GetImage())
maxima.sort(key=attrgetter('value'))
maxima = [maxima[-1].coordinates[0], maxima[-2].coordinates[0]]
maxima = np.interp(maxima, np.arange(hist.Bins()), hist.BinCenters())

# The two peaks are pi/2 apart: shift one of them, wrap, and average
maxima[0] += np.pi/2
maxima = np.mod(maxima, np.pi)
angle = np.mean(maxima)
print(angle)

Because we don't know which of the two values in maxima is which, we advance an arbitrary one by pi/2, then apply the modulo operation to bring both values back into the [0, pi) range. The two values should now be very close together, assuming the shape is a rectangle with 90-degree corners.
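As a small worked example with made-up peak positions (roughly pi/2 apart, not measured from the image above):

maxima = np.array([1.55, -0.02])  # hypothetical peak positions, about pi/2 apart
maxima[0] += np.pi / 2            # 1.55 + 1.5708 = 3.1208
maxima = np.mod(maxima, np.pi)    # [3.1208, 3.1216] -- now nearly identical
angle = np.mean(maxima)           # about 3.121 rad, i.e. close to pi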

And so we get a single orientation, which could correspond either to the horizontal edges or to the vertical edges, depending on which of the two we advanced by pi/2. Note that if the value is close to pi, it represents a small negative rotation (the orientation has a periodicity of pi).
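If the machine needs a signed angle in degrees, a minimal sketch for that last step could look like this (assuming orientations just below pi should map to small negative rotations):

angle_deg = np.rad2deg(angle)  # pi-periodic orientation in degrees, in [0, 180)
if angle_deg > 90:
    angle_deg -= 180           # e.g. 178.8 degrees becomes -1.2 degrees
print(angle_deg)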

Cris Luengo
  • I found that both methods work extremely well after I adjusted the camera 'Gamma' parameter to the middle of its range. No matter how I slightly adjust the exposure, contrast, and light source power, method 2 reaches an average error of about 0.005 degrees, while method 1 reaches about 0.03 degrees. Nice solution, thanks very much. – HotCat Jun 13 '23 at 10:27