
I am trying to track the rotation of a sphere by tracking surface features on the sphere.

TL;DR: Am I wrong to average quaternions this way?

Introduction

Let's skip over the feature detection and matching; it's just ORB+FLANN from OpenCV. This gives me two sets of (matched) keypoints for two successive frames.
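For completeness, the matching step looks roughly like this (a sketch, not my exact code; the LSH index parameters and the 0.75 ratio threshold are just common defaults for binary descriptors):

import cv2

def match_frames(img1, img2):
    # ORB keypoints and binary descriptors for two successive frames
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    # FLANN with an LSH index (suitable for binary descriptors) plus Lowe's ratio test
    flann = cv2.FlannBasedMatcher(
        dict(algorithm=6, table_number=6, key_size=12, multi_probe_level=1),
        dict(checks=50))
    matches = flann.knnMatch(des1, des2, k=2)
    good = [m[0] for m in matches
            if len(m) == 2 and m[0].distance < 0.75 * m[1].distance]
    return [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt) for m in good]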

I project the keypoints into 3D by placing them on the sphere according to their distance to the sphere's center [1]. This gives me the keypoints in the sphere's coordinate system.
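The lifting step is roughly this (a sketch under the simplifying assumption from the footnote that I can ignore the camera projection; cx, cy and r are the sphere's center and radius in the image, which I detect elsewhere):

import numpy as np

def lift_to_sphere(pt, cx, cy, r):
    # Offset of the keypoint from the sphere's center in the image plane
    x, y = pt[0] - cx, pt[1] - cy
    # Recover the depth coordinate from the sphere equation x^2 + y^2 + z^2 = r^2
    z = np.sqrt(max(r * r - x * x - y * y, 0.0))
    # Unit vector in the sphere's coordinate system
    return np.array([x, y, z]) / r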

[Figure: flow vectors between matched keypoints (never mind the big vectors for now)]

From the matched keypoint pairs, taken as vectors, I calculate a set of quaternions that each represent the rotation since the last frame (algorithm used):

import numpy as np
import quaternion  # numpy-quaternion package

def rotation_between(v1, v2):  # shortest-arc rotation taking v1 onto v2
    a = np.cross(v1, v2)
    w = np.sqrt(np.linalg.norm(v1)**2 * np.linalg.norm(v2)**2) + np.dot(v1, v2)
    return np.quaternion(w, a[0], a[1], a[2]).normalized()

Problem

So at this point my idea is that all of those quaternions represent the same rotation (plus or minus noise). In theory I should then be able to calculate the mean quaternion of my quaternion set and get a good idea of how my sphere moved since the last frame (algorithm used):

evals, evecs = np.linalg.eig(np.mean(np.array([np.outer(q, q) for q in quats]), axis=0))
mean_q = evecs[:, np.argmax(evals)]  # eigenvector for the largest eigenvalue

Thing is that this results in a quaternion that is essentially the identity for every frame (off by something on the order of 1e-06):

quaternion(0.999999999865224, -4.24742027709118e-06, 1.4799762763963e-05, 5.69900765792268e-06)

("naive" averaging over all quaternions actually produces a results that kind of looks like it might fit with the original rotation, but I would rather use a proven method)

So, I have two theories:

  1. Having several thousand quaternions in the mix leads to catastrophic cancellation in the summation and eigenvector calculation. Or, (more probable):
  2. I have an error in my thinking.

Thoughts?

[1] I am aware that I have to deal with the camera's projection, but for a proof of concept I chose to ignore that.

fho

1 Answer


What's your question? It's unclear what you are trying to find.

However, whatever it is, your algorithm is clearly wrong: the movement of a single point on a sphere doesn't uniquely define the sphere's rotation.

my idea is that all of those quaternions represent the same rotation

They aren't. You use a formula that finds a quaternion from the movement of a single point. The rotation found by that formula assumes the point moved along a great circle.

When you rotate a sphere, only points on the equator (defined by the rotation axis) move along a great circle; the rest of the points move along other, more interesting curves.
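You can check this numerically. Here is a small sketch (made-up numbers, just for illustration): rotate one point on the equator and one point at 60° latitude about the z-axis, then feed each pair into the shortest-arc formula from the question and compare the axes it recovers:

import numpy as np

def shortest_arc_axis(v1, v2):
    # Axis of the shortest-arc rotation taking v1 onto v2 (the formula from the question)
    a = np.cross(v1, v2)
    return a / np.linalg.norm(a)

theta = np.radians(10.0)                          # true rotation: 10 degrees about z
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])

p_equator = np.array([1.0, 0.0, 0.0])             # point on the equator
p_high    = np.array([0.5, 0.0, np.sqrt(0.75)])   # point at 60 degrees latitude

print(shortest_arc_axis(p_equator, Rz @ p_equator))  # ~[0, 0, 1], the true axis
print(shortest_arc_axis(p_high,    Rz @ p_high))     # clearly tilted away from [0, 0, 1]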

Update: As far as I understand, you have noise, tracking artifacts, and other shenanigans. One approach could be to use numerical methods to find Euler angles directly from the points (not quaternions: quaternions have to stay normalized, i.e. they carry an extra degree of freedom the solver doesn't need). Maybe DownhillSolver or ConjGradSolver from OpenCV will work for you. It might be useful to run it twice: find the rotation, drop the 10-20% of points with the worst predictions, then solve again on the good 80-90% of the points.
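A rough sketch of that idea, using SciPy's Nelder-Mead (the same downhill-simplex method behind cv::DownhillSolver) because it is easier to call from Python; the cost function and the 80% cutoff are placeholders, not tuned values:

import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

def fit_rotation(p1, p2):
    # p1, p2: (N, 3) unit vectors on the sphere before and after the frame step
    def cost(euler, a, b):
        return np.sum((Rotation.from_euler('xyz', euler).apply(a) - b) ** 2)

    # First pass over all points
    euler = minimize(cost, x0=np.zeros(3), args=(p1, p2), method='Nelder-Mead').x

    # Drop the 20% of points the first solution predicts worst, then solve again
    err = np.sum((Rotation.from_euler('xyz', euler).apply(p1) - p2) ** 2, axis=1)
    keep = err <= np.quantile(err, 0.8)
    euler = minimize(cost, x0=euler, args=(p1[keep], p2[keep]), method='Nelder-Mead').x
    return Rotation.from_euler('xyz', euler)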

Update 2: see the picture. Green lines are the actual movements of two points as part of the sphere rotation. Obviously, they rotate around the same axis, but not along great circles. Red lines are rotations along great circles between these pairs of points, found by your incorrectly applied formula. Blue dashed lines are the axes of the red rotations. I hope you now understand why you can't apply the formula.

Soonts
  • The "great circle" thing seems to be the key here. If the algorithm used the key points, and a point on the rotation axis, in the plane perpendicular to the axis and containing the two key points, then you'd get a unique rotation for the points (or diff) in question. The puzzling part is that the angle is going to be the same for each point pair regardless of the distance from the equator, as long as it's in a plane perpendicular to the rotation axis. :/ I hate quats. – 3Dave Sep 06 '18 at 21:41
  • @fho see the update. Also this https://stackoverflow.com/a/38458325/126995 for how to not waste too much CPU time doing that, especially `XMScalarSinCosEst`. – Soonts Sep 06 '18 at 22:19
  • I am not sure about the great circle thing. @3Dave points out that all rotations (in 3d) should have the same axis of rotation. In a small time delta the sphere can't rotate over more than one axis (can it?). Maybe I should try to work on the axis angles directly. Will see how that turns out. Regarding noise: there is noise and false key points, but I did a decent job removing those. – fho Sep 07 '18 at 07:30
  • I just tried that and found that the rotation axes are all over the place. I would have assumed that they would at least somewhat resemble the "true" rotation axis of the sphere. Could be that this is due to frustum distortion now that I think of it. I'll see if removing that helps, but I have to do some new recordings for that. – fho Sep 07 '18 at 07:53
  • @fho It’s not frustum, you just can’t apply that formula. See another update. – Soonts Sep 07 '18 at 11:26