Is there a fast way to interpolate "mesh calibration" to a bunch of points? Imagine a precise but inaccurate 2-axis Cartesian robot that can move to any X,Y coordinate within its work area.
As a simple example, when the robot is instructed to move to
[0, 0], [5, 0], [10, 0]
[0, 5], [5, 5], [10, 5]
[0, 10], [5, 10], [10, 10]
the actual recorded positions are
[0.1, 0.2], [4.9, 0.2], [10.0, 0.1]
[-0.1, 5.2], [5.0, 5.3], [10.1, 5.1]
[-0.2, 10.0], [4.8, 9.9], [9.9, 9.8]
Subtracting the commanded positions from the recorded ones yields the "error grid" below:
[0.1, 0.2], [-0.1, 0.2], [0.0, 0.1]
[-0.1, 0.2], [0.0, 0.3], [0.1, 0.1]
[-0.2, 0.0], [-0.2, -0.1], [-0.1, -0.2]
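(Computing this grid is just an element-wise subtraction; a minimal numpy sketch, assuming the rows are ordered y = 0, 5, 10 from top to bottom:)

```python
import numpy as np

# Commanded grid positions, indexed [y index, x index]
commanded = np.array([[[0, 0], [5, 0], [10, 0]],
                      [[0, 5], [5, 5], [10, 5]],
                      [[0, 10], [5, 10], [10, 10]]], dtype=float)

# Positions actually recorded at those commands
recorded = np.array([[[0.1, 0.2], [4.9, 0.2], [10.0, 0.1]],
                     [[-0.1, 5.2], [5.0, 5.3], [10.1, 5.1]],
                     [[-0.2, 10.0], [4.8, 9.9], [9.9, 9.8]]])

# Error at each grid point: recorded minus commanded
errors = recorded - commanded
```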
Using this "error grid", how can I apply a correction to the robot input to minimize the error at any arbitrary point? For example, when I instruct the robot to go to [7, 8], it should look up the errors measured at [5, 5], [10, 5], [5, 10], and [10, 10] to estimate the correction needed.
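My best guess is that this amounts to bilinearly interpolating each error component between the four surrounding grid nodes and subtracting the result from the commanded point. Here is a rough sketch of what I have in mind, using scipy's RegularGridInterpolator (the helper corrected_command is just my own name for illustration):

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

xs = np.array([0.0, 5.0, 10.0])  # grid x coordinates
ys = np.array([0.0, 5.0, 10.0])  # grid y coordinates

# Error components from the grid above, indexed [y index, x index]
err_x = np.array([[ 0.1, -0.1,  0.0],
                  [-0.1,  0.0,  0.1],
                  [-0.2, -0.2, -0.1]])
err_y = np.array([[ 0.2,  0.2,  0.1],
                  [ 0.2,  0.3,  0.1],
                  [ 0.0, -0.1, -0.2]])

# RegularGridInterpolator expects values indexed [x, y], hence the transposes
interp_x = RegularGridInterpolator((xs, ys), err_x.T, method="linear")
interp_y = RegularGridInterpolator((xs, ys), err_y.T, method="linear")

def corrected_command(x, y):
    """Shift the commanded point by the negative of the predicted error."""
    return x - float(interp_x((x, y))), y - float(interp_y((x, y)))

print(corrected_command(7.0, 8.0))  # blends the errors at the 4 surrounding nodes
```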
I'm sure someone out there figured this out 50 years ago, but I just can't think of the right keyword to google.
Many thanks.