I have a set of [x, y, time] values and a reference point [x, y]. The values represent the 2D movement of an object over time. I would like to approximate the time at which the trajectory of the object was closest to the reference point. In this diagram, the blue crosses are the set of points and the red cross is the reference point.
My initial approach would be to derive the linear trajectory of the object (green line below), and then find the point on that line that forms a right angle with the reference point (PT). The position of PT between its two nearest neighbours along the trajectory line (T2 and T3) would then give an interpolated time value.
Is there a more efficient algorithm, or set of algorithms, for calculating the time value of PT? A linear approximation for trajectory (as opposed to a spline) is acceptable, as is some tolerance in the accuracy of the calculated time. A reference implementation (in pseudo-code/C/Java/whatever) for study would be most welcome. Many thanks.
EDIT: maybe it's easier to re-phrase in terms of GPS. Given a polyline of GPS points and the time of each reading, at precisely what time did the path pass closest to this other point?
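For reference, here is a sketch of the projection-and-interpolation idea described above (names like `ClosestTime` and `closestTime` are my own, not from any library): for each segment between consecutive samples, project the reference point onto the segment, clamp the projection to the segment's endpoints, keep the closest candidate, and interpolate the time linearly by the projection parameter.

```java
// Sketch: find the time at which a sampled trajectory passes closest
// to a reference point, by per-segment projection and linear time
// interpolation. Assumes pts[i] = {x, y} and times[i] is the sample time.
class ClosestTime {
    static double closestTime(double[][] pts, double[] times,
                              double rx, double ry) {
        double bestDist2 = Double.MAX_VALUE;
        double bestTime = times[0];
        for (int i = 0; i + 1 < pts.length; i++) {
            double ax = pts[i][0],     ay = pts[i][1];
            double bx = pts[i + 1][0], by = pts[i + 1][1];
            double dx = bx - ax, dy = by - ay;
            double len2 = dx * dx + dy * dy;
            // t is the normalised position of the perpendicular foot
            // (PT) along the segment; 0 = segment start, 1 = segment end.
            double t = (len2 == 0) ? 0
                     : ((rx - ax) * dx + (ry - ay) * dy) / len2;
            t = Math.max(0, Math.min(1, t)); // clamp onto the segment
            double px = ax + t * dx, py = ay + t * dy;
            double d2 = (px - rx) * (px - rx) + (py - ry) * (py - ry);
            if (d2 < bestDist2) {
                bestDist2 = d2;
                // Linear interpolation of time between the two samples.
                bestTime = times[i] + t * (times[i + 1] - times[i]);
            }
        }
        return bestTime;
    }
}
```

This is O(n) in the number of samples, which is hard to beat for an unsorted polyline; clamping `t` handles the case where the perpendicular foot falls outside a segment, so the closest point is then an endpoint.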