Problem
GPS projection onto a 2D surface is usually tricky, since the Earth's surface is curved, not flat. However, if the sample GPS data you have provided is actual data (hope the weather is pleasant in Swansea, by the way!), I will assume the data set is confined to a very small area, so you can treat lines of longitude as parallel. The problem then becomes simple algebra, and you only need two reference points where x1 != x2 and y1 != y2.
Solution
ISO coordinates are given as (latitude, longitude) = (y, x) while plotted coordinates are given as (x, y). I'm just going to show you how to do y (latitude). We need to map the origin of the source (the screen) to the origin of the target (the world), and the scale of the source to the scale of the world. I'm going to name these variables as follows:
screenY0 //Screen origin (latitude corresponding to pixel 0)
worldY0 //World origin (zero degrees latitude)
screenYscale //Screen scale (degrees of latitude per pixel)
worldYscale //World scale (degrees per degree; here just 1)
screenYpoint //Screen point (pixel y location)
worldYpoint //World point (latitude on the ground)
I'm going to use the following coordinate pairs because these are the furthest apart:
(51.606733, -3.986813) -> (246, 399)
(51.607337, -3.987266) -> (838, 781)
Our formula is going to look like this:
screenY0 + screenYscale * screenYpoint = worldY0 + worldYscale * worldYpoint.
We know that the world origin is 0, and the world scale is 1, so we can condense this to:
screenY0 + screenYscale * screenYpoint = worldYpoint.
We can plug in our values to form 2 simultaneous equations:
screenY0 + screenYscale * 399 = 51.606733
and screenY0 + screenYscale * 781 = 51.607337
Solving:
screenY0 = 51.606733 - screenYscale * 399
and screenY0 = 51.607337 - screenYscale * 781
=> 51.606733 - screenYscale * 399 = 51.607337 - screenYscale * 781
=> screenYscale * 781 - screenYscale * 399 = 51.607337 - 51.606733
=> screenYscale * 382 = 0.000604
=> screenYscale = 0.00000158115
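The same scale calculation can be sketched in a few lines of Python (variable names follow the ones used above):

```python
# Two reference points: (latitude, pixel y)
lat1, y_pix1 = 51.606733, 399
lat2, y_pix2 = 51.607337, 781

# Degrees of latitude represented by one pixel
screenYscale = (lat2 - lat1) / (y_pix2 - y_pix1)
print(screenYscale)  # ≈ 0.00000158115
```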
So each pixel on your map represents 0.00000158115 of a degree of latitude. Plugging in to find the origin:
screenY0 + screenYscale * 399 = 51.606733
=> screenY0 + 0.00000158115 * 399 = 51.606733
=> screenY0 + 0.00063087885 = 51.606733
=> screenY0 = 51.606733 - 0.00063087885
=> screenY0 = 51.6061021212
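And the origin step, again as a minimal Python sketch using the scale just derived:

```python
screenYscale = 0.00000158115  # degrees of latitude per pixel, from the previous step
lat1, y_pix1 = 51.606733, 399  # one of the reference points

# Latitude represented by pixel 0
screenY0 = lat1 - screenYscale * y_pix1
print(screenY0)  # ≈ 51.6061021212
```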
Therefore the pixel at 0 represents 51.6061021212 in the real world.
Formula
Our formula to find the real world latitude is thus:
51.6061021212 + 0.00000158115 * screenYpoint = worldYpoint.
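If it helps, the whole mapping can be wrapped in a small helper. `pixel_to_latitude` is just a name I've chosen; the default constants are the origin and scale derived above:

```python
def pixel_to_latitude(screen_y_point,
                      screen_y0=51.6061021212,
                      screen_y_scale=0.00000158115):
    """Linear pixel-to-latitude mapping: worldYpoint = screenY0 + screenYscale * screenYpoint."""
    return screen_y0 + screen_y_scale * screen_y_point
```

Feeding in the two original reference pixels (399 and 781) should reproduce the reference latitudes to well within rounding error.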
Testing
Let's test this with your other reference latitude: 51.606671 -> 402
51.6061021212 + 0.00000158115 * screenYpoint = worldYpoint
51.6061021212 + 0.00000158115 * 402 = 51.606671
51.6061021212 + 0.0006356223 = 51.606671
51.6067377435 ≈ 51.606671
The two values differ by about 0.0000667 degrees; since 1 degree of latitude is roughly 111.2 km (at the Earth's mean radius), this corresponds to an error of about 7.4 metres.
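You can reproduce that error estimate directly. The 111.2 km/degree figure is the approximate length of one degree of latitude at the Earth's mean radius:

```python
predicted = 51.6061021212 + 0.00000158115 * 402  # model output for pixel 402
actual = 51.606671                               # reference latitude

METRES_PER_DEGREE_LAT = 111_200  # approximate, at the Earth's mean radius
error_m = abs(predicted - actual) * METRES_PER_DEGREE_LAT
print(error_m)  # ≈ 7.4 metres
```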
Hope this helps and gets you on your way to solving for longitude, AKA x, as well! If you have any trouble or would like me to clarify, please leave a comment.