I am having trouble determining the distance (in miles) between a point and a line segment defined by two geographic coordinates (lat/lon).
import numpy as np

# line segment point 1
x1 = -75.9667128   # longitude
y1 = 41.66279222   # latitude
# line segment point 2
x2 = -75.96248381  # longitude
y2 = 41.65800548   # latitude
# point to measure the orthogonal distance to
x3 = -75.96017288  # longitude
y3 = 41.67049662   # latitude

# perpendicular distance (in degrees) from (x3, y3) to the line through (x1, y1) and (x2, y2)
d = abs((x2-x1)*(y1-y3) - (x1-x3)*(y2-y1)) / np.sqrt(np.square(x2-x1) + np.square(y2-y1))
This yields 0.010002193890447786
When I convert this from degrees to miles using a scaling factor of 69.2 miles per degree (a figure I found online), I get a different answer than when I measure between the line and the point with the ruler tool on a GIS site to double-check myself.
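For reference, the conversion I'm doing is just a straight multiplication (the 69.2 miles/degree factor is the value I found online, not something I've verified):

d_degrees = 0.010002193890447786
miles_per_degree = 69.2                       # scaling factor found online
d_miles = d_degrees * miles_per_degree
print(d_miles)                                # ~0.692 miles, which disagrees with the GIS ruler measurement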
Am I missing something here? When I plot something like:
plt.plot([x1, x1 + 0.010002193890447786], [y1, y1])
the length of that line matches up with the correct distance.
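In case it helps, this is roughly the sanity check I'm doing, using the variables defined above (adding the original segment and the point to the same axes is just my way of eyeballing the comparison):

import matplotlib.pyplot as plt

# segment of length d (in degrees, laid out along the longitude axis) starting at (x1, y1)
plt.plot([x1, x1 + 0.010002193890447786], [y1, y1])
# the original line segment and the point, for visual comparison
plt.plot([x1, x2], [y1, y2])
plt.scatter([x3], [y3])
plt.gca().set_aspect('equal')
plt.show()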