I am trying to use the SQL Server 2014 geography functions to determine the distance between two points on the Earth's surface. I have three fields in a table ([Lat], [Lon], [Coordinates]). [Lat] and [Lon] are existing values, and I store the geographical point coordinates in the [Coordinates] field using the following:
-- WKT puts longitude before latitude; 4326 is the WGS 84 SRID
UPDATE dbo.[MyTable]
SET [Coordinates] = geography::STPointFromText(
        'POINT(' + CAST([Lon] AS VARCHAR(20)) + ' ' + CAST([Lat] AS VARCHAR(20)) + ')',
        4326);
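As a sanity check after the update, I read the coordinates back out of a few stored points (using the geography type's Lat and Long properties) to confirm the ordering came out right:

-- Spot-check: StoredLat/StoredLon should match the source columns
SELECT TOP (5)
       [Id], [Lat], [Lon],
       [Coordinates].Lat  AS [StoredLat],
       [Coordinates].Long AS [StoredLon]
FROM dbo.[MyTable];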
So now I have a table full of records that have the geographical coordinates pre-computed in the [Coordinates] field. Now I want to determine the distance in miles between Point_A and Point_B. I used the following:
-- Look up Point_A:
DECLARE @g geography = (SELECT [Coordinates] FROM [MyTable] WHERE [Id] = 68);
-- Look up Point_B:
DECLARE @h geography = (SELECT [Coordinates] FROM [MyTable] WHERE [Id] = 1439);
-- Compute the distance (note: T-SQL ROUND requires a length argument):
SELECT ROUND(@g.STDistance(@h), 0) AS [Distance];
The actual distance is about 20 miles, but this computation gives me a number well over a thousand times larger than that.
Is .STDistance returning meters instead of miles?
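If it is meters, then I suppose converting to miles is just a matter of dividing by 1,609.344 (meters per statute mile), something like this (reusing @g and @h from the batch above):

-- Assuming STDistance returns meters for SRID 4326,
-- divide by 1609.344 to convert the result to miles
SELECT ROUND(@g.STDistance(@h) / 1609.344, 2) AS [DistanceInMiles];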
On a related note: can anyone point me to an example on the web of matching thousands of geographical points in one table to the nearest geographical points in another table that also contains thousands of points? Comparing every point to every other point seems like it would take a very long time, so I am hoping there is a way to shorten the process. A sketch of what I have in mind follows.
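For reference, this is the brute-force approach I am picturing (the second table dbo.[OtherTable] and its columns are just placeholders for illustration). From what I have read, a spatial index on the inner table's [Coordinates] column, together with the TOP (1) ... ORDER BY STDistance(...) plus IS NOT NULL pattern, is what allows SQL Server to use the index instead of comparing every pair:

-- Spatial index on the inner (searched) table, so the
-- nearest-neighbor lookups don't have to scan every row:
CREATE SPATIAL INDEX [SIdx_OtherTable_Coordinates]
    ON dbo.[OtherTable] ([Coordinates]);

-- For each row in MyTable, find the single nearest point in OtherTable:
SELECT  a.[Id] AS [MyId],
        b.[Id] AS [NearestId],
        b.[DistanceInMeters]
FROM    dbo.[MyTable] AS a
CROSS APPLY
        (SELECT TOP (1)
                o.[Id],
                o.[Coordinates].STDistance(a.[Coordinates]) AS [DistanceInMeters]
         FROM   dbo.[OtherTable] AS o
         WHERE  o.[Coordinates].STDistance(a.[Coordinates]) IS NOT NULL
         ORDER BY o.[Coordinates].STDistance(a.[Coordinates])) AS b;

Is that the right pattern, or is there a better-known approach for this kind of bulk nearest-neighbor matching?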