
I have a dataframe with measurement coordinates and cell coordinates.

For each row, I need to find the angle (azimuth) between the line connecting these two points and the direction to the North Pole.

df:

id     cell_lat     cell_long     meas_lat     meas_long
1      53.543643    11.636235     53.44758     11.03720
2      52.988823    10.0421645    53.03501     9.04165
3      54.013442    9.100981      53.90384     10.62370

I have found some code online, but none of it really gets me any closer to the solution.

I have used this function, but I'm not sure I got it right, and I guess there is a simpler solution.

Any help or hint is welcomed, thanks in advance.

jovicbg
  • Possible duplicate of [Python code to calculate angle between three points (lat long coordinates)](https://stackoverflow.com/questions/42584259/python-code-to-calculate-angle-between-three-points-lat-long-coordinates) – Bill Armstrong Jul 07 '18 at 07:28
  • See [THIS](https://stackoverflow.com/a/42595558/3180386) answer – it covers both the 2D and 3D cases (more important given the distances, since you're using the North Pole in your calculation). – Bill Armstrong Jul 07 '18 at 07:30

1 Answer


The trickiest part of this problem is converting geodetic (latitude, longitude) coordinates to Cartesian (x, y, z) coordinates. If you look at https://en.wikipedia.org/wiki/Geographic_coordinate_conversion you can see how to do this, which involves choosing a reference system. Assuming we choose ECEF (https://en.wikipedia.org/wiki/ECEF), the following code calculates the angles you are looking for:

import numpy as np

def vector_calc(lat, long, ht):
    '''
    Calculates the vector from a specified point on the Earth's surface to
    the North Pole. Latitude and longitude are expected in radians.
    '''
    a = 6378137.0  # Equatorial radius of the Earth (m)
    b = 6356752.314245  # Polar radius of the Earth (m)

    e_squared = 1 - ((b ** 2) / (a ** 2))  # Square of the Earth's eccentricity
    # Prime vertical radius of curvature at this latitude
    n_phi = a / (np.sqrt(1 - (e_squared * (np.sin(lat) ** 2))))

    # ECEF coordinates of the point
    x = (n_phi + ht) * np.cos(lat) * np.cos(long)
    y = (n_phi + ht) * np.cos(lat) * np.sin(long)
    z = ((((b ** 2) / (a ** 2)) * n_phi) + ht) * np.sin(lat)

    # ECEF coordinates of the North Pole: it lies on the polar (z) axis,
    # at a distance of the polar radius from the Earth's centre
    x_npole = 0.0
    y_npole = 0.0
    z_npole = b

    v = ((x_npole - x), (y_npole - y), (z_npole - z))

    return v

def angle_calc(lat1, long1, lat2, long2, ht1=0, ht2=0):
    '''
    Calculates the angle between the vectors from 2 points to the North Pole.
    Latitudes and longitudes are expected in degrees.
    '''
    # Convert from degrees to radians
    lat1_rad = np.radians(lat1)
    long1_rad = np.radians(long1)
    lat2_rad = np.radians(lat2)
    long2_rad = np.radians(long2)

    v1 = vector_calc(lat1_rad, long1_rad, ht1)
    v2 = vector_calc(lat2_rad, long2_rad, ht2)

    # The angle between two vectors, v1 and v2, is given by:
    # arccos[v1.v2 / (|v1||v2|)]
    dot = np.dot(v1, v2)  # The dot product of the two vectors
    v1_mag = np.linalg.norm(v1)  # The magnitude of the vector v1
    v2_mag = np.linalg.norm(v2)  # The magnitude of the vector v2

    theta_rad = np.arccos(dot / (v1_mag * v2_mag))
    # Convert radians back to degrees
    theta = np.degrees(theta_rad)

    return theta

angles = []
for row in df.itertuples():
    angle = angle_calc(row.cell_lat, row.cell_long, row.meas_lat, row.meas_long)
    angles.append(angle)

This reads each row of your dataframe, calculates the angle and appends it to the list `angles`. Obviously you can do what you like with those angles after they've been calculated.
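If your dataframe is large, the row-by-row loop can be replaced with a vectorized NumPy version that computes all angles at once. This is a sketch, not the answer's original code; it repeats the same ECEF conversion internally so it is self-contained, and the `angles_vectorized` name and the sample dataframe (built from the values in the question) are my own:

```python
import numpy as np
import pandas as pd

def angles_vectorized(df):
    '''Computes, for every row, the angle between the vectors from the
    cell point and the measurement point to the North Pole.'''
    a = 6378137.0          # Equatorial radius of the Earth (m)
    b = 6356752.314245     # Polar radius of the Earth (m)
    e_squared = 1 - (b ** 2) / (a ** 2)

    def ecef(lat_deg, long_deg, ht=0.0):
        # Convert geodetic (degrees) to ECEF Cartesian coordinates, one row per point
        lat = np.radians(lat_deg.to_numpy())
        lon = np.radians(long_deg.to_numpy())
        n_phi = a / np.sqrt(1 - e_squared * np.sin(lat) ** 2)
        x = (n_phi + ht) * np.cos(lat) * np.cos(lon)
        y = (n_phi + ht) * np.cos(lat) * np.sin(lon)
        z = ((b ** 2 / a ** 2) * n_phi + ht) * np.sin(lat)
        return np.stack([x, y, z], axis=1)

    npole = np.array([0.0, 0.0, b])  # North Pole in ECEF: on the polar axis
    v1 = npole - ecef(df['cell_lat'], df['cell_long'])
    v2 = npole - ecef(df['meas_lat'], df['meas_long'])

    # Row-wise dot products and magnitudes, then arccos as before
    dot = np.einsum('ij,ij->i', v1, v2)
    mags = np.linalg.norm(v1, axis=1) * np.linalg.norm(v2, axis=1)
    return np.degrees(np.arccos(dot / mags))

df = pd.DataFrame({
    'cell_lat':  [53.543643, 52.988823, 54.013442],
    'cell_long': [11.636235, 10.0421645, 9.100981],
    'meas_lat':  [53.44758, 53.03501, 53.90384],
    'meas_long': [11.03720, 9.04165, 10.62370],
})
df['angle'] = angles_vectorized(df)
```

Since the points in each row are a few tens of kilometres apart while the vectors to the pole are thousands of kilometres long, the resulting angles are all well under a degree or two.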

Hope that helps!

liamvharris