I've been trying to measure the geographic distance between each point in a GeoDataFrame column (`gdb['geometry']`) and one specific point, call it `b_xy`.
`gdb['geometry']` is a GeoSeries of Points in lon/lat order, like so:
geometry |
---|
POINT (-73.958 40.685) |
POINT (-73.995 40.663) |
POINT (-73.982 40.756) |
whereas `b_xy` is a plain coordinate tuple (note it appears to be in the reversed lat/lon order): `(40.757280550000004, -73.98585503545917)`
The code my professor provided, from the textbook/tutorial he says he's following, is:
```python
from scipy.spatial.distance import cdist

d2b = lambda pt: cdist([(pt.x, pt.y)], [b_xy])[0][0] * 10  # result is in degrees/radians
gdb['d2tsquare'] = gdb['geometry'].to_crs(tgt_crs).apply(d2b)
```
This produces strange output, presumably still in degrees/radians, even though `tgt_crs` is a projected CRS.
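My suspicion is that `cdist` just does planar Euclidean arithmetic on whatever raw numbers it's handed, so if any input is still in degrees, the result can't be meters. A minimal stdlib illustration on the unprojected values (I've flipped `b_xy` to lon/lat here so it matches the geometry column's ordering):

```python
import math

# One sample point from the geometry column and b_xy, both as (lon, lat).
# Note: b_xy is flipped here, since the tuple in the question is (lat, lon).
p1 = (-73.958, 40.685)
b = (-73.98585503545917, 40.757280550000004)

# This is effectively what cdist computes: planar Euclidean distance on
# the raw coordinates. With degree inputs, the output is in degrees too.
deg = math.hypot(p1[0] - b[0], p1[1] - b[1])
print(deg)  # ≈ 0.077 — clearly not meters
```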
I've also tried following this tutorial on measuring the distance between two points in meters. However, `geopy.distance` only accepts individual coordinate pairs; it can't operate on a whole GeoDataFrame column directly.
I'm at a loss for a method that works. I've been thinking about writing a loop for it, but I'm not sure where to start.
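For what it's worth, the loop I had in mind would look something like the sketch below, with a plain-Python haversine standing in for `geopy.distance` (the point list is just the sample coordinates above, and `b_xy` is again flipped to lon/lat) — but I don't know if this is the right approach:

```python
import math

def haversine_m(lon1, lat1, lon2, lat2):
    """Great-circle distance in meters between two lon/lat points (spherical Earth)."""
    R = 6_371_000  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

# b_xy reordered to (lon, lat) to match the geometry column.
b_lon, b_lat = -73.98585503545917, 40.757280550000004

# The sample points from the geometry column, as (lon, lat) tuples.
points = [(-73.958, 40.685), (-73.995, 40.663), (-73.982, 40.756)]

# Loop over each point and compute its distance to b_xy in meters.
distances = [haversine_m(lon, lat, b_lon, b_lat) for lon, lat in points]
```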