I have an array (lons) of longitude values in the range [-180, 180]. I need to find the mean of the time series. This is easily done with
np.mean(lons)
This straightforward mean, of course, doesn't work if the series contains values on either side of the dateline. What is the correct way of calculating the mean for all possible cases? Note, I would rather not have a condition that treats dateline-crossing cases differently.
I've played around with np.unwrap after converting from degrees to radians, but I know my calculations are wrong because a small percentage of cases give me mean longitudes somewhere near 0 degrees (the prime meridian), over Africa. These aren't possible as this is an ocean data set.
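One alternative I'm considering instead of unwrapping is the circular mean: average the sine and cosine components of the angles and recover the angle with arctan2, so values straddling the dateline reinforce rather than cancel. A minimal sketch (the function name is mine):

```python
import numpy as np

def circular_mean_lon(lons):
    """Circular mean of longitudes given in degrees in [-180, 180]."""
    rad = np.deg2rad(lons)
    # Average the unit vectors (cos, sin) rather than the raw angles,
    # so e.g. 179 and -179 average to +/-180, not 0.
    mean_rad = np.arctan2(np.mean(np.sin(rad)), np.mean(np.cos(rad)))
    return np.rad2deg(mean_rad)

# Dateline-straddling case: naive np.mean gives 0, circular mean gives +/-180.
print(circular_mean_lon(np.array([179.0, -179.0])))
```

Note the result comes back in (-180, 180] via arctan2, so no separate dateline branch is needed.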
Thanks.
EDIT: I now realise a more precise way of calculating the mean [lat, lon] position of a time series might be to convert to cartesian coordinates. I may go down this route.
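A sketch of that cartesian idea (function name and signature are illustrative): convert each [lat, lon] pair to a 3D unit vector on the sphere, average the vectors, and convert the mean vector back to angles.

```python
import numpy as np

def mean_position(lats, lons):
    """Mean [lat, lon] of a track, via 3D unit vectors (degrees in/out)."""
    lat, lon = np.deg2rad(lats), np.deg2rad(lons)
    # Cartesian components of each point on the unit sphere, then average.
    x = np.mean(np.cos(lat) * np.cos(lon))
    y = np.mean(np.cos(lat) * np.sin(lon))
    z = np.mean(np.sin(lat))
    # Convert the mean vector back to latitude/longitude.
    mean_lat = np.arctan2(z, np.hypot(x, y))
    mean_lon = np.arctan2(y, x)
    return np.rad2deg(mean_lat), np.rad2deg(mean_lon)
```

This handles the dateline for free, since the averaging happens on vectors, not on wrapped angles; the mean vector generally has length less than 1, but only its direction is used.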