I have real-time GPS data arriving at 5 updates per second. On average, about 80% of the data is fairly accurate, but around 20% is jerky. Occasionally we also get an outlier, i.e. an erroneous data point far away from the actual trajectory.
I am looking for an algorithm that would help me achieve the following:
- Smooth out the data so that jerkiness is eliminated.
- Not smooth over the outliers, but instead eliminate those erroneous data points and replace them with an extrapolated value (see the sketch after this list).
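To make the second requirement concrete, here is a rough sketch of the kind of gate-then-smooth behaviour I am after: a plausibility check on the implied speed discards outliers and substitutes an extrapolated point, while an exponential moving average smooths the accepted fixes. The class name, the 60 m/s speed limit, the smoothing weight, and the flat-earth metre conversion are placeholders I made up, not anything from the linked answer.

```java
// Minimal "gate then smooth" sketch -- NOT the KalmanLatLong code.
// Assumptions (mine): a flat-earth metre conversion is good enough over one
// 200 ms step, MAX_SPEED_MPS is a plausible speed limit for the tracked
// object, and ALPHA is a smoothing weight that would need tuning.
public final class GpsGate {
    private static final double MAX_SPEED_MPS = 60.0; // hypothetical plausibility limit
    private static final double ALPHA = 0.4;          // smoothing weight for accepted fixes

    private double lat, lon;      // smoothed position (degrees)
    private double vLat, vLon;    // estimated velocity (degrees per second)
    private long lastMillis = -1;

    /** Returns a smoothed (or extrapolated) position for each raw 5 Hz fix. */
    public double[] update(double rawLat, double rawLon, long millis) {
        if (lastMillis < 0 || millis <= lastMillis) {  // first fix (or bad timestamp): accept as-is
            lat = rawLat; lon = rawLon; lastMillis = millis;
            return new double[] { lat, lon };
        }
        double dt = (millis - lastMillis) / 1000.0;
        double mPerDegLat = 111_320.0;
        double mPerDegLon = 111_320.0 * Math.cos(Math.toRadians(lat));
        double dx = (rawLon - lon) * mPerDegLon;
        double dy = (rawLat - lat) * mPerDegLat;
        double impliedSpeed = Math.hypot(dx, dy) / dt;

        if (impliedSpeed > MAX_SPEED_MPS) {
            // Outlier: discard the fix and extrapolate along the last velocity.
            lat += vLat * dt;
            lon += vLon * dt;
        } else {
            // Plausible fix: exponential smoothing, then refresh the velocity estimate.
            double newLat = ALPHA * rawLat + (1 - ALPHA) * lat;
            double newLon = ALPHA * rawLon + (1 - ALPHA) * lon;
            vLat = (newLat - lat) / dt;
            vLon = (newLon - lon) / dt;
            lat = newLat; lon = newLon;
        }
        lastMillis = millis;
        return new double[] { lat, lon };
    }
}
```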
To give some context: I first searched stackoverflow.com for similar topics and found the following link:
My software engineer implemented the KalmanLatLong routine provided in that link, but we encountered the following issues:
- The algorithm lags behind: while it is generating extrapolated values, more GPS data points arrive (remember the data comes in real time).
- When an occasional outlier arrives, the algorithm smooths it out as well, whereas our goal is to eliminate such outliers because they are erroneous data.
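From what I have read, one way to stop a Kalman filter from blending in outliers is an innovation (validation) gate: compare each measurement against the filter's own prediction and predicted variance, and skip the correction step when the normalised innovation is too large, so the filter simply extrapolates over that sample. Below is a rough, self-contained sketch of a 1D constant-velocity filter with such a gate. It is not the KalmanLatLong code; the noise values, the simplified process-noise model, and the gate threshold are assumptions that would need tuning.

```java
// Sketch of a 1D constant-velocity Kalman filter with an innovation gate.
// NOT the KalmanLatLong routine; q, r, and gate are assumed values to tune.
// Run one instance per axis (latitude/longitude, or projected x/y) at 5 Hz (dt = 0.2 s).
public final class GatedKalman1D {
    private double pos, vel;                            // state: position and velocity
    private double p00 = 1, p01 = 0, p10 = 0, p11 = 1;  // state covariance

    private final double q;     // process noise intensity (assumed)
    private final double r;     // measurement noise variance (assumed)
    private final double gate;  // threshold on squared normalised innovation (e.g. 9 ~ 3 sigma)

    public GatedKalman1D(double initialPos, double q, double r, double gate) {
        this.pos = initialPos; this.q = q; this.r = r; this.gate = gate;
    }

    /** Predicts dt seconds ahead, then either corrects with measurement z or rejects it. */
    public double update(double z, double dt) {
        // Predict with a constant-velocity model (simplified process noise).
        pos += vel * dt;
        double n00 = p00 + dt * (p01 + p10) + dt * dt * p11 + q * dt;
        double n01 = p01 + dt * p11;
        double n10 = p10 + dt * p11;
        double n11 = p11 + q * dt;
        p00 = n00; p01 = n01; p10 = n10; p11 = n11;

        // Innovation gate: if the fix is too far from the prediction relative to the
        // expected spread, skip the correction so the outlier never enters the state.
        double innovation = z - pos;
        double s = p00 + r;                          // innovation variance
        if (innovation * innovation / s > gate) {
            return pos;                              // outlier: keep the extrapolated prediction
        }

        // Standard Kalman correction.
        double k0 = p00 / s;
        double k1 = p10 / s;
        pos += k0 * innovation;
        vel += k1 * innovation;
        double c00 = (1 - k0) * p00, c01 = (1 - k0) * p01;
        double c10 = p10 - k1 * p00, c11 = p11 - k1 * p01;
        p00 = c00; p01 = c01; p10 = c10; p11 = c11;
        return pos;
    }
}
```

The point of the gate is that a rejected fix contributes nothing to the state: the filter simply coasts on its prediction for that 200 ms step, which is the "eliminate and extrapolate" behaviour we want, rather than the outlier being averaged into the track.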
In short, I am looking for an algorithm that works in real time, handles GPS updates at 5 Hz, and smooths the data while eliminating outliers.
Your help would be highly appreciated.