How to calculate the L1 and L2 norm of angle values with respect to ground truths, where the angle values wrap around on a -180 to 180 degree period?
If I have an array of data ip_data[] and an array of ground truth values gt[], then the L1 and L2 norms can be calculated as:
L1 norm = sum(abs(gt[i] - ip_data[i]))
L2 norm = sqrt(sum((gt[i] - ip_data[i])^2))
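In NumPy terms, the naive computation looks like this (a minimal sketch; the helper name naive_l1_l2 is mine, not a library function):

```python
import numpy as np

def naive_l1_l2(gt, ip_data):
    """Plain L1/L2 norms of the element-wise differences (no angle wrapping)."""
    diff = np.asarray(gt, dtype=float) - np.asarray(ip_data, dtype=float)
    l1 = np.sum(np.abs(diff))        # sum of |gt[i] - ip_data[i]|
    l2 = np.sqrt(np.sum(diff ** 2))  # sqrt of the sum of squared differences
    return l1, l2
```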
However, angle values are periodic: they wrap around on a 360 degree period and are typically represented in [-180, 180]. So, let's say
gt[] = [175, 179, 177]
ip_data_a[] = [165, 169, 167]
ip_data_b[] = [-175, -171, -173]
For ip_data_a[]:
L1 = |175 - 165| + |179 - 169| + |177 - 167| = 30
L2 = sqrt(10^2 + 10^2 + 10^2) = sqrt(300) ≈ 17.32
For ip_data_b[]:
L1 = |175 - (-175)| + |179 - (-171)| + |177 - (-173)| = 1050
L2 = sqrt(350^2 + 350^2 + 350^2) = sqrt(367500) ≈ 606.22
As can be seen, there is a huge difference in L1 (and L2) between ip_data_a[] and ip_data_b[], even though every value in both arrays is exactly 10 degrees away from the ground truth, just in opposite directions.
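A self-contained check that reproduces the numbers above (again, just the naive non-wrapping computation):

```python
import numpy as np

gt = np.array([175, 179, 177], dtype=float)
ip_data_a = np.array([165, 169, 167], dtype=float)
ip_data_b = np.array([-175, -171, -173], dtype=float)

for ip in (ip_data_a, ip_data_b):
    diff = gt - ip
    print(np.sum(np.abs(diff)), np.sqrt(np.sum(diff ** 2)))
# -> 30.0   17.3205...   for ip_data_a
# -> 1050.0 606.2177...  for ip_data_b
```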
Normalizing these angles from [-180, 180] to [0, 360] does not help, because the same problem occurs when the ground-truth angle is close to 360 degrees and the given value is close to 0 degrees, or vice versa.
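For instance, a quick sketch showing that the shift only moves the discontinuity rather than removing it:

```python
# Two angles that are truly 2 degrees apart in [-180, 180]:
a, b = 1.0, -1.0
print(abs(a - b))            # 2.0 -- fine in this representation

# After shifting to [0, 360) via modulo, the wrap point moves to 0/360:
a_s, b_s = a % 360, b % 360  # 1.0 and 359.0
print(abs(a_s - b_s))        # 358.0 -- the same artifact, just relocated
```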
So, for this type of data, what is the correct way of calculating the L1 and L2 norms so that the data can be assessed properly?