I am using scipy's ConvexHull class to construct a convex hull for a set of points. I am interested in a way to compute the minimum distance of a new point P from the convex hull.
With some help from the internet and a little tweaking of my own, I came up with this formula to compute the distance of a point P, or a set of points points, to the convex hull facets:
np.max(np.dot(self.equations[:, :-1], points.T).T + self.equations[:, -1], axis=-1)
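To make this reproducible, here is a minimal sketch of the same computation with a standalone ConvexHull object called hull (in my code self is a subclass, and the hull/query points below are just made-up example data):

    import numpy as np
    from scipy.spatial import ConvexHull

    rng = np.random.default_rng(0)
    hull_points = rng.random((30, 2))        # points defining the hull (2D here)
    hull = ConvexHull(hull_points)

    points = rng.random((5, 2)) * 2 - 0.5    # query points, some inside, some outside

    # hull.equations has shape (n_facets, ndim + 1): the outward facet normals and
    # offsets, so normal @ p + offset is the signed distance of p to each facet plane.
    facet_dist = np.max(
        np.dot(hull.equations[:, :-1], points.T).T + hull.equations[:, -1],
        axis=-1,
    )
    print(facet_dist)  # negative inside the hull, positive outside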
For a convex hull in 2D, the equation above results in the following plot:
As you can see, the result is pretty good and correct for points within the convex hull (the distance there is negative and would need to be multiplied by -1). It is also correct for points that are closest to a facet, but incorrect for points that are closest to a vertex of the convex hull (I marked these regions with the dashed lines). For these points the correct minimum distance would be the minimum distance to the convex hull vertices, as in the sketch below.
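By the minimum distance to the vertices I mean something like the following (a sketch, assuming the hull and points variables from the snippet above and using scipy's cdist; the variable names are my own):

    from scipy.spatial.distance import cdist

    # Coordinates of the hull vertices, then the smallest Euclidean distance
    # from each query point to any of those vertices.
    vertex_coords = hull.points[hull.vertices]
    vertex_dist = cdist(points, vertex_coords).min(axis=1)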
How can I distinguish between points that are closest to a facet and points that are closest to a vertex, so that I can correctly compute the minimum distance to the convex hull for a point P, or a set of points points, in an n-dimensional space (at least 3D)?