I have the following struct, which I would like to insert into a std::unordered_set for the fastest possible access based on its 3D coordinates.
#include <cmath>
#include <stdexcept>
#include <vector>

struct MeshVertex {
    float x;
    float y;
    float z;
    std::vector<MeshVertex*> links;

    float& operator[](int i) {
        switch (i) {
        case 0: return x;
        case 1: return y;
        case 2: return z;
        default: throw std::out_of_range("");
        }
    }

    MeshVertex& operator=(const STLFileVertex& other) {
        x = other.x;
        y = other.y;
        z = other.z;
        return *this;
    }

    bool operator<(const MeshVertex& other) const {
        if (x == other.x) {
            if (y == other.y)
                return z < other.z;
            return y < other.y;
        }
        return x < other.x;
    }

    bool operator==(const MeshVertex& other) const {
        return x == other.x && y == other.y && z == other.z;
    }

    bool operator!=(const MeshVertex& other) const {
        return !(*this == other);
    }

    double distance(const MeshVertex& other) const {
        return std::hypot(std::hypot(x - other.x, y - other.y), z - other.z);
    }
};
As expected, I get the following error:
The C++ Standard doesn't provide a hash for this type.
How do I implement a well-performing hash for this type? It only has to cover the members x, y and z, and the combination of hash and comparison must be 100% free of collisions, meaning a vertex may only be replaced if the newly inserted one has exactly the same values. Please consider that the vertices are stored as float, so a naive comparison could be misleading.
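One direction (a sketch, not the only way): hash the bit patterns of the three floats. Because operator== compares with ==, the hash must treat +0.0f and -0.0f as equal — they compare equal but have different bit patterns — so the sketch normalizes negative zero; NaN coordinates, which never compare equal to anything, would break the container's invariants regardless. The mixing constant below is arbitrary, not canonical.

```cpp
#include <cstdint>
#include <cstring>
#include <functional>
#include <unordered_set>

// Reduced MeshVertex for illustration (links and other members omitted).
struct MeshVertex {
    float x, y, z;
    bool operator==(const MeshVertex& o) const {
        return x == o.x && y == o.y && z == o.z;
    }
};

namespace std {
template <>
struct hash<MeshVertex> {
    size_t operator()(const MeshVertex& v) const noexcept {
        auto bits = [](float f) {
            uint32_t u;
            std::memcpy(&u, &f, sizeof u);  // bit-exact copy, no type-punning UB
            if (u == 0x80000000u) u = 0;    // map -0.0f onto +0.0f: they compare equal
            return u;
        };
        // Simple multiply-xor mix; any decent mixer works here.
        size_t h = bits(v.x);
        h = h * 0x9E3779B97F4A7C15ULL ^ bits(v.y);
        h = h * 0x9E3779B97F4A7C15ULL ^ bits(v.z);
        return h;
    }
};
}
```

Note that the hash itself does not need to be collision-free: std::unordered_set resolves hash collisions through operator==, so correctness only requires that equal vertices hash equally. With this in place, a newly inserted vertex is rejected exactly when all three coordinates compare equal to an existing one.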
Edit:
What I'm really doing is reading an STL (stereolithography) binary file. Those familiar with the STL format will know that triangles are written like this in the file:
float normals[3];
float vertex[3][3];
uint16_t attributes;
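For reference, that on-disk record (twelve floats plus the 16-bit attribute count, with no padding) can be mirrored with a packed struct. This is a sketch — the StlTriangle name is hypothetical and #pragma pack is compiler-specific, though MSVC, GCC and Clang all accept it:

```cpp
#include <cstdint>

#pragma pack(push, 1)        // binary STL records are packed: no padding bytes
struct StlTriangle {         // hypothetical name for the on-disk record
    float normal[3];         // facet normal
    float vertex[3][3];      // three corners, three floats each
    uint16_t attributes;     // attribute byte count, usually 0
};
#pragma pack(pop)

static_assert(sizeof(StlTriangle) == 50, "each binary STL triangle record is 50 bytes");
```

Reading then reduces to one 50-byte read per triangle, after skipping the 80-byte header and the 4-byte triangle count.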
That means vertices will be duplicated very often when reading the file. I want to normalize the vertices (removing duplicates) and apply links to recreate the triangles.
My goal is to map all the vertices in a graph through breadth-first search. The links, as previously mentioned, are available in the triangles. After all the links have been collected, the triangles become disposable.
If I don't have a reliable means of removing duplicated vertices, the links will be broken and my mapping will fail.
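One way to keep the links intact is to canonicalize every corner through a hash map while reading, and to store links as indices rather than MeshVertex* (a std::vector reallocation would invalidate raw pointers; indices survive it). Everything here — Key, canonical_index, add_triangle — is a hypothetical sketch of that approach, not the only possible layout:

```cpp
#include <cstdint>
#include <cstring>
#include <unordered_map>
#include <vector>

struct MeshVertex {
    float x, y, z;
    std::vector<size_t> links;   // neighbor indices into the vertex pool
};

struct Key {                     // just the coordinates, used as the map key
    float x, y, z;
    bool operator==(const Key& o) const {
        return x == o.x && y == o.y && z == o.z;
    }
};

struct KeyHash {
    size_t operator()(const Key& k) const {
        auto bits = [](float f) {
            uint32_t u;
            std::memcpy(&u, &f, sizeof u);  // bit-exact copy of the float
            return u;
        };
        size_t h = bits(k.x);
        h = h * 31 + bits(k.y);  // simple polynomial mix, constant arbitrary
        h = h * 31 + bits(k.z);
        return h;
    }
};

// Return the index of the vertex with these coordinates, creating it on first sight.
size_t canonical_index(const Key& k,
                       std::unordered_map<Key, size_t, KeyHash>& index_of,
                       std::vector<MeshVertex>& pool) {
    auto it = index_of.find(k);
    if (it != index_of.end())
        return it->second;                 // duplicate corner: reuse the vertex
    pool.push_back(MeshVertex{k.x, k.y, k.z, {}});
    return index_of[k] = pool.size() - 1;
}

// Record one triangle: canonicalize its corners, then link each corner to the other two.
void add_triangle(const Key (&tri)[3],
                  std::unordered_map<Key, size_t, KeyHash>& index_of,
                  std::vector<MeshVertex>& pool) {
    size_t i[3];
    for (int c = 0; c < 3; ++c)
        i[c] = canonical_index(tri[c], index_of, pool);
    for (int c = 0; c < 3; ++c) {          // may record an edge twice; dedupe later if needed
        pool[i[c]].links.push_back(i[(c + 1) % 3]);
        pool[i[c]].links.push_back(i[(c + 2) % 3]);
    }
}
```

Once every triangle has been folded in, pool holds each vertex exactly once together with its adjacency, and the triangles can be discarded before the breadth-first search.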