In my initial question (a detailed experimental investigation): Appropriate container for the fast insertion and lookup of n-dimensional real vectors (initial benchmarking provided), I observed really strange behaviour when using an unordered set to manage random N-dimensional double arrays with my initial (likely poorly designed) hash function:
#include <iostream>
#include <chrono>
#include <random>
#include <array>
#include <unordered_set>

const int N = 3; // Dimensionality of the arrays

std::array<double, N> getRandomArray() {
    // Engines and distributions retain state, thus defined as static
    static std::default_random_engine e;                   // engine
    static std::uniform_real_distribution<double> d(0, 1); // distribution
    std::array<double, N> ret;
    for (size_t i = 0; i < N; ++i) {
        ret[i] = d(e);
    }
    return ret;
}

// Return squared Euclidean distance
template <typename InputIt1, typename InputIt2>
double EuclideanDistance2(InputIt1 beg1, InputIt1 end1, InputIt2 beg2) {
    double val = 0.0;
    while (beg1 != end1) {
        double dist = (*beg1++) - (*beg2++);
        val += dist * dist;
    }
    return val;
}

struct ArrayHash { // Hash function
    std::size_t operator()(const std::array<double, N>& arr) const {
        std::size_t ret = 0;
        for (const double elem : arr) {
            ret += std::hash<double>()(elem);
        }
        return ret;
    }
};

struct ArrayEqual { // Equivalence criterion
    bool operator()(const std::array<double, N>& arr1,
                    const std::array<double, N>& arr2) const {
        return EuclideanDistance2(arr1.begin(), arr1.end(), arr2.begin()) < tol * tol;
    }
private:
    static constexpr double tol = 1e-6; // Comparison tolerance
};

int main() {
    // Create an unordered set of double arrays (usda)
    std::unordered_set<std::array<double, N>, ArrayHash, ArrayEqual> usda;

    // Record start time
    auto start = std::chrono::steady_clock::now();

    // Generate and insert one hundred thousand new double arrays
    for (size_t i = 0; i < 100000; ++i) {
        // Get a new random double array (da)
        std::array<double, N> da = getRandomArray();
        usda.insert(da);
    }

    // Record finish time
    auto end = std::chrono::steady_clock::now();
    std::chrono::duration<double> diff = end - start;
    std::cout << "Time to generate and insert unique elements into UNORD. SET: "
              << diff.count() << " s\n";
    std::cout << "unord. set size() = " << usda.size() << std::endl;
    return 0;
}
The two strangest things are:
- Running the experiments without any optimization flags, even with a loose tolerance (1e-1), almost all random vectors (implemented as N-dimensional arrays) were identified as unique. I haven't observed this using vectors and sets.
- With the -O3 optimization flag turned on, the number of unique elements differs significantly from the number without optimization, which surely indicates that something is wrong with my approach.
Edit: the second problem was solved by taking @WhozCraig's remark into account.
So, my question is: is this strange behaviour caused by my badly designed hash function? If so, can you suggest how to make a better hash function for my case?