As far as I understand, std::unique_ptr<T> is not supposed to have such a huge overhead. What am I doing wrong?
size_t t = sizeof(DataHelper::SEQ_DATA); // t = 12
std::vector<std::vector<std::unique_ptr<DataHelper::SEQ_DATA>>> d(SEQ_00_SIZE + 1); // SEQ_00_SIZE = 4540
for (unsigned int i = 0; i < d.size(); ++i) {
    for (unsigned int k = 0; k < 124668; ++k) {
        std::unique_ptr<DataHelper::SEQ_DATA> sd = std::make_unique<DataHelper::SEQ_DATA>();
        d[i].push_back(std::move(sd));
    }
}
takes about 21 GB of RAM.
std::vector<std::vector<DataHelper::SEQ_DATA>> d(SEQ_00_SIZE + 1);
for (unsigned int i = 0; i < d.size(); ++i) {
    for (unsigned int k = 0; k < 124668; ++k) {
        DataHelper::SEQ_DATA sd;
        d[i].push_back(sd);
    }
}
takes about 6.5 GB of RAM.
Additional information:
struct SEQ_DATA {
    uint16_t id = 0;
    uint16_t label = 0;
    float intensity = 0.0f;
    float z = 0.0f;
};
I just want a single vector<vector<T>> that holds my 4540 * 124668 objects as efficiently as possible. I read the values from binary files. Since the number of elements per binary file varies, I cannot initialize the inner vectors with the correct size up front (i.e. 124668 is only correct for the first file).
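Since the element count per file is unknown in advance, one option is to derive it from the file size instead of hard-coding it. The sketch below is an assumption-laden illustration, not the question's actual I/O code: read_file is a hypothetical helper, and it presumes each file is a raw, tightly packed array of SEQ_DATA records (sizeof(SEQ_DATA) == 12, no header).

```cpp
#include <cstdint>
#include <fstream>
#include <string>
#include <vector>

// Same layout as DataHelper::SEQ_DATA in the question.
struct SEQ_DATA {
    uint16_t id = 0;
    uint16_t label = 0;
    float intensity = 0.0f;
    float z = 0.0f;
};

// Hypothetical helper: size the inner vector from the file length,
// then read all records in one pass. Assumes the file is a raw
// array of SEQ_DATA with no header or padding between records.
std::vector<SEQ_DATA> read_file(const std::string& path) {
    std::ifstream in(path, std::ios::binary | std::ios::ate);
    const std::streamsize bytes = in.tellg();
    in.seekg(0);
    std::vector<SEQ_DATA> v(static_cast<std::size_t>(bytes) / sizeof(SEQ_DATA));
    in.read(reinterpret_cast<char*>(v.data()),
            static_cast<std::streamsize>(v.size() * sizeof(SEQ_DATA)));
    return v;
}
```

Each inner vector is then allocated exactly once at the right size, avoiding both the hard-coded 124668 and the per-element reallocations of repeated push_back.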
GCC 9.3.0, C++17