
I am relatively new to C++ and I'm having an issue with my project. I ran the following code a few times without a problem, but now when I try to run it, it gives me a std::bad_alloc error. The code is C++, but some lines are specific to ROOT, a data-analysis framework written in C++ for particle physicists.

class Particle {
public:
    int pdgid;
    float px;

    // Member initializer list; assigning "px = px;" in the body would self-assign the parameter
    Particle(int pdg, float px) : pdgid(pdg), px(px) {}
};

TFile* file = new TFile("filename.root");      //ROOT code, where particle values are obtained from. 
TTree* tree = (TTree*)file->Get("FlatTree");   //tree is where all events and associated values are held

vector<Particle> allparticles;

for (Long64_t iEntry = 0; iEntry < tree->GetEntries(); iEntry++) {
    tree->GetEntry(iEntry);
    // nfsp, pdg[] and px[] are branch variables declared elsewhere (not shown)
    for (int iVecEntry = 0; iVecEntry < nfsp; iVecEntry++) {
        allparticles.push_back(Particle(pdg[iVecEntry], px[iVecEntry]));
    }
}

The code works if I decrease the limit of the first for loop. The number of entries is quite large (over 2 million) and nfsp can be up to 24 depending on the event. This resulted in the vector allparticles having over 7 million Particle objects.

I think the problem lies with not having enough memory to allocate such a large vector but how was this working previously? Is it possible that the memory wasn't deallocated properly the first few times I ran the code?

I am a bit confused about memory management. In C++, does the OS handle deallocation, or do I have to write a destructor?

I have tried including a destructor but could not get it to work. Judging from the std::bad_alloc, am I using too much memory? I tried adding a delete[] statement at the end of the code, but that doesn't work either.
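
To make the question concrete, here is a minimal standalone sketch (the values are made up, not from my actual code) of the two cases I am unsure about:

void sketch_automatic() {
    vector<Particle> v;               // the vector object itself has automatic storage
    v.push_back(Particle(11, 1.0f));  // its elements live in heap memory that the vector owns
}   // v goes out of scope here and its destructor frees that heap memory automatically

void sketch_manual() {
    Particle* p = new Particle(11, 1.0f); // manual dynamic allocation
    delete p;                             // must be paired with delete, otherwise it leaks
}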

Any input and help is much appreciated!

P.S. I'm running Linux Mint 18.2 Sonya.

1 Answer


Yes, it sounds like you have run out of stack memory. There are many tutorials out there that explain heap vs. stack memory.

You are creating your particles in stack memory, which means they will be automatically destroyed when they go out of scope. Your stack memory size varies depending on the compiler and environment, but you get far less stack memory than heap memory.
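
A tiny illustration of the stack/heap distinction being described here (the values are made up):

void example() {
    Particle local(11, 1.0f);                 // automatic ("stack") storage: destroyed at end of scope
    Particle* heap = new Particle(11, 1.0f);  // dynamic ("heap") storage: lives until it is deleted
    delete heap;                              // manual cleanup required for heap allocations
}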

To fix this, I would create a vector of pointers to Particles, and create the Particles dynamically. Example:

vector<Particle*> allparticles;
...
allparticles.push_back(new Particle(pdg[iVecEntry],px[iVecEntry]));

Remember to delete the dynamically allocated heap memory when you are done with it. Example:

for (size_t i = 0; i < allparticles.size(); i++) {
    delete allparticles[i];
}
– Mike
    This is incorrect. The vector is on the stack, but the contents of the vector are not. `std::vector` manages heap memory to hold its contents. You don't need extra indirection and manual memory management. – Adrian McCarthy Jan 23 '18 at 18:39
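
A minimal sketch of what this comment describes, using a vector of values that manages its own heap storage (reserve is optional and the count shown is illustrative):

vector<Particle> allparticles;
allparticles.reserve(7000000);   // optional: pre-allocate roughly the expected number of elements
...
allparticles.push_back(Particle(pdg[iVecEntry], px[iVecEntry]));   // stores values, no new/delete
// when allparticles goes out of scope, its destructor releases the heap storage automatically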