I have a program that solves for 1D Brownian motion using Euler's method. Since it is a stochastic process, I want to average over many particles. But I find that as I ramp up the number of particles, it runs out of memory and I get a std::bad_alloc error, which I understand is a memory error.

Here is my full code

#include <iostream>
#include <vector>
#include <fstream>
#include <cmath>
#include <cstdlib>
#include <limits>
#include <ctime>

using namespace std;

// Box-Muller Method to generate gaussian numbers
double generateGaussianNoise(double mu, double sigma) {
    const double epsilon = std::numeric_limits<double>::min();
    const double tau = 2.0 * 3.14159265358979323846;

    static double z0, z1;
    static bool generate;
    generate = !generate;

    if (!generate) return z1 * sigma + mu;

    double u1, u2;
    do {
        u1 = rand() * (1.0 / RAND_MAX);
        u2 = rand() * (1.0 / RAND_MAX);
    } while (u1 <= epsilon);

    z0 = sqrt(-2.0 * log(u1)) * cos(tau * u2);
    z1 = sqrt(-2.0 * log(u1)) * sin(tau * u2);
    return z0 * sigma + mu;
}

int main() {
    // Initialize Variables
    double gg;  // Gaussian Number Picked  from distribution

    // Integrator
    double t0 = 0;  // Setting the Time Window
    double tf = 10;
    double n = 5000;           // Number of Steps
    double h = (tf - t0) / n;  // Time Step Size

    // Set Constants
    const double pii = atan(1) * 4;  // pi
    const double eta = 1;            // viscous constant
    const double m = 1;              // mass
    const double aa = 1;             // radius
    const double Temp = 30;          // Temperature in Kelvins
    const double KB = 1;             // Boltzmann Constant
    const double alpha = (6 * pii * eta * aa);

    // More Constants
    const double mu = 0;  // Gaussian Mean
    const double sigma = 1;  // Gaussian Std Deviation
    const double ng = n;  // No. of pts to generate for Gauss distribution
    const double npart = 1000;  // No. of Particles

    // Initial Conditions
    double x0 = 0;
    double y0 = 0;
    double t = t0;
    // Vectors
    vector<double> storX;  // Vector that keeps displacement values
    vector<double> storY;  // Vector that keeps velocity values

    vector<double> storT;  // Vector to store time
    vector<double> storeGaussian;  // Vector to store Gaussian numbers generated

    vector<double> holder;  // Placeholder Vector for calculation   operations
    vector<double> mainstore;  // Vector that holds the final value desired

    storT.push_back(t0);

    // Prepares mainstore
    for (int z = 0; z < (n+1); z++) {
        mainstore.push_back(0);
    }

    for (int NN = 0; NN < npart; NN++) {
        holder.clear();
        storX.clear();
        storY.clear();
        storT.clear();
        storT.push_back(0);

        // Prepares holder
        for (int z = 0; z < (n+1); z++) {
            holder.push_back(0);

            storX.push_back(0);

            storY.push_back(0);
        }

        // Gaussian Generator
        srand(time(NULL));
        for (double iiii = 0; iiii < ng; iiii++) {
            gg = generateGaussianNoise(0, 1);  // generateGaussianNoise(mu,sigma)
            storeGaussian.push_back(gg);
        }

        // Solver
        for (int ii = 0; ii < n; ii++) {
            storY[ii + 1] =
                storY[ii] - (alpha / m) * storY[ii] * h +
                (sqrt(2 * alpha * KB * Temp) / m) * sqrt(h) * storeGaussian[ii];
            storX[ii + 1] = storX[ii] + storY[ii] * h;
            holder[ii + 1] =
                pow(storX[ii + 1], 2);  // Finds the displacement squared

            t = t + h;
            storT.push_back(t);
        }

        // Updates the Main Storage
        for (int z = 0; z < storX.size(); z++) {
            mainstore[z] = mainstore[z] + holder[z];
        }
    }

    // Average over the number of particles
    for (int z = 0; z < storX.size(); z++) {
        mainstore[z] = mainstore[z] / (npart);
    }

    // Outputs the data
    ofstream fout("LangevinEulerTest.txt");
    for (int jj = 0; jj < storX.size(); jj++) {
        fout << storT[jj] << '\t' << mainstore[jj] << '\t' << storX[jj] << endl;
    }

    return 0;
}

As you can see, npart is the variable I change to vary the number of particles. But after each iteration I do clear my storage vectors like storX, storY... so on paper, the number of particles should not affect memory, should it? I am only repeating the same loop many more times and adding onto the main storage vector mainstore. I am running my code on a computer with 4 GB of RAM.

Would greatly appreciate it if anyone could point out my errors in logic or suggest improvements.

Edit: Currently the number of particles is set to npart = 1000. When I try to ramp it up to something like npart = 20000 or npart = 50000, it gives me memory errors.

Edit 2: I've edited the code to allocate an extra index in each of the storage vectors, but it does not seem to fix the memory overflow.

Candy Man
  • Could you spend some time indenting your code properly, please? You'll get more people looking at it if it's easier to read. – Roger Lipscombe Sep 17 '15 at 09:10
  • My apologies. So you suggest I should break up the sections so it isn't one long chunk? – Candy Man Sep 17 '15 at 09:11
  • @CandyMan I just fixed the indenting in your code. It seems to work well though. – syntagma Sep 17 '15 at 09:13
  • @REACHUS Ah thanks, I see what you mean. I'll keep that in mind next time. As for the number of particles (**npart**), I tried ramping it up to 10,000 or 50,000, which crashes the thing. So I'm not so sure on this part. – Candy Man Sep 17 '15 at 09:15
  • @CandyMan: Have you passed this program over `valgrind` or anything? – 3442 Sep 17 '15 at 09:15
  • Valgrind also doesn't report anything worrisome when npart=1000. – syntagma Sep 17 '15 at 09:17
  • @KemyLand I have not heard of valgrind before, but I'm looking at it now! – Candy Man Sep 17 '15 at 09:18
  • Your solver loop accesses the vectors out of bounds in the last iteration. That could very well cause a bad_alloc at a later point in time. – ComicSansMS Sep 17 '15 at 09:19
  • Valgrind output: `==27955== HEAP SUMMARY: in use at exit: 0 bytes in 0 blocks; total heap usage: 96 allocs, 96 frees, 134,881,800 bytes allocated. All heap blocks were freed -- no leaks are possible. ERROR SUMMARY: 0 errors from 0 contexts (suppressed: 0 from 0)` – 3442 Sep 17 '15 at 09:19
  • @ComicSansMS: Valgrind says otherwise... – 3442 Sep 17 '15 at 09:19
  • Ran the program on my machine with `npart = 10000` and it didn't crash (I have 2 GB RAM, BTW). Maybe you have too little RAM? Passing this by Valgrind now! – 3442 Sep 17 '15 at 09:21
  • @KemyLand Well, I don't need valgrind to see that `storY[n]` on a vector with `n` elements is out-of-bounds. – ComicSansMS Sep 17 '15 at 09:21
  • @ComicSansMS: But I don't see how an off-by-one error may throw `std::bad_alloc` instead of `std::out_of_range`... The standard doesn't mandate this, but it's not too logical from a library implementor's point of view. – 3442 Sep 17 '15 at 09:23
  • @KemyLand Accessing using `vector[x]` doesn't do range-checks, you need `vector.at(x)` for that. So you'll corrupt memory and that can cause anything. – Martin C. Sep 17 '15 at 09:24
  • Not the cause of the problem, but you should not call srand in a loop, only once at startup! – Werner Henze Sep 17 '15 at 09:24
  • @KemyLand The standard mandates undefined behavior, which can very well lead to a `bad_alloc` in this case (or demons flying out of your nostrils, but I guess the OP just got lucky here). – ComicSansMS Sep 17 '15 at 09:24
  • @ComicSansMS: That's what I meant, the standard doesn't mandate anything, but it's still nonsense anyway for the library to throw `std::bad_alloc`... – 3442 Sep 17 '15 at 09:25
  • @Martin C: Depends on your runtime. I use Visual Studio and it does when debugging. – Gombat Sep 17 '15 at 09:25
  • @WernerHenze: Why should you? – 3442 Sep 17 '15 at 09:26
  • @Gombat it is allowed to do runtime checks for `operator[]`, since the standard indicates that `Portable programs should never call this function with an argument n that is out of range, since this causes undefined behavior.` `vector::at` _needs_ to be range-checked. The unchecked variant can do whatever it wants when being called out-of-range. – Martin C. Sep 17 '15 at 09:28
  • @WernerHenze I call srand in the loop because I want the Gaussian points generated for each different particle to be different. So do you mean that by putting it outside, it doesn't actually make a difference? – Candy Man Sep 17 '15 at 09:29
  • @CandyMan You can call srand in the loop, but it does not make sense. See https://stackoverflow.com/questions/13519965/c-program-srand and other answers on SO when you search for "srand". – Werner Henze Sep 17 '15 at 09:59
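
To illustrate the srand point from the comments above, a minimal sketch (the loop bound here is only a placeholder, not taken from the question). The idea is to seed once at program start: `time(NULL)` changes only once per second, so re-seeding inside the particle loop can hand many particles an identical random sequence.

#include <cstdlib>
#include <ctime>

int main() {
    srand(time(NULL));  // seed once, at program start
    for (int NN = 0; NN < 1000; NN++) {
        // ... draw this particle's random numbers with rand() here ...
        // No srand() in here: re-seeding with time(NULL) on every pass
        // restarts the sequence, and passes that fall within the same
        // second would all draw identical numbers.
    }
    return 0;
}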

2 Answers

There is an out-of-bounds access in the solver part: storY has size n and you access index ii+1, where ii goes up to n-1. So for the code as provided, storY has size 5000; it may be accessed at indices 0 through 4999 (inclusive), but you try to access index 5000. The same goes for storX, holder and mainstore.
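
For concreteness, a minimal self-contained illustration of the sizing that the ii+1 access requires (the update line is just a stand-in, not the Langevin step from the question):

#include <vector>

int main() {
    const int n = 5000;
    // n + 1 slots, so that index ii + 1 is still valid when ii == n - 1
    std::vector<double> storY(n + 1, 0.0);
    for (int ii = 0; ii < n; ii++) {
        storY[ii + 1] = storY[ii] + 1.0;  // stand-in for the real update
    }
    return 0;
}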

Also, storeGaussian does not get cleared before new values are added. It grows by n on every pass of the npart loop, even though the solver part only ever reads its first n values.
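
That unbounded growth is what exhausts memory as npart goes up: with n = 5000, storeGaussian gains 5000 doubles (about 40 KB) per particle, which reaches roughly 2 GB by npart = 50000, before counting reallocation overhead. A sketch of the fix, keeping the names from the question (clear the buffer at the top of each particle iteration):

for (int NN = 0; NN < npart; NN++) {
    storeGaussian.clear();  // drop the previous particle's samples;
                            // without this the vector grows by n every pass

    // ... reset holder, storX, storY, storT as before ...

    for (int i = 0; i < ng; i++) {
        storeGaussian.push_back(generateGaussianNoise(mu, sigma));
    }

    // ... solver and accumulation into mainstore as before ...
}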

Gombat
  • Thanks for spotting that (also @ComicSansMS). I have now allocated a size of 5001 for `storY`, `storX` and `holder`, but the `std::bad_alloc` error still persists for, say, `npart = 20000`. – Candy Man Sep 17 '15 at 09:39
  • Yeah, I missed `holder`, but `mainstore` is one too small, too. If I resize them all to one bigger, I get no memory leaks or bad_allocs. – Gombat Sep 17 '15 at 09:47
  • Ah yes, regarding `mainstore`, I meant to type that in the earlier comment too. Just to be sure that I am thinking the same thing as you: I would just change the loops that size them to something like `for (int z = 0; z < (n+1); z++)`, right? I did that and am still getting memory errors! – Candy Man Sep 17 '15 at 09:51
  • Yes, if that way the specified variables have size n+1, we are talking about the same thing. But: is it right that `storeGaussian` never gets cleared? In the solver part you access the first n values, but in the Gaussian generator part you add new values to the end. Setting n to 500000 let me reproduce your error, and clearing `storeGaussian` right before the Gaussian generator for loop resolved it for me. – Gombat Sep 17 '15 at 10:07

Please note that vector::clear removes all elements from the vector, but does not necessarily change the vector's capacity (i.e. its storage array); see the documentation.

This is not the cause of the problem here, because you will reuse the same array in the next runs, but it is something to be aware of when using vectors.
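
A small self-contained demonstration of that behaviour (the exact capacity values are implementation-dependent):

#include <iostream>
#include <vector>

int main() {
    std::vector<double> v(5000, 0.0);
    std::cout << v.size() << " / " << v.capacity() << '\n';  // 5000 / at least 5000

    v.clear();  // size drops to 0, but the allocation is typically kept
    std::cout << v.size() << " / " << v.capacity() << '\n';  // 0 / usually still 5000

    v.shrink_to_fit();  // C++11: non-binding request to actually release the storage
    std::cout << v.size() << " / " << v.capacity() << '\n';
    return 0;
}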

Martin C.