
I was working on a C++ program to construct molecules and perform vector operations on them. To get the probability in a particular problem, I had to take many random observations and compute a ratio. My program works fine when I take 1000 outcomes, but it got stuck nearly forever when I took 10000 outcomes (far more than 10 times the time), and the computer became unresponsive. To find out where the problem was, after a bit of debugging I narrowed it down to this (my vector rotation loop):

```cpp
#include <iostream>
using std::cout;

// Vector, geom3D::EulerAngle and rotateVector() come from my molecule library
// (not shown here).
int main()
{
    Vector v1(3, 4, 0);
    geom3D::EulerAngle EA(90, 0, 0);
    int K = 120000;
    for (int i = 0; i < K; i++)
    {
        v1 = rotateVector(v1, EA);
        if (i % 3000 == 0)
            cout << i << "\n";
    }
    cout << "Done\n";
}
```

It was supposed to print 0, 3000, 6000, … and I expected it to do so at regular intervals. However, I get the same problem here too: up to 51000 it prints pretty fast, then the computer becomes unresponsive again. My guess was that it is a memory-allocation problem: that rotateVector() was allocating a lot of memory it wasn't freeing, so I tried adding delete statements wherever I could, but that didn't fix it. Is such behaviour normal? Is there a solution? Also, are delete statements all I can do to free memory?
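I haven't posted rotateVector() here, but to illustrate my guess, here is a hypothetical sketch (not my actual code) of how a rotation function that heap-allocates its result would leak one object per call, versus a by-value version that has nothing to free:

```cpp
#include <cmath>

struct Vector { double x, y, z; };  // hypothetical stand-in for my Vector class

// Leaky shape: the heap temporary is copied out and its pointer is lost,
// so nothing can ever delete it -- one leaked Vector per call.
Vector rotateZ_leaky(const Vector& v, double deg)
{
    const double r = deg * std::acos(-1.0) / 180.0;  // degrees -> radians
    Vector* t = new Vector{ v.x * std::cos(r) - v.y * std::sin(r),
                            v.x * std::sin(r) + v.y * std::cos(r),
                            v.z };
    return *t;  // *t is copied out; t itself is never deleted
}

// Leak-free shape: plain value semantics, no new/delete at all.
Vector rotateZ_byValue(const Vector& v, double deg)
{
    const double r = deg * std::acos(-1.0) / 180.0;
    return { v.x * std::cos(r) - v.y * std::sin(r),
             v.x * std::sin(r) + v.y * std::cos(r),
             v.z };
}
```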

Sreyas Adury
  • "adding delete statements wherever I could" seems to me definitely a "spray and pray" approach... You free memory when it's no longer needed, not at random, and writing cleanup code as an afterthought is a recipe for memory leaks. Besides, if you are dealing with arrays, why are you using raw `new` and `delete` instead of `std::vector`? Most importantly, given that you say the problem is most probably in `rotateVector`, why didn't you post it? – Matteo Italia Dec 20 '18 at 01:47
  • Yes, you probably have a memory leak. To actually fix that (rather than just put in delete statements and hope for the best) you usually want to use RAII, so you create objects which allocate the necessary data when they're created and free it when they're destroyed (see the sketch after these comments). To get at least a somewhat better idea of whether you're looking at a memory leak, you might want to watch the program's memory consumption over time. If it's constantly rising, that's a good sign of a leak (though not all leaks are easily observed this way). – Jerry Coffin Dec 20 '18 at 02:14
  • You need a quantum computer... – takintoolong Dec 20 '18 at 02:19
  • By "Wherever I could", I mean wherever there was no need for the object afterwards. Thanks for the pointers though. Will look in – Sreyas Adury Dec 20 '18 at 02:41
  • Try valgrind to look for memory leaks (an example invocation follows these comments). – Serge Dec 20 '18 at 03:11
  • Assuming that your algorithm (or a memory leak) isn't causing your computer to run out of RAM and start swapping (which would definitely slow things down dramatically), the other explanation is that you are running an algorithm with greater-than-linear scaling behavior. O(N^2) algorithms, in particular, are notorious for performing adequately at small problem sizes, and then rapidly deteriorating into intractability as the problem size gets larger. You should run your program under a profiler and see where it spends its time, then see if you can rewrite that algorithm to be more efficient. – Jeremy Friesner Dec 20 '18 at 05:45
  • Supply the rotateVector code. Nobody will be able to assist if you don't supply the code that is causing the error. – bradgonesurfing Dec 20 '18 at 08:21
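To make the RAII and `std::vector` suggestions above concrete, here is a minimal sketch (the `Atom` type and function names are hypothetical, not from the question). The container releases its memory automatically when it goes out of scope, on every exit path, so there is no delete to forget:

```cpp
#include <cstddef>
#include <vector>

struct Atom { double x, y, z; };  // hypothetical example type

// Manual management: every early return or exception skips the delete[].
void buildMolecule_raw(std::size_t n)
{
    Atom* atoms = new Atom[n];
    // ... use atoms ...
    delete[] atoms;  // must be reached on every path, or memory leaks
}

// RAII: the vector owns its storage and frees it in its destructor.
void buildMolecule_raii(std::size_t n)
{
    std::vector<Atom> atoms(n);
    // ... use atoms ...
}   // storage released here automatically
```

For the valgrind suggestion, a typical invocation is `valgrind --leak-check=full ./program` on a binary compiled with `-g`; it reports how many bytes were definitely lost and where they were allocated.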

1 Answer


You have found a limit to your computer's resources. It is normal. As awesome as computers seem to be under normal usage, they do have limits to their capabilities. Because of this, you sometimes have to find a creative way to constrain your program, by putting limits on the number of iterations or even on the size of the numbers involved. Sometimes this must be done at the point where the user enters an input, restricting the possible inputs. To illustrate, here are two examples from my own experience where high repetition causes a slowdown.

First example: create an Excel file and put the random number function in a cell. Copy the cell horizontally across 10000 cells, then copy that whole row down 10000 rows. Press F9 to recalculate.

Second example: create an Adobe Illustrator file. Create a vector graphic with several vectors. Copy and paste the graphic 10000 times. Save and close the project. Reopen the project.

takintoolong
  • You're taking a wild guess with inadequate evidence to support the conclusion you've reached. It's possible that you're right, but it strikes me as fairly doubtful (a timing sketch after these comments shows one way to gather evidence either way). – Jerry Coffin Dec 20 '18 at 02:34
  • I didn't "reach the limit" (I hope). I keep replacing the same Vector over and over. It can't take any more space than one Vector object, right? – Sreyas Adury Dec 20 '18 at 02:37
  • It is not a wild guess; if anything, it lacks a technical explanation. The OP and commenters have suggested memory issues, and it is quite normal for a computer to slow down when such resources are overused. I was primarily answering the OP's question of whether this is normal, and offering other examples of large amounts of calculation slowing things down. – takintoolong Dec 20 '18 at 02:39
  • @SreyasAdury I am referring to the for loop: as the answer and the calculation get larger, the computer has to work harder to get the answer. – takintoolong Dec 20 '18 at 02:51
  • @SreyasAdury Sorry I can't explain it better; I am just suggesting that sometimes computers can't handle large numbers of calculations. You are working with molecules... this is why people are working so hard to perfect quantum computers: to perform complex calculations with molecules, among other things. Sometimes you have to accept the limits, and maybe set "int K" to a lower number. – takintoolong Dec 20 '18 at 03:04
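One way to settle the disagreement above (whether each iteration really gets slower, or the machine is just overloaded) is to time the loop in fixed-size chunks. Roughly constant chunk times point to an external cause; steadily growing chunk times point at the loop itself, such as a leak forcing swapping or super-linear work per call. A minimal, self-contained sketch with a trivial stand-in for the real rotation:

```cpp
#include <chrono>
#include <iostream>

int main()
{
    using clock = std::chrono::steady_clock;
    const int K = 120000, chunk = 3000;

    auto last = clock::now();
    double v = 1.0;  // stand-in for the real Vector state
    for (int i = 1; i <= K; i++)
    {
        v = -v;      // stand-in for rotateVector(v1, EA)
        if (i % chunk == 0)
        {
            auto now = clock::now();
            std::chrono::duration<double, std::milli> ms = now - last;
            std::cout << i << ": " << ms.count() << " ms\n";
            last = now;
        }
    }
}
```

If the printed times grow from chunk to chunk in the real program, the problem is inside rotateVector(); if they stay flat while the machine still bogs down, the cause lies elsewhere.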