
I am writing a C++ program for an embedded STM32 platform. I used to use plain arrays, but for the first time I am using vectors. So far it runs well in the majority of cases. However, in a very few cases I get the error stated in the title, even though it shouldn't occur. I would appreciate some help, as I have run out of ideas.

My situation

The vectors are declared like this:

struct Boundary{
  vector<unsigned short> x; //The x coordinates
  vector<unsigned short> y; //The y coordinates  
};

Every time I use these vectors, I clear them with

boundary0.x.clear();
boundary0.y.clear();

I add elements with the normal `push_back`.

The strange part

Sometimes, the program terminates with "Operator new out of memory" when adding elements to the vector.

"Ah, you ran out of memory!" you might say, but that is the strange part. The vector so far has only 275 elements, which, being shorts, amounts to 550 bytes.

But this very same program has handled the same vector with many more elements (500 or more) without any problem.

"Somehow, you previously leaked memory!" you might also say, and I suspected that. Perhaps I used the vector before and failed to clean it up (although I cleared it as stated), but this error appears even when I disconnect and reconnect the processor, wiping out any previously used memory.

I am at a loss as to why this could be happening. Any help, comment, or advice is greatly appreciated.

KansaiRobot
  • To detect a memory leak, you probably want to use either valgrind or asan (leak detector in clang). – Justin Jun 07 '17 at 06:26
  • Presumably you weren't dynamically resizing the arrays previously, but you are resizing the vector version. A vector's `clear()` member function does not release memory. It affects the vector's `size()`, not its `capacity()`. If you don't understand the distinction, read the documentation for those member functions. Beyond that, it's not possible to help, since you've provided no relevant information. Try providing an [mcve]. – Peter Jun 07 '17 at 06:37
  • Thanks, I understand the difference. Is there a way to release memory for the vector? On the other hand, I don't know how to create an MCV example, since this works in the majority of cases and I have no idea how to make it fail on purpose. My apologies – KansaiRobot Jun 07 '17 at 06:45
  • Also, when this happens, it happens the very first time the vector is used (so there is no memory left over from before) – KansaiRobot Jun 07 '17 at 06:46

2 Answers


What you need is to reduce the vector's capacity to its size after the vector has been used and `clear()` has been called.

C++11 solution:

Call `shrink_to_fit()` after `clear()` to release the previously allocated memory.

boundary0.x.clear();
boundary0.x.shrink_to_fit();
boundary0.y.clear();
boundary0.y.shrink_to_fit();

It reduces the capacity of the vector to equal its size, which after `clear()` is zero. Note that `shrink_to_fit` was introduced in C++11, and that it is formally a non-binding request, although mainstream implementations honor it.

C++03 and earlier:

The 'swap-to-fit' idiom can be used to achieve the same effect as `shrink_to_fit`.

std::vector<T>(my_vector.begin(), my_vector.end()).swap(my_vector);

will reduce `my_vector`'s capacity to its size.

This idiom is described here, with a detailed explanation of how exactly it works: https://en.wikibooks.org/wiki/More_C%2B%2B_Idioms/Shrink-to-fit

k.v.

When adding an element to a vector with `push_back` and the number of elements reaches the current capacity, the internally reserved buffer is reallocated (i.e. the existing one may be freed and a larger chunk of memory allocated). This can "fragment" your memory: smaller chunks of free memory become available, yet the allocator needs ever larger contiguous chunks, which at some point it may no longer find on a system with little memory. Hence, if you do this very often with a lot of vectors, it can become a problem on an embedded system.

Hard to say if this is actually the reason, but I'd try to initialize the vector with a capacity that it will most likely not exceed. Maybe that solves your problem. So you could try:

struct Boundary{
    vector<unsigned short> x = vector<unsigned short>(500); //The x coordinates
    vector<unsigned short> y = vector<unsigned short>(500); //The y coordinates
};
Stephan Lechner
  • Thank you, I will try that and report. However, doesn't that defeat the purpose of using vector? I mean, I decided to use vector in the first place for several reasons; one was that, since the size of the vector changes every time, the size would not be fixed from the beginning, to save memory. Maybe I am mistaking my objectives here – KansaiRobot Jun 07 '17 at 06:35
  • If this is the problem, you can try using `std::deque`, which doesn't have to allocate one contiguous chunk for all the data – Caleth Jun 07 '17 at 06:47
  • Moreover, when reallocating, the allocator might not be able to grow the buffer in place, which means the vector must allocate a new buffer, copy/move the elements from the old buffer to the new one, and free the old buffer. So there may be a moment when both the old and new buffers coexist, and there must be enough memory to fit both. – el.pescado - нет войне Jun 07 '17 at 07:09
  • Is there any way to know that the vector is being reallocated? Perhaps by checking capacity() first? In any case it is very strange, because the program handles larger vectors without problem almost all of the time. I can't even figure out what is different between the failure and success cases – KansaiRobot Jun 07 '17 at 15:15
  • Yes, you can check capacity. Cf, for example, https://stackoverflow.com/a/5410129/2630032 – Stephan Lechner Jun 07 '17 at 15:24