
I use Keras to run a neural network, and it runs into this warning:

W tensorflow/core/framework/allocator.cc:101] Allocation of 19267584 exceeds 10% of system memory.

I have already seen some discussion about this, such as reducing the batch size (which works for me), but I also want to know whether there is a permanent way to solve this issue. For example, could I install an extra SSD to increase the physical storage of my Linux system?
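
For reference, here is roughly what the batch-size change looks like (a minimal sketch with placeholder data and a placeholder model, not my real code):

    import numpy as np
    from tensorflow import keras

    # Placeholder data and model, only to illustrate the batch_size knob.
    x_train = np.random.rand(1000, 32).astype("float32")
    y_train = np.random.randint(0, 10, size=(1000,))

    model = keras.Sequential([
        keras.layers.Dense(64, activation="relu", input_shape=(32,)),
        keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    # Smaller batches mean smaller per-step tensor allocations, which is
    # what makes the warning go away for me.
    model.fit(x_train, y_train, batch_size=16, epochs=5)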

I'm new to Linux and machine learning, so I'm not sure whether the issue can be solved that simply. Please let me know; thank you in advance.

Some discussion of this "allocation exceeds 10% of system memory" warning can be found below for more context:

https://github.com/tensorflow/tensorflow/issues/18736

Tensorflow Allocation Memory: Allocation of 38535168 exceeds 10% of system memory

  • The allocation warning means that your system doesn't have sufficient memory to allocate all the data you are using. It can be caused by many factors, such as the batch size or how you are processing your data (see the data-loading sketch below). – Alex Colombari Aug 26 '19 at 17:30
  • In processing-heavy applications, it's likely that you would need to add working memory, not storage (RAM, not disk space), but I don't know enough about this particular error to say so with 100% confidence. – G. Anderson Aug 26 '19 at 17:33
  • You may want to show the relevant portion of `allocator.cc`. What does Keras mean when it says "system memory"? Is it virtual memory, or RAM? If it is virtual memory, then increasing the swap file should work (and an SSD or NVMe drive may improve speed). If it is RAM, you have to throw RAM at the problem. In general, you get the best performance increase with RAM. – jww Aug 26 '19 at 17:36
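
Update: following jww's comment, here is a minimal sketch to check whether it is RAM or swap that is short (this assumes the third-party psutil package is installed; it is not part of Keras or TensorFlow):

    import psutil

    # Report physical RAM and swap separately, to see which one is short.
    vm = psutil.virtual_memory()
    sw = psutil.swap_memory()
    print(f"RAM:  total={vm.total / 1e9:.2f} GB, available={vm.available / 1e9:.2f} GB")
    print(f"Swap: total={sw.total / 1e9:.2f} GB, free={sw.free / 1e9:.2f} GB")

    # The allocation in my warning is 19267584 bytes, i.e. about 18 MiB.
    print(f"Warned allocation: {19267584 / 1024**2:.1f} MiB")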

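And on Alex Colombari's point about how the data is processed: instead of loading the whole dataset into one big array, it can be read from disk one batch at a time. Below is a minimal sketch using keras.utils.Sequence (the .npy file paths are made-up placeholders):

    import numpy as np
    from tensorflow import keras

    class NpyFileSequence(keras.utils.Sequence):
        """Serves one batch of .npy files at a time, so the full dataset
        never has to sit in memory at once."""

        def __init__(self, file_paths, labels, batch_size=16):
            self.file_paths = file_paths
            self.labels = labels
            self.batch_size = batch_size

        def __len__(self):
            # Number of batches per epoch.
            return int(np.ceil(len(self.file_paths) / self.batch_size))

        def __getitem__(self, idx):
            lo = idx * self.batch_size
            hi = lo + self.batch_size
            x = np.stack([np.load(p) for p in self.file_paths[lo:hi]])
            y = np.asarray(self.labels[lo:hi])
            return x, y

    # Placeholder file list; in reality these would be real files on disk.
    paths = ["data/sample_%d.npy" % i for i in range(1000)]
    labels = np.random.randint(0, 10, size=(1000,))
    seq = NpyFileSequence(paths, labels, batch_size=16)
    # model.fit(seq, epochs=5)   # or model.fit_generator(seq) on older Keras
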
0 Answers