
I have the following code:

#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <iostream>

typedef uint64_t     Huge_num;  // 8 bytes
typedef Huge_num *   Huge_Arr; // Array of 8 byte elements

using namespace std;

// Function: Return a huge array
static Huge_Arr* requestMem(int n)
{
   cout << "requesting :" 
        << (sizeof(Huge_num)*n)/(1024*1024*1024) 
        << "GB" << endl;

   Huge_Arr *buff;
   try {
      buff = new Huge_Arr[n];
   }
   catch (const bad_alloc&){
      cout << "Not enough mem!" << endl; 
      exit(-1);
   }
   return buff;
}

// Main
int main(){
    for (int i=1; i< 10; i++){
         Huge_Arr *mem = requestMem(i*90000000);
         delete[] mem;
    }
}

For some reason malloc is unable to grab more than 2GB of memory before throwing a bad_alloc error.

I understand that on 32-bit systems, the maximum address space is indeed on the same order as 2GB.

But I am using Ubuntu 12.04 x86_64, which should be able to handle larger memory requests, right?

EDIT: Turns out I was compiling with g++ -std=c++0x. Is that 32-bit? Not sure. Either way, I changed the Makefile to remove the -std flag and it compiled fine

tetris11
    Is your program *compiled* as a 64-bit program? It's possible you're compiling it as a 32-bit program, and thus are still limited to ~2GB of addressable space, despite it running on a 64-bit system. – Cornstalks Jan 30 '13 at 19:02
    You aren't using `malloc`, you are using `new`. Your code fails to compile for problems completely unrelated to your question. For example, you use `n` without defining it. Can you actually write a small self contained program that actually compiles and actually demonstrates your problem? – Yakk - Adam Nevraumont Jan 30 '13 at 19:02
  • C++ is using the same underlying system that C uses in this question. http://stackoverflow.com/questions/3463207 – Drew Dormann Jan 30 '13 at 19:03
    Oh, and check `sizeof(std::size_t)` from `` -- that is an easy way to see if you are compiling for a 64 or 32 bit system (assuming `char` is 8 bits, and the compiler isn't being silly). – Yakk - Adam Nevraumont Jan 30 '13 at 19:06
    Perhaps it cannot find a contiguous chunk of memory large enough. – user7116 Jan 30 '13 at 19:08
  • where is "n" declared? What type is it? – Mats Petersson Jan 30 '13 at 19:12
  • So, on my system, I can allocate up to 8 in the loop [after I multiplied the large number starting with 9 by ten, as it wasn't getting to megabytes otherwise]. That makes about 6GB. I have quite a few other things running on the system, so that may well be the limit for one contiguous memory allocation in my system. – Mats Petersson Jan 30 '13 at 19:16
  • @DrewDormann: C++ may (but may not) use the same underlying memory allocation as C see: http://stackoverflow.com/a/240308/14065 – Martin York Jan 30 '13 at 19:18
  • This code does not compile! – Martin York Jan 30 '13 at 19:19
  • It should be able to find a chunk bigger than 2GB of virtual memory in a 64-bit environment. Here is a possibly related question: http://stackoverflow.com/questions/1886802/heap-size-limitation-in-c – sbabbi Jan 30 '13 at 19:36
  • sorry guys, I wrote this example very fast @MatsPetersson -- sorry that's meant to be the function args -- edited – tetris11 Jan 30 '13 at 21:02
  • @Cornstalks -- how do I check? Is it in the Makefile? – tetris11 Jan 30 '13 at 21:06
  • @LokiAstari -- okay try it now, sorry about that – tetris11 Jan 30 '13 at 21:33
  • @Yakk -- okay I tried sizeof(std::size_t), and am getting 8 bytes. What does this mean? Is it meant to be 16 on a 64-bit? – tetris11 Jan 30 '13 at 22:00
    @tetris11: On a UNIX-y system, you can do `file ` from the command line to see if it's 32 or 64 bit. `std::size_t` should be at least 8 in a 64-bit program, but it's also technically possible (as far as I know) for it to be 8 in a 32-bit program. Anyway, your program works fine for me on my system (I don't run out of memory), but I'm also compiling in 64-bits and have a ton of free RAM (and OS X may be overcommitting memory; I'm not sure). – Cornstalks Jan 30 '13 at 22:42
  • @LokiAstari I remember that now, thanks. – Drew Dormann Jan 31 '13 at 03:35

2 Answers


`malloc` (and `new`) allocates contiguous memory. The fact that you get a `bad_alloc` when requesting 2 GB of memory doesn't necessarily mean that you have less than 2 GB of memory available; it can also mean that the memory available is fragmented. Try allocating smaller chunks if you want to test how much memory is available to be allocated on your system.

David G
Étienne
  • Thing is, I've tried the same code on a 32GB 64-bit server and am getting the same results. Surely such a system would have large regions of unfragmented memory – tetris11 Jan 30 '13 at 21:04

If you increase your swap space size, more space becomes available for allocation: when RAM does not have enough memory, the operating system grows the heap by swapping to disk space. So you might change your swap file size; this link explains it best:

All about Linux swap space

Another good way is to turn overcommitting on. If overcommitting is off, you cannot allocate all of the memory on your system; with overcommitting on, you can allocate your full memory space for the array.

To turn overcommitting on:

set /proc/sys/vm/overcommit_memory to 1 (0 is the heuristic default, 2 disables overcommit)
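The settings above can be inspected and changed like this (my sketch of the standard Linux commands, not from the answer; the write requires root and does not persist across reboots):

```shell
# Current overcommit policy: 0 = heuristic (default), 1 = always, 2 = never
cat /proc/sys/vm/overcommit_memory

# Enable unconditional overcommit (run as root)
echo 1 > /proc/sys/vm/overcommit_memory

# Check how much swap is currently configured
swapon --show
free -h
```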

Reza Ebrahimi