4

I have a class which requires a large amount of memory.

class BigClass {
public:
    BigClass() {
        bf1[96000000-1] = 1;
    }
    double bf1[96000000];
};

I can only instantiate the class by creating the object with "new" in heap memory.

BigClass *c = new BigClass();
assert( c->bf1[96000000-1] == 1 );
delete c;

If I instantiate it without "new", I get a segmentation fault at runtime.

BigClass c; // SIGSEGV!

How can I determine the memory limit? Or should I just always use "new"?

rnd_nr_gen
  • just a guess that the limit may actually come from the heap that is provided by the OS when the program is running – clamp Nov 05 '10 at 14:13
  • @clamp You should not confuse the heap memory (used by the `new` operator) with the stack memory used for automatic variables as in `BigClass c;` – Luca Martini Nov 05 '10 at 14:20

6 Answers

3

First of all, since you've entitled this C++ and not C, why are you using raw arrays? Instead, may I suggest vector<double> or, if contiguous memory is causing problems, deque<double>, which relaxes the constraint on contiguous memory without giving up the nearly constant-time lookup.

Using vector or deque may also alleviate other segfault issues which could plague your project at a later date, such as overrunning the bounds of your array. If you convert to vector or deque, you can use the .at(x) member function to retrieve and set values in your collection; should you attempt to access out of bounds, that function will throw a std::out_of_range exception.
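For instance, a minimal sketch of that suggestion, reusing the 96,000,000-element size from the question:

#include <iostream>
#include <stdexcept>
#include <vector>

int main() {
    // The vector object itself is tiny; its 96,000,000 elements
    // live on the heap, so a stack-allocated vector is safe.
    std::vector<double> bf1(96000000);

    bf1.at(96000000 - 1) = 1;      // checked access, in bounds

    try {
        bf1.at(96000000) = 1;      // one past the end
    } catch (const std::out_of_range& e) {
        std::cerr << "caught: " << e.what() << '\n';
    }
}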

wheaties
  • 1
    In addition, if using `vector`, you may want to call `vector::reserve` if you'll be using that much memory. – user470379 Nov 05 '10 at 14:19
  • @user470379 good point but only if you need to do `push_back`. I think the sizes here will be static. – wheaties Nov 05 '10 at 14:26
  • 1
    @wheaties, however your second paragraph seems off-base. The most likely explanation for the segmentation fault is that a single `BigClass` instance is larger than the remaining stack space. – Jon-Eric Nov 05 '10 at 14:45
  • @wheaties Explain why the static size makes a difference. AFAIK `reserve` is the only way to reserve space for elements without actually initializing them. – user470379 Nov 05 '10 at 14:58
  • @Jon-Eric you're right. It most likely is that the item is too large to place on the stack. I'll remove that. – wheaties Nov 05 '10 at 15:01
  • @user470379 you use `reserve` generally when you want to have the option of appending more items to your list while avoiding the copy overhead of transcribing an old vector contiguous array to the newer, larger version. Here, it appears that the OP wants a statically sized rather than a dynamically sized array. Vectors allow you within the constructor call to set a specific size at construction. – wheaties Nov 05 '10 at 15:06
  • The vector constructor only allows you to specify the size if you also provide a value with which it initializes each and every member. If you'll be filling the vector with your data, initializing every member is wasteful. – user470379 Nov 05 '10 at 15:17
2

The stack has a fixed size that depends on the compiler options. See your compiler documentation to change the stack size for your executable.

Anyway, for big objects, prefer using new or, better, smart pointers like shared_ptr (from Boost, from std::tr1, or from std:: if you have a very recent compiler).
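For example, a sketch of that advice using the BigClass from the question (std::shared_ptr as shown is C++11; on older compilers substitute boost::shared_ptr or std::tr1::shared_ptr):

#include <memory>

class BigClass {
public:
    BigClass() { bf1[96000000 - 1] = 1; }
    double bf1[96000000];
};

int main() {
    // The object lives on the heap; the smart pointer deletes it
    // automatically when it goes out of scope, so no explicit delete.
    std::shared_ptr<BigClass> c(new BigClass());
    return c->bf1[96000000 - 1] == 1 ? 0 : 1;
}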

Klaim
1

There is no platform-independent way of determining the memory limit. For "large" amounts of memory, you're far safer allocating on the heap (i.e. using new); you can check for success by catching std::bad_alloc exceptions, or by using the nothrow form and comparing the resulting pointer against NULL.
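For example, a sketch of both failure checks (plain new reports failure by throwing; the nothrow form returns NULL instead):

#include <cstdio>
#include <new>

int main() {
    // Throwing form: failure surfaces as std::bad_alloc.
    try {
        double* p = new double[96000000];
        delete[] p;
    } catch (const std::bad_alloc&) {
        std::puts("allocation failed (bad_alloc)");
    }

    // Non-throwing form: failure surfaces as a null pointer.
    double* q = new (std::nothrow) double[96000000];
    if (q == NULL) {
        std::puts("allocation failed (NULL)");
    }
    delete[] q;  // deleting a null pointer is a no-op, so this is safe
}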

Oliver Charlesworth
  • Careful, depending on the allocator used, `operator new` can also throw a `bad_alloc` exception instead of returning `NULL` on failure. See the C++ spec, ISO/IEC 14882:2003(E) section 5.3.4.13. – Wyatt Anderson Nov 05 '10 at 14:20
1

You shouldn't play that game ever. Your code could be called from another function or on a thread with a lower stack size limit and then your code will break nastily. See this closely related question.

If you're in doubt, use heap allocation (new), either directly with smart pointers (like auto_ptr) or indirectly using std::vector.

sharptooth
1

The way your class is designed is, as you discovered, quite fragile. Rather than requiring your objects to always be allocated on the heap, your class itself should allocate the huge memory block on the heap, preferably with std::vector, or possibly with a shared_ptr if vector doesn't work for some reason. Then you don't have to worry about how your clients use the object; it's safe to put on either the stack or the heap.
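A minimal sketch of that design, keeping the element count from the question:

#include <cassert>
#include <vector>

class BigClass {
public:
    // The constructor puts the big buffer on the heap; the BigClass
    // object itself stays small, so stack allocation is now safe.
    BigClass() : bf1(96000000) { bf1[96000000 - 1] = 1; }
    std::vector<double> bf1;
};

int main() {
    BigClass c;                        // no SIGSEGV this time
    assert(c.bf1[96000000 - 1] == 1);
}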

Mark B
0

On Linux, in the Bash shell, you can check the stack size with ulimit -s. Variables with automatic storage duration will have their space allocated on the stack. As others have said, there are better ways of approaching this:

  1. Use a std::vector to hold your data inside your BigClass.
  2. Allocate the memory for bf1 inside BigClass's constructor and then free it in the destructor.
  3. If you must have a large double[] member, allocate an instance of BigClass with some kind of smart pointer; if you don't need shared access, something as simple as std::auto_ptr will let you safely construct/destroy your object:

    std::auto_ptr<BigClass> myBigClass(new BigClass);
    myBigClass->bf1; // your array
    
Wyatt Anderson