
I need to store up to 1,000,000 double values in different arrays (during a calculation run). So far I'm using an NSMutableArray, but the memory usage looks huge. One idea is to use a C array instead, to avoid storing objects in the NSMutableArray. Is there a way to roughly estimate the memory usage of an NSMutableArray vs. a C array? (I could not find any information about the size of an NSNumber object vs. a primitive like double or float.)
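
For reference, a minimal sketch of the two approaches I'm comparing (the count and names are just illustrative):

```objc
#import <Foundation/Foundation.h>
#include <stdlib.h>

// Current approach: one NSNumber object per value.
NSMutableArray *numbers = [NSMutableArray arrayWithCapacity:1000000];
[numbers addObject:@(3.14)];

// Alternative: a plain C array of doubles, no object wrappers.
double *values = malloc(1000000 * sizeof(double));
if (values != NULL) {
    values[0] = 3.14;
    // ... fill during the calculation run ...
    free(values);
}
```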

Thanks.

JFS
  • Why not run some benchmarks with Instruments? – Stavash Aug 26 '13 at 10:45
  • I find Instruments not always helpful when it comes to memory usage. I hope to find some theoretical approach. Is there a way in Instruments to see the size of a specific object like the NSMutableArray object? – JFS Aug 26 '13 at 10:51
  • I wonder if a `sizeof(myArray)` tells you the real size? – ott-- Aug 26 '13 at 11:43
  • Yes, I would guess that an NSArray of NSNumber will take on the order of 30-50 bytes per element. An array of double will take 8. Normally this is not worth worrying about, but with a million values it sort of starts to add up. (You do, of course, have to do much more "manual" heap management with the "raw" arrays.) – Hot Licks Aug 26 '13 at 11:49
  • @HotLicks, could you explain manual heap management? I do not understand. – JFS Aug 26 '13 at 13:45
  • @JFS It means that you shouldn't allocate space for 1 million doubles in one go (`malloc(1000 * 1000 * sizeof(double))`) but allocate the memory as you need it, in smaller chunks, e.g. 10 * 1000 doubles. Maybe also reuse old unneeded chunks for new data instead of releasing them (see the sketch after these comments). – Sulthan Aug 26 '13 at 14:08
  • Alright, thanks for the hint. – JFS Aug 26 '13 at 14:28
  • I also mean that you must manage your `malloc` and `free` calls without the benefit of reference counts and ARC and autorelease pools, etc. (Though there are a few tricks to get Objective-C memory management to help you.) – Hot Licks Aug 26 '13 at 14:33
  • OK, do you have some good sources to read about using C arrays in Objective-C code? It sounds like a tricky challenge. – JFS Aug 26 '13 at 14:37
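
A minimal sketch of the chunked allocation described in the comments above, assuming a simple append-only buffer (the names and chunk size are illustrative, not a definitive implementation):

```objc
#include <stdlib.h>

// Grow the storage in chunks instead of reserving all 1,000,000 slots up front.
#define CHUNK_SIZE (10 * 1000)

typedef struct {
    double *values;
    size_t  count;     // values stored so far
    size_t  capacity;  // values that fit in the current allocation
} DoubleBuffer;

static void DoubleBufferAppend(DoubleBuffer *buf, double value) {
    if (buf->count == buf->capacity) {
        size_t newCapacity = buf->capacity + CHUNK_SIZE;
        double *grown = realloc(buf->values, newCapacity * sizeof(double));
        if (grown == NULL) {
            return; // real code should report the allocation failure
        }
        buf->values = grown;
        buf->capacity = newCapacity;
    }
    buf->values[buf->count++] = value;
}

// Usage: DoubleBuffer buf = {0}; DoubleBufferAppend(&buf, 3.14); ... free(buf.values);
// Note: ARC does not manage this memory -- the free() is on you.
```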

1 Answer


It's pretty clear that the memory consumption of an NSArray will be bigger than that of a raw C array.

How big will the difference be? Well, for every value in an NSArray, the primitive double has to be wrapped in an NSNumber, which adds at least 20 B per value, probably a bit more.
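
If you want a concrete number for your own platform rather than an estimate, you can measure a boxed value directly. A sketch (note that `malloc_size` reports the heap block size, and on 64-bit runtimes tagged-pointer NSNumbers report 0 because they are not heap-allocated at all):

```objc
#import <Foundation/Foundation.h>
#import <malloc/malloc.h>

NSNumber *boxed = [NSNumber numberWithDouble:3.14];

// Heap block reserved for this one object (0 if it is a tagged pointer).
size_t objectBytes = malloc_size((__bridge const void *)boxed);

// An NSArray also pays for the pointer slot per element, whereas a C array
// pays only sizeof(double) per value.
NSLog(@"NSNumber heap block: %zu bytes, pointer slot: %zu bytes, raw double: %zu bytes",
      objectBytes, sizeof(id), sizeof(double));
```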

One estimate can be found here: Memory size of classes in Objective-C

Anyway, storing 1,000,000 values in memory is always a bit strange. Maybe it would be better to store them in a file and then load them when needed (e.g. using a memory-mapped file).
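
A minimal sketch of the memory-mapped variant, assuming POSIX `mmap` and an illustrative file path:

```objc
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>

const size_t kCount = 1000000;
const size_t kBytes = kCount * sizeof(double);

// Back the values with a file instead of keeping NSNumber objects in memory.
int fd = open("/tmp/values.bin", O_RDWR | O_CREAT, 0644);
ftruncate(fd, kBytes); // reserve room for all values up front

double *values = mmap(NULL, kBytes, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
if (values != MAP_FAILED) {
    values[42] = 3.14;              // read/write like a plain C array
    msync(values, kBytes, MS_SYNC); // flush dirty pages to disk if needed
    munmap(values, kBytes);
}
close(fd);
```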

Sulthan
  • The values get stored during a calculation run; it is a math app. Is there no definition of the memory usage of objects in Objective-C? – JFS Aug 26 '13 at 10:54
  • Storing 1000000 values in memory is perfectly fine. And files are not a solution when one needs quick access. –  Aug 26 '13 at 11:49
  • @H2CO3 That depends on the use case. If random access is not required, files can be pretty fast, and memory-mapped files can also be fast enough. – Sulthan Aug 26 '13 at 11:56
  • @H2CO3 Accessing a memory-mapped file containing doubles would be much faster than accessing an NSArray containing NSNumbers. And random access is fast, too - provided the system does not need to start "thrashing". In fact, you have some sort of page cache with an LRU cache policy. – CouchDeveloper Aug 26 '13 at 13:59
  • @CouchDeveloper Who talks about an `NSArray`? (Well, the author of this answer... not me.) What I meant is of course `double arr[N]`. –  Aug 26 '13 at 15:31