-2

I need to allocate memory for on the order of 10^15 integers, which can be of long long type. If I use an array and declare something like

long long  a[1000000000000000];

that's never going to work. So how can I allocate such a huge amount of memory?

Mat
sandyroddick
    Do you have 7.5 million GB of memory? – huon Sep 02 '12 at 08:19
  • That much memory would be VERY expensive. If you have 8000 1TB hard drives, the best you can do is store this in files. In which case you should be finding out about read/writing to files. – ronalchn Sep 02 '12 at 08:24

4 Answers

7

Really large arrays generally aren't a job for memory, more one for disk. 10^15 array elements at 64 bits apiece is (I think) 8 petabytes. You can pick up 8G memory slices for about $15 at the moment so, even if your machine could handle that much memory or address space, you'd be outlaying about $15 million.

In addition, with upcoming DDR4 being clocked up to about 4GT/s (giga-transfers), even if each transfer was a 64-bit value, it would still take about 250,000 seconds just to initialise that array to zero. Do you really want to be waiting around for the better part of three days before your code even starts doing anything useful?

And, even if you go the disk route, that's quite a bit. At (roughly) $50 per TB, you're still looking at $400,000 and you'll possibly have to provide your own software for managing those 8,000 disks somehow. And I'm not even going to contemplate figuring out how long it would take to initialise the array on disk.

You may want to think about rephrasing your question to indicate the actual problem rather than what you currently have, a proposed solution. It may be that you don't need that much storage at all.

For example, if you're talking about an array where many of the values are left at zero, a sparse array is one way to go.
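
As a minimal sketch of that idea (the container choice and indices are just illustrations, not anything from the question), a sparse array can be built on a hash map keyed by index, so memory grows with the number of entries actually written rather than with the full 10^15 logical size:

#include <cstdint>
#include <iostream>
#include <unordered_map>

int main() {
    // Sparse "array": only the indices that are actually written consume memory.
    std::unordered_map<std::uint64_t, long long> a;

    a[999999999999999ULL] = 42;   // index near 10^15, but only one entry is stored
    a[7ULL] = -13;

    // Reading an index that was never written: treat it as 0.
    std::uint64_t i = 123456789ULL;
    long long value = a.count(i) ? a[i] : 0LL;

    std::cout << "a[" << i << "] = " << value
              << ", entries stored: " << a.size() << '\n';
    return 0;
}

Lookups that miss simply fall back to an implicit zero, which is what makes this work when the vast majority of elements are never touched.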

paxdiablo
2

You can't. You don't have that much memory, and you won't have it for a while. Simple.

EDIT: If you really want to work with data that does not fit into your RAM, you can use a library that works with mass-storage data, like stxxl, but it will be a lot slower, and you will always be limited by disk size.
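
A rough sketch of what that might look like with STXXL's external-memory vector; this assumes the stxxl::VECTOR_GENERATOR interface and a disk configured for the library, so check the STXXL documentation before relying on the details:

// Sketch only, assuming STXXL's external-memory vector
// (stxxl::VECTOR_GENERATOR<T>::result); block size and disk setup
// (the .stxxl config file) are described in the STXXL docs.
#include <stxxl/vector>

int main() {
    typedef stxxl::VECTOR_GENERATOR<long long>::result ext_vector;

    ext_vector v;
    v.resize(100000000ULL);   // elements live on disk, cached in RAM blocks

    v[42] = 123456789LL;      // element access pages the containing block in and out

    return 0;
}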

Jepessen
2

MPI is what you need; that's actually a small size for parallel computing problems. The Blue Gene/Q monster at Lawrence Livermore National Labs holds around 1.5 PB of RAM. You need to use block decomposition to divide up your problem, and voilà!

The basic approach is dividing the array into equal blocks or chunks among many processors, as sketched below.
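
A minimal sketch of that block decomposition using the standard MPI C API (the global size here is a made-up example, not the 10^15 from the question): each rank allocates and owns only its contiguous slice of the global array.

// Block decomposition sketch: each MPI rank owns one contiguous chunk
// of the (conceptually huge) global array. Sizes are illustrative only.
#include <mpi.h>
#include <cstdio>
#include <vector>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank, nprocs;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    const long long global_n = 1000000000LL;           // total elements (example size)
    long long local_n = global_n / nprocs;              // this rank's share
    long long offset  = (long long)rank * local_n;      // global index of first local element
    if (rank == nprocs - 1)
        local_n += global_n % nprocs;                    // last rank absorbs the remainder

    std::vector<long long> chunk(local_n);               // only the local block is allocated
    for (long long i = 0; i < local_n; ++i)
        chunk[i] = offset + i;                           // e.g. store the global index

    std::printf("rank %d owns elements [%lld, %lld)\n", rank, offset, offset + local_n);

    MPI_Finalize();
    return 0;
}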

pyCthon
1

You need to upgrade to a 64-bit system, get a 64-bit-capable compiler, and then put an LL suffix at the end of 100000000000000000.

Have you heard of sparse matrix implementations? With a sparse matrix, you use only a very small part of the matrix, even though the matrix itself is huge.

Here are some libraries for you.

Here is some basic info about sparse matrices. You don't actually store all of the matrix, just the few points that are needed; a sketch of that idea follows.
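
As a rough illustration of the idea (a hand-rolled compressed-sparse-row layout, not taken from any particular library mentioned above), a sparse matrix keeps only the non-zero values plus their positions, so the logical size can be enormous while the storage stays proportional to the non-zeros:

#include <cstddef>
#include <iostream>
#include <vector>

// Compressed Sparse Row (CSR): store only non-zero values, their column
// indices, and one offset per row.
struct CsrMatrix {
    std::vector<long long> values;      // non-zero values, row by row
    std::vector<long long> col_index;   // column of each stored value
    std::vector<std::size_t> row_ptr;   // values[row_ptr[r] .. row_ptr[r+1]) belong to row r

    long long at(std::size_t r, long long c) const {
        for (std::size_t k = row_ptr[r]; k < row_ptr[r + 1]; ++k)
            if (col_index[k] == c) return values[k];
        return 0;                       // anything not stored is implicitly zero
    }
};

int main() {
    // 3x4 logical matrix with only three non-zeros:
    // [ 5 0 0 0 ]
    // [ 0 0 8 0 ]
    // [ 0 3 0 0 ]
    CsrMatrix m;
    m.values    = {5, 8, 3};
    m.col_index = {0, 2, 1};
    m.row_ptr   = {0, 1, 2, 3};

    std::cout << m.at(1, 2) << " " << m.at(2, 3) << '\n';  // prints "8 0"
    return 0;
}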

huseyin tugrul buyukisik