
When trying to create an array to use for shared memory in multiple processes, I am getting an assertion error:

    shared_array = RawArray('d', xsize)
  File "C:\Python27\lib\multiprocessing\sharedctypes.py", line 88, in RawArray
    obj = _new_value(type_)
  File "C:\Python27\lib\multiprocessing\sharedctypes.py", line 68, in _new_value
    wrapper = heap.BufferWrapper(size)
  File "C:\Python27\lib\multiprocessing\heap.py", line 242, in __init__
    assert 0 <= size < sys.maxint
AssertionError

It seems the requested size is exceeding some maxint limit; however, I get the error even when I run a basic example like the one below:

    from multiprocessing.sharedctypes import RawArray
    import sys

    xsize = 999999999
    # create an empty array
    print('MaxInt:', sys.maxint)
    print('My Size:', xsize)
    shared_array = RawArray('d', xsize)

The print statements show:

('MaxInt:', 2147483647)
('My Size:', 999999999)

Why is this happening, and how can I make a shared array for multiprocessing when having very large arrays? My computer has 128GB of RAM, so that shouldn't be the issue.

user-2147482637

1 Answer


The assertion is checking the total *byte* size of the requested block, not the element count. The 'd' typecode maps to a C double, which is typically 8 bytes, so the maximum number of elements you can ask for is:

    from sys import maxint
    from ctypes import sizeof, c_double

    maxint // sizeof(c_double)

which for you (with sys.maxint of 2**31-1) is ~268M elements. If you are running a 32-bit build then your address space will be a much greater limiting factor, and you almost certainly won't be able to allocate an array that big anyway.
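To see the numbers concretely, here is a quick sketch (the 2**31-1 value below is hard-coded from the question's sys.maxint output so it also runs on Python 3):

    from ctypes import sizeof, c_double

    xsize = 999999999                  # element count from the question
    nbytes = xsize * sizeof(c_double)  # 'd' -> C double, 8 bytes each

    maxint_32 = 2**31 - 1              # sys.maxint on the asker's build
    print(nbytes)                      # 7999999992, roughly 7.5 GiB
    print(nbytes < maxint_32)          # False -> the heap.py assertion fails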

I'm using 64-bit builds of Python 2.7 and 3.7, where sys.maxint (Python 2 only) and sys.maxsize are both 2**63-1, and both allow me to allocate an array with a billion (10**9) elements (and I can see Python having a ~4GB RSS), albeit taking a while to zero it all out.
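If you're unsure which build you have, checking the size of a C pointer is a quick way to tell (4 bytes on a 32-bit interpreter, 8 on a 64-bit one; this works on both Python 2 and 3):

    import struct
    import sys

    bits = struct.calcsize('P') * 8  # size of a C pointer, in bits
    print(bits)                      # 32 or 64
    print(sys.maxsize)               # 2**31 - 1 or 2**63 - 1 respectively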

Sam Mason
  • I can make another question, but how can this be overcome? It starts as a numpy array, so it's clearly possible to have a large array. – user-2147482637 May 09 '19 at 15:49
  • get a 64 bit build of Python! also [Python2.7 is very outdated](https://pythonclock.org/), you should think about moving to more recent versions! – Sam Mason May 09 '19 at 15:54
  • I just had a near-identical comment on another post... it's not like I control the software and packages that everyone uses. – user-2147482637 May 09 '19 at 16:03
  • what does that mean? define "everyone"! I'd encourage you to, at very least, get control over the versions of Python and packages installed on your machine (or whatever machines you have access to). you can install things to your home/user directory if you don't have system wide permissions. – Sam Mason May 09 '19 at 16:41
  • "everyone" means I can't go around to companies and homes changing billions of computers to 64-bit Python. Changing my Python version doesn't change the other person's; it just means the code won't work for them. – user-2147482637 May 09 '19 at 17:23
  • this is going way off topic, but I'd suggest getting your code running on your machine before worrying too much about everybody else. the last time I used MS Windows it didn't come with Python, so you'd have to package an interpreter with your code, assuming you intend to stay in that myopic world, so you get to dictate that anyway – Sam Mason May 09 '19 at 18:34