Pre-allocating a list ensures that the allocated index values will work, and I assume that's what you mean by preallocating a dict. In that case:
d = dict.fromkeys(range(1000))
or use any other sequence of keys you have handy. If you want to preallocate to a value other than None, you can do that too:
d = dict.fromkeys(range(1000), 0)
Edit: as you've edited your question to clarify that you meant to preallocate the memory, the answer to that question is no: you cannot preallocate the memory for a dictionary, nor would it be useful to do so. Most of the memory used won't be the dictionary itself; it will be the objects used as keys and values. The dictionary allocates its own memory so that insertion is amortized constant time: it starts off small and then resizes in progressively larger chunks, so the total cost of building it stays proportional to the number of insertions.
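You can watch that chunked growth yourself with sys.getsizeof, which reports the size of the dict's backing storage (not its contents). This is just an illustrative sketch; the exact sizes and resize points are a CPython implementation detail and vary by version:

```python
import sys

# Insert 1000 keys and count how often the dict's table is reallocated.
d = {}
last_size = sys.getsizeof(d)
resizes = 0
for i in range(1000):
    d[i] = None
    size = sys.getsizeof(d)
    if size != last_size:  # the table grew to a bigger chunk
        resizes += 1
        last_size = size

print(f"1000 insertions triggered only {resizes} resizes")
```

The resize count grows roughly logarithmically with the number of insertions, which is why you can't save meaningful time by presizing.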
Storing 30 million keys in a dictionary will require on the order of 1.5GB for the dict itself (see the measurement below), but the individual key and value objects will require a lot more, so unless you have a lot of RAM in your system I would expect the contents of the dictionary to give you a problem before the dictionary itself does.
If you fire up the interactive prompt you'll find that it only takes a few seconds to run this:
>>> d = dict.fromkeys(range(30000000))
>>> import sys
>>> sys.getsizeof(d)
1610613016
So that's 1,610,613,016 bytes (roughly 1.5GB) just for a dictionary whose keys are integers and whose values are all None. Store unique values as well and you'll roughly double the size if they're also just integers, but if they're strings or complex objects your memory consumption will be much higher.
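The difference comes down to how many distinct value objects exist: None is a single shared object, while unique values cost memory per entry. A small illustrative comparison (the names and sizes here are just for demonstration):

```python
import sys

n = 10_000
d_none = dict.fromkeys(range(n))             # every slot points at the same None
d_str = {i: f"value-{i}" for i in range(n)}  # a distinct string per entry

# Per-entry cost of the unique string values...
str_value_bytes = sum(sys.getsizeof(v) for v in d_str.values())
# ...versus the one-time cost of the shared None object.
none_value_bytes = sys.getsizeof(None)

print(f"unique string values: ~{str_value_bytes:,} extra bytes")
print(f"shared None value: {none_value_bytes} bytes, paid once")
```

Scale that up to 30 million entries and the value objects, not the dict, dominate your memory budget.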