30

I was wondering if Python had a limit on the length of a dictionary key.

For clarification, I'm not talking about the number of keys, but the length of each individual key. I'm going to be building my dictionaries based on dynamic values (after validation), but I'm not sure if I should be taking length into account in this case.

pferate
  • 2,067
  • 1
  • 16
  • 18
  • 4
    As long as the keys are hashable you shouldn't have any trouble, IMHO. – Paulo Bu Mar 17 '14 at 20:41
  • When you say "length" do you mean length of a string? Because you can have non-string keys, so it may be better to ask your question in terms of memory size – wnnmaw Mar 17 '14 at 20:42
  • In this instance I mean string length. I know that you can have non-string keys, but I was thinking that the limits may be comparable. – pferate Mar 17 '14 at 20:48

3 Answers

24

There is no such limit on dictionary keys. Since Python also has arbitrary-precision numeric types, the only limit you will encounter, string or otherwise, is that of available memory. You can see another post here for a discussion of maximum string length in Python 2.
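
For instance, a quick illustrative sketch (the exact sizes are arbitrary) using both a very long string key and a very large integer key:

import sys

long_key = "x" * 10**6        # a one-million-character string key
big_int_key = 10**1000        # an integer with over a thousand digits

d = {long_key: "string key", big_int_key: "huge int key"}
print(sys.getsizeof(long_key))      # the key's memory footprint is the only real cost
print(d[long_key], d[big_int_key])  # both look up normally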

Brian
  • 3,091
  • 16
  • 29
22

Here's a bit of sample code:

from string import ascii_letters
from random import choice

def make_str(length):
    # Build a random string of exactly `length` characters
    return "".join(choice(ascii_letters) for i in range(length))

# Five entries, each keyed by a 10-million-character random string
test_dict = {make_str(10000000): i for i in range(5)}

Conclusion: Python will quite happily use a 10-million-character string as a dict key.
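
As a quick sanity check, the resulting keys can be retrieved and used like any others:

some_key = next(iter(test_dict))   # grab one of the 10-million-character keys
print(len(some_key))               # 10000000
print(test_dict[some_key])         # lookup behaves like any ordinary key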

Hugh Bothwell
  • 55,315
  • 8
  • 84
  • 99
1

As far as I know there's no limit, but keep in mind that the longer the key, the more time it takes to create and access it.
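
A rough, illustrative timing sketch (exact numbers will vary by machine and Python version) of how creating and hashing a key scales with its length:

import timeit

# Each timed statement builds a fresh string and hashes it, so the
# measurement includes both creation and hashing cost; both grow
# with the key length.
for n in (30, 3000, 3000000):
    t = timeit.timeit("hash(c * n)", setup="c = 'a'; n = %d" % n, number=100)
    print("%9d-char key: %.4fs for 100 create-and-hash runs" % (n, t))

Note that CPython caches a string's hash after it is first computed, so repeated lookups with the same key object don't pay the hashing cost again; the slowdown shows up mainly when equal-but-distinct key strings have to be hashed and compared.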

AlexF
  • 487
  • 1
  • 5
  • 16
  • If you already have the object in memory, storing its reference in the dictionary won't add much overhead. And hashing the key may not depend on the size of it. – jonrsharpe Mar 17 '14 at 20:52
  • I've primarily used string keys in my dictionaries, and I've seen that beyond 30-40 characters the performance decreases slightly – AlexF Mar 17 '14 at 20:54
  • @AlexF Off the top of my head, it could be that the equality checks, which are needed to protect against hash collisions, are a little more costly with longer strings. – l4mpi Mar 17 '14 at 21:00
  • @l4mpi Yes, that's more or less my thinking too, but it doesn't explain why even access is slower. The result is that, when possible, I try to keep keys to a max of 30 characters for large dictionaries – AlexF Mar 17 '14 at 21:05
  • @AlexF I said *may* not; string hashing does involve the length of the string, see http://effbot.org/zone/python-hash.htm – jonrsharpe Mar 17 '14 at 21:10