I know that it is determined by the memory available in the system, and also depends on having a good hash function, but in general I'd like to know: what is the biggest map you have used, and did it work well out of the box or need any adjustment to make it work adequately?
1 Answer
A `HashMap` in Java can have a maximum of 2^30 buckets for storing entries. This is because the bucket-assignment technique used by `java.util.HashMap` requires the number of buckets to be a power of 2, and since ints are signed in Java, the maximum positive value is 2^31 - 1, so the maximum power of 2 is 2^30.
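To make the power-of-2 point concrete, here is a minimal sketch (not `HashMap`'s actual source, though the `1 << 30` constant and the masking trick do match what OpenJDK's implementation documents): with a power-of-two capacity, the bucket index can be computed with a cheap bitwise AND instead of a modulo.

```java
public class BucketIndexDemo {
    // java.util.HashMap declares MAXIMUM_CAPACITY = 1 << 30 for exactly
    // this reason: 1 << 31 would overflow a signed int to a negative value.
    static final int MAXIMUM_CAPACITY = 1 << 30;

    static int indexFor(int hash, int capacity) {
        // Works only because capacity is a power of 2, so (capacity - 1)
        // is an all-ones bit mask covering the low bits.
        return hash & (capacity - 1);
    }

    public static void main(String[] args) {
        System.out.println(indexFor("example".hashCode(), 16)); // always in [0, 15]
        System.out.println(indexFor("example".hashCode(), MAXIMUM_CAPACITY));
        System.out.println(MAXIMUM_CAPACITY);                   // 1073741824
    }
}
```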
However, there is in fact no programmatic limit on how many key/value pairs you can store in a `HashMap`; the `size()` function will just stop being accurate once you pass 2^31 - 1. This is because of the way collisions are handled: key/value pairs that land in the same bucket are chained together, like nodes in a `LinkedList`.
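The inaccuracy is plain int overflow, since `size()` returns an `int`. A small sketch of what happens to the counter (simulated with arithmetic, because actually inserting 2^31 entries would need an enormous heap):

```java
public class SizeOverflowDemo {
    public static void main(String[] args) {
        int size = Integer.MAX_VALUE; // 2_147_483_647, i.e. 2^31 - 1
        size++;                       // one more entry...
        System.out.println(size);     // -2147483648: the counter has wrapped
    }
}
```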
In general, though, if you're getting anywhere close to 2^30 things to keep track of in a real-world application, you need far more RAM than you can rely on in a single machine. The largest `HashMap` I've ever worked with that sat in a single JVM had a few tens of millions of entries, all very lightweight.
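For a sense of scale, here is a rough back-of-envelope estimate; the per-entry and per-object sizes below are assumptions for a typical 64-bit JVM with compressed oops, not measured values:

```java
public class MemoryEstimate {
    public static void main(String[] args) {
        long entries = 1L << 30;          // ~1.07 billion entries
        // Assumed sizes: ~32 bytes per HashMap.Node, ~16 bytes each for a
        // small key and value object. Real numbers vary by JVM and payload.
        long perEntry = 32 + 16 + 16;
        long tableRefs = (1L << 30) * 4;  // bucket array of 4-byte references
        long totalBytes = entries * perEntry + tableRefs;
        System.out.printf("~%d GiB%n", totalBytes / (1L << 30)); // ~68 GiB
    }
}
```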

I read this as "there's no limit, but it will stop working correctly once you pass 2^31 - 1." To me that means that there _is_ indeed a limit. You can't just disregard part of the contract. You can't say that an array of length 10 can hold 20 elements but starts misbehaving after the 10th element. I think [this answer](https://stackoverflow.com/a/4123811/276052) is more accurate. – aioobe Jun 14 '19 at 13:45