What is the maximum size of the LinkedList in a HashSet, and what happens when that max size is reached, if anything? Suppose all n input elements have hash codes that map to the same slot of the hash map's node array, i.e. due to a specific input, bucket 0 keeps growing while the rest of the buckets stay empty. Is rehashing done in that case, or is there a specific way to avoid this problem?
- In Java <= 7, **unlimited**. In Java 8+, the linked list will be converted to a balanced tree, though even there it may in effect be an unlimited linked list. – Andreas Oct 12 '18 at 21:37
- @Andreas, so how does it get an element in O(1) with an unlimited linked list? – amipro Oct 12 '18 at 21:50
- It's actually *amortized* `O(1)`; when searching inside a tree you still get `O(log n)`. – Eugene Oct 12 '18 at 21:55
- It doesn't, not truly. But with an adequate hashing function, the probability of hash bucket collision is small enough that the *amortized* complexity can be considered to be _O(1)_. With a very bad hashing function, all objects may end up in the same bucket, and complexity will be _O(n)_. But with hashing collections we assume an adequate hashing function, and hence _O(1)_, because otherwise we wouldn't use them. – Andreas Oct 12 '18 at 22:01
- That's a linked list, not `LinkedList`. It's not bad hashing functions that are the real problem but maliciously selected keys (or elements for `HashSet`). Treeification only works if all the keys implement `Comparable` appropriately. – Tom Hawtin - tackline Oct 12 '18 at 22:30
1 Answer
The strategy is somewhat implementation specific, but in general, when a `HashMap` (and `HashSet` is based on one) reaches 64 entries overall and 8 entries in a single bucket, that bucket's linked list is transformed into a tree (a red-black tree). Until then, a `resize` happens instead: the bucket array is doubled in size, so an extra bit of the hash is taken into consideration when deciding where to place an entry. This is called rehashing, and it is done to try to spread the entries across more buckets.
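To see the degenerate case from the question in action, here is a small sketch with a hypothetical `BadKey` class whose `hashCode` is constant, so every element lands in the same bucket. Because `BadKey` implements `Comparable`, a Java 8+ `HashSet` can treeify that bucket and keep lookups at O(log n) instead of O(n); the class and numbers are illustrative, not from the answer.

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical key whose hashCode always collides,
// forcing every element into the same bucket.
class BadKey implements Comparable<BadKey> {
    final int id;
    BadKey(int id) { this.id = id; }

    @Override public int hashCode() { return 42; } // constant hash: all keys share one bucket
    @Override public boolean equals(Object o) {
        return o instanceof BadKey && ((BadKey) o).id == id;
    }
    @Override public int compareTo(BadKey other) { // lets Java 8+ build a balanced tree in the bucket
        return Integer.compare(id, other.id);
    }
}

public class CollisionDemo {
    public static void main(String[] args) {
        Set<BadKey> set = new HashSet<>();
        for (int i = 0; i < 10_000; i++) {
            set.add(new BadKey(i)); // all 10,000 entries collide into one bucket
        }
        // Lookups still succeed; in Java 8+ the treeified bucket makes them O(log n)
        System.out.println(set.size());                      // 10000
        System.out.println(set.contains(new BadKey(9_999))); // true
    }
}
```

Without the `Comparable` implementation the set still works correctly, but the bucket falls back to comparing identity hash codes for ordering, which is why Tom Hawtin's comment above stresses implementing `Comparable` appropriately.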

Eugene