15

According to this question, a .NET dictionary resizes its allocated space to a prime number that is at least twice the current size. Why is it important to use prime numbers rather than simply doubling the current size? (I tried to use my google-fu powers to find an answer, but to no avail.)

Theodor Zoulias
maayank

3 Answers

17

The bucket in which an element is put is determined by `(hash & 0x7FFFFFFF) % capacity`. For good performance this needs to be uniformly distributed. If multiple entries have hashes that are multiples of a certain base (`hash1 = x1 * base`, `hash2 = x2 * base`, ...) and base and capacity aren't coprime (greatest common divisor > 1), then some buckets are overused and some are never used. Since a prime number is coprime to every number except its own multiples, a prime capacity has relatively good chances of achieving a good distribution.
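A quick sketch of that effect (in Python, not the actual .NET source): when every hash shares a factor with the capacity, only a fraction of the buckets ever get used, while a prime capacity uses them all.

```python
def bucket(h, capacity):
    # Mask off the sign bit, then reduce modulo the table size,
    # mirroring the (hash & 0x7FFFFFFF) % capacity formula above.
    return (h & 0x7FFFFFFF) % capacity

def buckets_used(hashes, capacity):
    return len({bucket(h, capacity) for h in hashes})

hashes = [x * 6 for x in range(100)]  # every hash is a multiple of 6

# gcd(6, 12) = 6 > 1: only 12 / 6 = 2 distinct buckets are ever used.
print(buckets_used(hashes, 12))  # -> 2

# 13 is prime, so gcd(6, 13) = 1: all 13 buckets get used.
print(buckets_used(hashes, 13))  # -> 13
```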

One particularly nice property of this is that for capacity > 30 the contribution of each bit of the hash to the bucket index is different. So if the variation of the hash is concentrated in only a few bits, it will still lead to a good distribution. This explains why capacities which are powers of two are bad: they mask out the high bits, and a set of numbers that differ only in the high bits isn't that unlikely.
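The masking effect can be sketched like this (again Python, not the .NET implementation): a power-of-two capacity collapses hashes that differ only in their high bits into a single bucket, whereas a nearby prime spreads them out.

```python
def bucket(h, capacity):
    return (h & 0x7FFFFFFF) % capacity

# Hashes that are identical in the low 4 bits and differ only above bit 4:
hashes = [(i << 4) | 0b0101 for i in range(64)]

# capacity = 16 = 2^4 looks only at the low 4 bits: everything collides
# in bucket 5, because the high bits are masked out by the modulo.
print(len({bucket(h, 16) for h in hashes}))  # -> 1

# capacity = 17 (prime) folds the high bits back into the bucket index:
# with 64 hashes, every one of the 17 buckets gets used.
print(len({bucket(h, 17) for h in hashes}))  # -> 17
```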

Personally I think they chose that function badly. It contains an expensive modulo operation, and its performance breaks down if the entries are multiples of the prime capacity. But it seems to be good enough for most applications.
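That breakdown is easy to demonstrate (a Python sketch, not the .NET code): if every hash happens to be a multiple of the prime capacity itself, the modulo sends them all to the same bucket.

```python
def bucket(h, capacity):
    return (h & 0x7FFFFFFF) % capacity

capacity = 31  # a prime table size
hashes = [x * capacity for x in range(100)]

# Every hash is a multiple of the capacity, so h % capacity == 0 for all
# of them: 100 entries pile up in a single bucket.
print({bucket(h, capacity) for h in hashes})  # -> {0}
```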

CodesInChaos
11

It is an implementation detail of the algorithm, related to choosing a hash function that provides a uniform distribution. A non-uniform distribution increases the number of collisions and the cost of resolving them.

Darin Dimitrov
    Choosing a prime number does **not** provide uniform distribution, no need to oversimplify. With `hashsize = prime_number` you have exactly the same chance of getting collisions as with `hashsize = 2^k` or any other size. It's just that some hash sizes make collisions look 'unpredictable', 'random' or 'uniformly distributed'. On the other hand, having `hashsize = 2^k` means that any hash function based on xor will suck. – Nikita Rybak Jan 09 '11 at 11:19
5

Because of the mathematics of prime numbers: they cannot be factored into smaller numbers. When you reduce the hashes of the stored items modulo the capacity, you thus get an even distribution. If the capacity were not a prime number then, depending on the objects, the distribution might not be even.
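A sketch of the factoring argument in Python (hypothetical helper names, not from any .NET source): stepping through a table in a fixed stride visits `capacity // gcd(step, capacity)` distinct slots before cycling, so a prime capacity visits them all.

```python
from math import gcd

def slots_visited(step, capacity):
    # Walk the table in strides of `step`, counting distinct slots
    # before the walk returns to a slot it has already seen.
    seen, slot = set(), 0
    while slot not in seen:
        seen.add(slot)
        slot = (slot + step) % capacity
    return len(seen)

# Composite capacity sharing a factor with the stride: only 5 of 10 slots.
print(slots_visited(4, 10))  # -> 5

# Prime capacity: all 11 slots are visited.
print(slots_visited(4, 11))  # -> 11

# Sanity check against the closed form capacity // gcd(step, capacity):
assert slots_visited(4, 10) == 10 // gcd(4, 10)
```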

TomTom