For example, when writing a dictionary class: hash collisions are rare, but they do exist, so you normally have to store the full key alongside each value to confirm that the entry you find in the hashtable really belongs to your key and isn't a collision.
Sometimes the keys are long, and they're usually strings, so each key can take over 40 bytes, compared to just 4 bytes for a hash code. What if, instead of the key, you stored a second hash of it, computed with a slightly different algorithm (say, different prime constants)? Then the chance of two keys colliding on both hashes would be (1/(2^32)) * (1/(2^32)) = 1/(2^64).
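To make it concrete, here's roughly what I mean — just a sketch, not real code. All the names here are made up, and I'm assuming String keys, a power-of-two table that never fills up, no deletions, and FNV-1a as the independent second hash:

```java
import java.util.Objects;

/** Sketch: an open-addressed map that stores a second 32-bit hash
 *  in place of the key. Hypothetical names throughout. */
final class HashOnlyMap<V> {
    private final int[] verify;     // secondary hash, stored instead of the key
    private final Object[] values;  // null marks an empty slot (so null values are disallowed)
    private final int mask;

    HashOnlyMap(int capacityPow2) { // capacity must be a power of two
        verify = new int[capacityPow2];
        values = new Object[capacityPow2];
        mask = capacityPow2 - 1;
    }

    // The key's own hashCode() picks the bucket; FNV-1a plays the
    // independent second hash that we store for verification.
    private static int fnv1a(String key) {
        int h = 0x811c9dc5;
        for (int i = 0; i < key.length(); i++) {
            h = (h ^ key.charAt(i)) * 0x01000193;
        }
        return h;
    }

    public void put(String key, V value) {
        Objects.requireNonNull(value);
        int i = key.hashCode() & mask;
        int v = fnv1a(key);
        // Linear probing; a different key with the same 64-bit fingerprint
        // would silently overwrite — that's exactly the accepted risk.
        while (values[i] != null && verify[i] != v) {
            i = (i + 1) & mask;
        }
        verify[i] = v;
        values[i] = value;
    }

    @SuppressWarnings("unchecked")
    public V get(String key) {
        int i = key.hashCode() & mask;
        int v = fnv1a(key);
        while (values[i] != null) {
            if (verify[i] == v) {
                return (V) values[i]; // matched both hashes: almost certainly our key
            }
            i = (i + 1) & mask;
        }
        return null;
    }
}
```

The point is that get() never compares keys at all; matching on both 32-bit hashes is treated as "same key".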
You could even add a third hashing algorithm and store that hash as well, bringing the chance of a full collision down to (1/(2^32)) * (1/(2^32)) * (1/(2^32)) = 1/(2^96). Obviously a collision could STILL happen, but the chances are so low, and you save so much memory by only having to store 4 bytes per key instead of 40 or more.
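To put a number on it (back-of-the-envelope, assuming both hashes behave uniformly and independently): with n keys in the table there are about n^2/2 pairs, so the expected number of pairs colliding on both 32-bit hashes is roughly (n^2/2) * (1/(2^64)). Even at n = 1,000,000 keys that's about 5*10^11 / 1.8*10^19 ≈ 3*10^-8 per table — you'd expect to build tens of millions of such tables before ever seeing a single false match.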
I guess this is still not acceptable, right, because there's still a CHANCE? But there's also a chance someone's RAM accidentally flips a bit and bluescreens the machine, and a false match seems so unlikely that it's tempting not to store the keys at all. Are there any alternatives, or is the small chance still not worth it?