From *Fluent Python*:

> To fetch the value at `my_dict[search_key]`, Python calls `hash(search_key)` to obtain the hash value of `search_key` and uses the least significant bits of that number as an offset to look up a bucket in the hash table (the number of bits used depends on the current size of the table). If the found bucket is empty, `KeyError` is raised.
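If I'm reading this correctly, the lookup computes something like the following (a minimal sketch with my own variable names, assuming a power-of-two table size; CPython's real implementation is in C and has more machinery for collision handling):

```python
# Sketch of what I understand the passage to mean (not CPython's actual code).
# With a table of 8 buckets, masking keeps only the 3 least significant bits
# of the hash, which gives the index of the bucket to inspect.
table_size = 8              # assumed: table sizes are powers of two
mask = table_size - 1       # 0b111

key = "search_key"
h = hash(key)               # may be a large (even negative) integer
index = h & mask            # the "offset": which of the 8 buckets to look in

print(f"hash={h}  low bits={h & mask:03b}  bucket index={index}")
```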
If only the least significant bits of the hash value are used, is it possible for a lookup to land on an empty bucket and mistakenly raise `KeyError`, even though the key is actually stored in a different bucket whose index shares those same least significant bits, simply because the empty bucket was encountered first?
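For concreteness, here is the kind of low-bit collision I mean (relying on the fact that small integers hash to themselves in CPython):

```python
# Two distinct keys whose hashes agree in the 3 least significant bits,
# so both would map to bucket 1 of a size-8 table.
mask = 0b111                # table_size 8, minus 1
print(hash(1) & mask)       # 1
print(hash(9) & mask)       # 1 -- same low bits, same bucket
```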
What does using the low bits of the hash value as an "offset" mean in this context? Please provide an example.