O(N) is not the complete worst case. HashMap is only claimed to have constant-time operations, and even that is not strictly true: constant time on average is maintained by rehashing the map whenever its size exceeds a threshold. Have a look at the internal HashMap.addEntry method (this is the JDK 7 implementation):
void addEntry(int hash, K key, V value, int bucketIndex) {
    // Once size reaches the threshold (capacity * loadFactor) and the target
    // bucket is already occupied, the table is doubled and every key is
    // rehashed into the new, larger table.
    if ((size >= threshold) && (null != table[bucketIndex])) {
        resize(2 * table.length);
        hash = (null != key) ? hash(key) : 0;
        bucketIndex = indexFor(hash, table.length);
    }
    createEntry(hash, key, value, bucketIndex);
}
If size reaches the threshold and the target bucket is occupied, a complete rehash happens: the table is doubled and every existing entry is reinserted, which costs as much as building a new HashMap from the old one. So the worst case for a single insertion is

O(new HashMap(oldMap)) + O(N)

i.e. an O(N) rehash on top of the O(N) bucket traversal.
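As a minimal sketch of what this looks like in practice (the entry count is arbitrary), the program below times individual put calls on a map that starts at the default capacity; the slowest calls tend to be the ones that happen to trigger resize() and therefore pay for rehashing every existing entry:

import java.util.HashMap;
import java.util.Map;

public class RehashSpikeDemo {
    public static void main(String[] args) {
        Map<Integer, Integer> map = new HashMap<>(); // default capacity 16, load factor 0.75
        long slowest = 0;
        for (int i = 0; i < 2_000_000; i++) {
            long start = System.nanoTime();
            map.put(i, i);
            long elapsed = System.nanoTime() - start;
            if (elapsed > slowest) {
                slowest = elapsed;
                // New "slowest put so far" values tend to show up around the
                // points where the table is doubled and all entries are rehashed.
                System.out.println("put #" + i + " took " + elapsed + " ns");
            }
        }
    }
}

If those occasional spikes matter, they can be avoided by sizing the map up front, e.g. new HashMap<>((int) (expectedEntries / 0.75f) + 1), so that size never reaches the threshold and resize() is never called.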
The O(N) part can only happen if you have overridden hashCode() badly, so that it has poor distribution and many keys land in the same bucket. With the default implementation this can't happen. The only real danger is the rehashing.
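To see that degenerate case, here is a sketch with a hypothetical BadKey class whose hashCode() is deliberately constant, so every entry lands in the same bucket and each operation has to search that one bucket (a linked list in the JDK 7 code quoted above; JDK 8+ partially mitigates this by treeifying long buckets, but the slowdown relative to well-distributed keys is still visible):

import java.util.HashMap;
import java.util.Map;

public class BadHashDemo {
    // Hypothetical key with a deliberately poor hashCode(): every instance
    // collides, so all entries end up in a single bucket.
    static final class BadKey {
        final int id;
        BadKey(int id) { this.id = id; }

        @Override
        public boolean equals(Object o) {
            return o instanceof BadKey && ((BadKey) o).id == id;
        }

        @Override
        public int hashCode() {
            return 42; // constant hash defeats the bucket distribution
        }
    }

    public static void main(String[] args) {
        Map<BadKey, Integer> map = new HashMap<>();
        int n = 20_000;
        long start = System.nanoTime();
        for (int i = 0; i < n; i++) {
            map.put(new BadKey(i), i); // every put searches the same overcrowded bucket
        }
        long ms = (System.nanoTime() - start) / 1_000_000;
        System.out.println(n + " puts with colliding keys took " + ms + " ms");
    }
}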