I have a set of numbers from 1 to 100,000,000,000, but no more than 50,000 of them in total. It seems possible to create an array of objects with 50,000 elements, so I could take an element from the array by index, which is very quick (if I had all the numbers from 1 to 50,000). Is it possible to use some function that maps the numbers into the range from 1 to 50,000? In that case, to find an object I would only have to calculate the index!
-
http://en.wikipedia.org/wiki/Perfect_hash – Oliver Charlesworth Sep 20 '13 at 07:24
-
Use modular arithmetic. That way you can map every number from 1 to 100,000,000,000 to a number between 1 and 50,000. – Jakob Sep 20 '13 at 07:28
-
If you add the 50,000 numbers into an array in sorted order, you can use binary search to find whether a given number is in the array in about 16 steps. Is that what you're asking about? – Joni Sep 20 '13 at 07:28
-
@Jakob: I suspect the OP is looking for a one-to-one mapping... – Oliver Charlesworth Sep 20 '13 at 07:29
-
@Jakob Plus something like linear probing, otherwise you'd get way too many collisions. – William Gaul Sep 20 '13 at 07:31
-
Yes, I'm looking for a one-to-one mapping. Is it possible? – user710818 Sep 20 '13 at 07:39
-
I think it is not possible. That would mean finding the element in constant time. Can you find an element's location in the array without comparing it with others? Surely not. So it will be at least linear. – WSS Sep 20 '13 at 08:04
-
It's called **hashing**, and in Java: `new HashMap<>(60000)`. An initial capacity a bit more than 50,000 will allow for imperfectness (be faster). – Joop Eggen Sep 20 '13 at 08:29
-
A one-to-one mapping of those ranges is not possible. See: [Pigeonhole principle](http://en.wikipedia.org/wiki/Pigeonhole_principle). You'll have to deal with collisions. – Blastfurnace Sep 20 '13 at 11:05
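Joni's binary-search suggestion from the comments above can be sketched in a few lines; `Arrays.binarySearch` needs roughly log2(50,000) ≈ 16 comparisons on a sorted array (the sample values here are just placeholders):

```java
import java.util.Arrays;

public class BinarySearchDemo {
    public static void main(String[] args) {
        // Sorted array holding up to 50,000 values drawn from 1..100,000,000,000
        long[] values = {7L, 42L, 1_000_000_000L, 99_999_999_999L};

        // Returns the index of the key if present, or a negative value if absent
        int idx = Arrays.binarySearch(values, 1_000_000_000L);
        System.out.println(idx);      // index of the match

        boolean absent = Arrays.binarySearch(values, 5L) < 0;
        System.out.println(absent);   // true: 5 is not in the array
    }
}
```

The array must be kept sorted for this to work; if a parallel array of objects is sorted the same way, the returned index locates the object as well.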
1 Answer
What you're describing is a hashmap. It uses pairs where the first value is the "Key" it is referenced by and the second is the "Value" that is actually stored.
//49,999 is the initial size and .8F (80%) is the load factor
//(Note: Java's HashMap rounds the capacity up to a power of two internally)
HashMap<Integer, Long> hashMap = new HashMap<>(49999, .8F);
//Save a first value of 1,000,000,000
Integer intKey = 1;
Long lngValue = 1000000000L;
hashMap.put(intKey, lngValue);
//Retrieve first value
lngValue = hashMap.get(intKey);
//Retrieve a list of all the Keys saved in the map
Integer[] intKeys = hashMap.keySet().toArray(new Integer[0]);
//Retrieve a list of all the saved Values
Long[] lngValues = hashMap.values().toArray(new Long[0]);
Normally you'll want the load factor to be around 70% to 80% so there is plenty of room for it to work efficiently. A higher load factor results in less wasted memory at the expense of performance when it starts filling up.
Derek Banas on YouTube has some great tutorials about how hash functions work and how to use them. They're definitely worth checking out if you aren't too familiar with them yet.
For what you need, a sparse array is another option that may be more efficient here. I'm not too familiar with them, but from what I understand it's similar to a linked list. Android has an implementation you may be able to use, and the Trove collections have also been suggested in another question.
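If sorted lookups are acceptable, a plain-JDK sketch in the same spirit (this is a `TreeMap`, not Android's `SparseArray`, used only to illustrate the idea) could look like:

```java
import java.util.TreeMap;

public class SparseLookupDemo {
    public static void main(String[] args) {
        // TreeMap keeps keys sorted and answers get/containsKey in O(log n),
        // about 16 comparisons for 50,000 entries
        TreeMap<Long, String> map = new TreeMap<>();
        map.put(1_000_000_000L, "first");
        map.put(99_999_999_999L, "second");

        System.out.println(map.get(1_000_000_000L)); // the stored value
        System.out.println(map.containsKey(5L));     // false: key not present
    }
}
```

Unlike a hash map it wastes no capacity on empty slots, at the cost of logarithmic rather than (amortized) constant-time lookups.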