
I have a HashMap similar to this:

 HashMap<Integer, Integer> map = new HashMap<Integer, Integer>();
 map.put(3, 2);
 map.put(7, 2);
 map.put(5, 1);
 map.put(10, 4);

I need to sort it first by value and then by key in case multiple keys share the same value.

The result should look like this:

(5, 1)
(3, 2)
(7, 2)
(10, 4)

Any suggestions, please?

I'm comparing the values first and the keys only in case of a duplicate value, so I'm using both the keys and the values, not just the values.

  • You have to use a LinkedHashMap - because HashMap cannot maintain an order. – Ari Singh Feb 15 '18 at 23:48
  • @AriSingh not quite a duplicate as there's a subtle difference with this question I believe. – Ousmane D. Feb 15 '18 at 23:48
  • @Aominè How is it different - Just because the values are duplicate ? Duplicate values do not change the answer/solution. Just the comparator has to do one more check. – Ari Singh Feb 15 '18 at 23:50

1 Answer


You can achieve this with the Stream API like so:

import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

LinkedHashMap<Integer, Integer> resultSet =
  map.entrySet().stream()
     // sort by value first, then by key when two values are equal
     .sorted(Map.Entry.<Integer, Integer>comparingByValue()
                      .thenComparing(Map.Entry.comparingByKey()))
     // collect into a LinkedHashMap so the sorted order is preserved
     .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue,
             (oldValue, newValue) -> oldValue, LinkedHashMap::new));

The sorted intermediate operation takes a Comparator instance, which in this case is:

Map.Entry.<Integer, Integer>comparingByValue()
                                 .thenComparing(Map.Entry.comparingByKey())

i.e. compare by the map values and, if two values are equal, compare by the keys.
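
If it helps to see the comparator in isolation, here is a minimal sketch (not part of the original answer; the variable name entries is mine) that applies the same comparator to a plain list built from the map in the question:

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Sort the entries of the question's map into a list using the same comparator
List<Map.Entry<Integer, Integer>> entries = new ArrayList<>(map.entrySet());
entries.sort(Map.Entry.<Integer, Integer>comparingByValue()
                      .thenComparing(Map.Entry.comparingByKey()));
// entries now holds [5=1, 3=2, 7=2, 10=4]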

Moving on to the collect terminal operation:

Map.Entry::getKey is the keyMapper, a mapping function that produces the map keys.

Map.Entry::getValue is the valueMapper, a mapping function that produces the map values.

(oldValue, newValue) -> oldValue is the merge function used to resolve collisions between values associated with the same key.

LinkedHashMap::new provides a new empty Map into which the results will be inserted. We specified a LinkedHashMap here to maintain insertion order.


Note that the merge function (oldValue, newValue) -> oldValue is effectively a no-op in this case, because the source of the stream (map.entrySet()) can never contain duplicate keys. We still have to supply it, though, since there is no toMap overload that accepts a map factory without a merge function, and the map factory is what lets us request a LinkedHashMap to maintain insertion order.
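
If the dummy merge function bothers you, one alternative (a sketch of my own, not part of the original answer; the variable name result is mine) is to skip toMap and insert the sorted entries into the LinkedHashMap yourself:

// Build the LinkedHashMap manually, so no merge function is needed
LinkedHashMap<Integer, Integer> result = new LinkedHashMap<>();
map.entrySet().stream()
   .sorted(Map.Entry.<Integer, Integer>comparingByValue()
                    .thenComparing(Map.Entry.comparingByKey()))
   .forEachOrdered(e -> result.put(e.getKey(), e.getValue()));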

You may also want to look at the toMap method for further information on how it works.
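
As a quick check (my addition, not from the original answer), printing the collected map reproduces the ordering asked for in the question:

// Expected output: (5, 1) (3, 2) (7, 2) (10, 4), each on its own line
resultSet.forEach((k, v) -> System.out.println("(" + k + ", " + v + ")"));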

Ousmane D.