
I am trying Java lambda expressions to remove duplicates without using distinct.

Here is my solution:

    public static List<Integer> dropDuplicates(List<Integer> list) {
        return list
                .stream()
                .collect(Collectors.groupingBy(Function.identity()))
                .values()
                .stream()
                .map(v -> v.stream().findFirst().get())
                .collect(toList());
    }

It works fine, but the order of the elements changes.

    List<Integer> list = Arrays.asList(11, 12, 1, 2, 2, 3, 12, 4, 13, 4, 13);
    output => [1, 2, 3, 4, 11, 12, 13]

I am a bit new to Java functional programming (maybe this is a stupid question). Is there any way to preserve the order of the list elements, or a better way to do this?

Sky
    Why would you do this without using `distinct`? – marstran Feb 08 '21 at 09:57
  • ...and why would you want to specifically use `Stream` for this? – Naman Feb 08 '21 at 10:27
  • @marstran I am just exploring JAVA lambda expressions. I can do this in single line in Scala: list.foldLeft[List[Int]](Nil)((acc, element) => if (acc.contains(element)) acc else acc :+ element) Just Thinking Java. – Sky Feb 08 '21 at 12:10
  • @Naman Is there other way to do it ? – Sky Feb 08 '21 at 12:12

2 Answers


Your issue comes from the fact that the Map interface makes no guarantees about the order of its elements. If you care about the order, you need to make sure that an ordered implementation is used (like LinkedHashMap). The following implementation preserves the order:

public static List<Integer> dropDuplicates(List<Integer> list) {
    return list
            .stream()
            .collect(Collectors.groupingBy(Function.identity(), LinkedHashMap::new, Collectors.toList()))
            .values()
            .stream()
            .map(v -> v.stream().findFirst().get())
            .collect(Collectors.toList());
}
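For example, running the same pipeline on the list from the question (a minimal standalone sketch; the class name is arbitrary) keeps the encounter order:

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;

public class OrderDemo {
    public static void main(String[] args) {
        List<Integer> list = Arrays.asList(11, 12, 1, 2, 2, 3, 12, 4, 13, 4, 13);
        // groupingBy with a LinkedHashMap supplier inserts keys in encounter order,
        // so values() iterates in the order each element first appeared
        List<Integer> result = list.stream()
                .collect(Collectors.groupingBy(Function.identity(), LinkedHashMap::new, Collectors.toList()))
                .values()
                .stream()
                .map(v -> v.get(0))
                .collect(Collectors.toList());
        System.out.println(result); // [11, 12, 1, 2, 3, 4, 13]
    }
}
```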

Of course, using distinct() or what Ernest suggested would be a way simpler solution. Just wanted to build on what you've done so far.
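For comparison, here is what the distinct() version would look like (a sketch; on a sequential stream from a List, distinct() is documented to keep the first occurrence in encounter order):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class DistinctDemo {
    public static void main(String[] args) {
        List<Integer> list = Arrays.asList(11, 12, 1, 2, 2, 3, 12, 4, 13, 4, 13);
        // distinct() keeps the first occurrence of each element, in encounter order
        List<Integer> result = list.stream()
                .distinct()
                .collect(Collectors.toList());
        System.out.println(result); // [11, 12, 1, 2, 3, 4, 13]
    }
}
```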

Amongalen

You can use the filter function with a predicate that keeps track of which elements you have already seen:

public <T> Predicate<T> distinct() {
    final Set<T> seen = new HashSet<>();

    return t -> {
        if (seen.contains(t)) {
            return false;
        }

        seen.add(t);
        return true;
    };
}

Now you can do this:

list.stream()
    .filter(distinct())
    .collect(toList());
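Since `Set.add` already returns whether the element was newly inserted, the whole predicate can be collapsed to a method reference. A minimal standalone sketch of that idea (class and variable names are my own):

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

public class SeenDemo {
    public static void main(String[] args) {
        List<Integer> list = Arrays.asList(11, 12, 1, 2, 2, 3, 12, 4, 13, 4, 13);
        Set<Integer> seen = new HashSet<>();
        // Set.add returns false for elements already in the set,
        // so the filter drops every duplicate after its first occurrence
        List<Integer> result = list.stream()
                .filter(seen::add)
                .collect(Collectors.toList());
        System.out.println(result); // [11, 12, 1, 2, 3, 4, 13]
    }
}
```

Note that a stateful predicate like this only behaves predictably on a sequential stream; with a parallel stream, the shared HashSet would need to be thread-safe and the result order would not be guaranteed.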
marstran