This is the next step up from the Element Distinctness Problem, which is discussed thoroughly in this thread: Find duplicates in an array. That thread also covers lower bounds for the problem: you cannot do better than O(n log n) without involving a hash set.
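For contrast, here is a minimal sketch of what the hash-set route looks like (not what this answer recommends; the method name is just illustrative). A LinkedHashSet drops duplicates as they are inserted, keeps the first-seen order, and gives expected O(n) time at the cost of extra memory:

import java.util.LinkedHashSet;
import java.util.Set;

// Returns the distinct values of arr in first-seen order.
public static int[] uniquesWithSet(int[] arr) {
    Set<Integer> seen = new LinkedHashSet<>();
    for (int x : arr) seen.add(x);          // duplicates are simply ignored
    int[] res = new int[seen.size()];
    int i = 0;
    for (int x : seen) res[i++] = x;
    return res;
}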
If you are unwilling to use a hash set to keep track of the elements you have already seen, your best bet is to sort the array and then iterate over it - after sorting, all duplicate elements are adjacent to each other.
import java.util.Arrays;

public static int[] justUniques(int[] arr) {
    if (arr == null || arr.length == 0) return arr;
    Arrays.sort(arr);                 // note: this sorts (mutates) the caller's array
    // First pass: count the distinct values (duplicates are adjacent after sorting).
    int n = 1;
    for (int i = 1; i < arr.length; i++) {
        if (arr[i] != arr[i - 1]) n++;
    }
    // Second pass: copy one representative of each value into the result.
    int[] res = new int[n];
    res[0] = arr[0];
    n = 1;
    for (int i = 1; i < arr.length; i++) {
        if (arr[i] != arr[i - 1]) res[n++] = arr[i];
    }
    return res;
}
Note that a simple variation of the above can also do it in-place, without creating a new array.
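Here is a minimal sketch of that in-place variant, assuming the caller is happy to have the input array sorted and to read only the first count entries afterwards (the method name is just illustrative):

// Sorts arr, compacts the unique values to the front, and returns how many there are.
public static int uniqueInPlace(int[] arr) {
    if (arr == null || arr.length == 0) return 0;
    Arrays.sort(arr);
    int count = 1;                              // arr[0] is always kept
    for (int i = 1; i < arr.length; i++) {
        if (arr[i] != arr[count - 1]) {         // a value we have not kept yet
            arr[count++] = arr[i];
        }
    }
    return count;                               // uniques occupy arr[0..count-1]
}

The caller then treats arr[0..count-1] as the deduplicated result, or calls Arrays.copyOf(arr, count) if a trimmed array is needed.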
This solution is O(n log n), and is therefore optimal under the no-hash-set restriction. If you are also unwilling to use Arrays.sort(), you can implement your own sorting algorithm (it's fairly easy) - a sketch of one follows.
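For example, a plain top-down merge sort keeps the overall O(n log n) bound; this is only a sketch under the assumption that a recursive helper is acceptable (the method name is illustrative):

// Sorts the half-open range a[lo..hi); call as mergeSort(arr, 0, arr.length).
static void mergeSort(int[] a, int lo, int hi) {
    if (hi - lo < 2) return;                   // 0 or 1 elements: already sorted
    int mid = (lo + hi) >>> 1;
    mergeSort(a, lo, mid);
    mergeSort(a, mid, hi);
    // Merge the two sorted halves into a temporary buffer, then copy back.
    int[] tmp = new int[hi - lo];
    int i = lo, j = mid, k = 0;
    while (i < mid && j < hi) tmp[k++] = a[i] <= a[j] ? a[i++] : a[j++];
    while (i < mid) tmp[k++] = a[i++];
    while (j < hi)  tmp[k++] = a[j++];
    System.arraycopy(tmp, 0, a, lo, tmp.length);
}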
Another related thread asks a similar question with an additional restriction: Removing the duplicates from an array without disturbing the order of elements without using Sets