When I tried this:
import java.util.HashSet;

HashSet<Double> set = new HashSet<>();
Double d1 = new Double(0);
Double d2 = new Double(0);
Double d3 = new Double(-0);
set.add(d1);
System.out.println(set.contains(d2));
System.out.println(set.contains(d3));
The output was what I expected:
true
true
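(I suspect the first snippet behaves this way because -0 is the int literal 0, which widens to +0.0, so d3 never actually holds negative zero. Printing the raw bits seems to confirm it:)

// +0.0 has all-zero bits; -0.0 has only the sign bit set.
System.out.println(Double.doubleToLongBits(new Double(-0)));   // 0
System.out.println(Double.doubleToLongBits(new Double(-0.0))); // -9223372036854775808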
But when I tried:
HashSet<Double> set = new HashSet<>();
Double d1 = new Double(0.0);
Double d2 = new Double(0.0);
Double d3 = new Double(-0.0);
set.add(d1);
System.out.println(set.contains(d2));
System.out.println(set.contains(d3));
or, equivalently, using Double.valueOf instead of the constructor:

set.add(Double.valueOf(0.0));
System.out.println(set.contains(Double.valueOf(0.0)));
System.out.println(set.contains(Double.valueOf(-0.0)));
To my surprise, the output was:
true
false
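To rule out HashSet itself, I also compared the boxed values directly (just a sanity check, using the same d2 and d3 as above):

System.out.println(d2.equals(d3));                         // false
System.out.println(d2.doubleValue() == d3.doubleValue());  // true

So the primitive comparison says they are equal, but the boxed equals disagrees.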
Why did this happen? How do I make HashSet treat 0.0 and -0.0 as the same value?
Is there a better way than checking if (num == -0.0) num = 0.0; before every insertion and lookup?
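For reference, this is the full workaround I have in mind, wrapped in a helper (normalize is just a name I made up):

// Collapses -0.0 to +0.0 before the value touches the set.
// 0.0 == -0.0 is true for primitives, so the branch also matches
// +0.0, where the assignment is a harmless no-op.
static double normalize(double num) {
    if (num == -0.0) num = 0.0;
    return num;
}

set.add(normalize(0.0));
System.out.println(set.contains(normalize(-0.0))); // true

It works, but it seems easy to forget at one of the call sites.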