I have a Matrix class that takes a generic type parameter which extends Number.
For example:
public class Matrix<T extends Number>
I am trying to compare two matrices which have the same values:
Matrix:
row=[0] 273 455
row=[1] 243 235
row=[2] 244 205
row=[3] 102 160
and
Matrix:
row=[0] 273 455
row=[1] 243 235
row=[2] 244 205
row=[3] 102 160
In the Matrix class, I have an equals method which looks like this:
public boolean equals(Object obj) {
    if (obj == null)
        return false;
    if (!(obj instanceof Matrix))
        return false;
    Matrix<T> m = (Matrix<T>) obj;
    if (this.rows != m.rows)
        return false;
    if (this.cols != m.cols)
        return false;
    for (int i = 0; i < matrix.length; i++) {
        T t1 = matrix[i];
        T t2 = m.matrix[i];
        if (!t1.equals(t2))
            return false;
    }
    return true;
}
This line is failing:
t1.equals(t2)
even when the two numbers are equal, e.g. 273 and 273.
When I debug the equals method, I can see it fails because the stored values are actually Longs, so the call dispatches to Long.equals:
This is from the JDK's Long class:
public boolean equals(Object obj) {
    if (obj instanceof Long) {
        return value == ((Long)obj).longValue();
    }
    return false;
}
Essentially, it fails because the obj isn't an instance of Long.
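The same mismatch is easy to reproduce on its own, outside the Matrix class (the class name below is just for the demo). A boxed Long is never equal to a boxed Integer, even when they hold the same numeric value, because each wrapper's equals only accepts its own type:
public class BoxedEqualsDemo {
    public static void main(String[] args) {
        Long a = 273L;
        Integer b = 273;
        System.out.println(a.equals(b));                     // false: b is not an instance of Long
        System.out.println(b.equals(a));                     // false: a is not an instance of Integer
        System.out.println(a.longValue() == b.longValue());  // true: the primitive values match
    }
}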
I can easily change my equals method to do:
if (t1.longValue() != t2.longValue())
    return false;
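In context, the comparison loop would then look something like this (just a sketch; it assumes the values are integral and fit in a long, which is true for the Integer matrices below but would drop the fractional part for Double or Float):
for (int i = 0; i < matrix.length; i++) {
    // Compare the primitive values rather than the boxed objects,
    // so a Long 273 and an Integer 273 count as equal.
    if (matrix[i].longValue() != m.matrix[i].longValue())
        return false;
}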
But I am wondering what the correct way to check for equality is in this situation, and why the equals method on the generic T ends up comparing against a Long.
EDIT:
My testing code declares the matrices as Matrix<Integer>, which makes it strange to me that the equality testing ends up comparing Longs.
Testing code:
Matrix<Integer> matrix1 = new Matrix<Integer>(4, 3);
matrix1.set(0, 0, 14);
matrix1.set(0, 1, 9);
matrix1.set(0, 2, 3);
matrix1.set(1, 0, 2);
matrix1.set(1, 1, 11);
matrix1.set(1, 2, 15);
matrix1.set(2, 0, 0);
matrix1.set(2, 1, 12);
matrix1.set(2, 2, 17);
matrix1.set(3, 0, 5);
matrix1.set(3, 1, 2);
matrix1.set(3, 2, 3);
Matrix<Integer> matrix2 = new Matrix<Integer>(3, 2);
matrix2.set(0, 0, 12);
matrix2.set(0, 1, 25);
matrix2.set(1, 0, 9);
matrix2.set(1, 1, 10);
matrix2.set(2, 0, 8);
matrix2.set(2, 1, 5);
Matrix<Integer> result1 = new Matrix<Integer>(4,2);
result1.set(0, 0, 273);
result1.set(0, 1, 455);
result1.set(1, 0, 243);
result1.set(1, 1, 235);
result1.set(2, 0, 244);
result1.set(2, 1, 205);
result1.set(3, 0, 102);
result1.set(3, 1, 160);
Matrix<Integer> matrix3 = matrix1.multiply(matrix2);
if (!matrix3.equals(result1)) {
    System.err.println("Matrix multiplication error. matrix3=" + matrix3 + " result1=" + result1);
    return false;
}
Here is the link to the Matrix code without the equals() method defined. I haven't checked in the equals() code yet.
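For what it's worth, my guess is that the Longs come from multiply itself: if it accumulates each dot product in a long and autoboxes the result, the unchecked cast to T would happily store Long objects inside a Matrix<Integer>. Something along these lines, purely as a sketch of what I suspect the linked code does (the get accessor and the exact loop structure are assumptions on my part):
public Matrix<T> multiply(Matrix<T> other) {
    Matrix<T> result = new Matrix<T>(this.rows, other.cols);
    for (int r = 0; r < this.rows; r++) {
        for (int c = 0; c < other.cols; c++) {
            long sum = 0;
            for (int k = 0; k < this.cols; k++) {
                sum += this.get(r, k).longValue() * other.get(k, c).longValue();
            }
            // Autoboxing produces a Long, and the unchecked cast to T lets it
            // into a Matrix<Integer>, which would explain why equals later sees Longs.
            result.set(r, c, (T) Long.valueOf(sum));
        }
    }
    return result;
}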