I had some code I was profiling and was surprised at how much time was being spent on Math.min(float, float).
In my use case I needed to get the min of 3 float values, where each value is guaranteed not to be NaN or any other edge-case float value.
My original method was:
private static float min2(float v1, float v2, float v3) {
    return Math.min(Math.min(v1, v2), v3);
}
But I found that this was about 5x faster:
private static float min1(float v1, float v2, float v3) {
    if (v1 < v2 && v1 < v3) {
        return v1;
    } else if (v2 < v3) {
        return v2;
    } else {
        return v3;
    }
}
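For ordinary finite values like the ones in my use case, the two methods return the same result. A quick sanity check along these lines (illustrative only, not code from my project) backs that up:

import java.util.Random;

public class MinSanityCheck {
    private static float min1(float v1, float v2, float v3) {
        if (v1 < v2 && v1 < v3) return v1;
        else if (v2 < v3) return v2;
        else return v3;
    }

    private static float min2(float v1, float v2, float v3) {
        return Math.min(Math.min(v1, v2), v3);
    }

    public static void main(String[] args) {
        Random rng = new Random(42);
        // Compare the two implementations on a million random triples
        // in the 0..3000 range, matching my app's value range.
        for (int i = 0; i < 1000000; i++) {
            float v1 = rng.nextFloat() * 3000f;
            float v2 = rng.nextFloat() * 3000f;
            float v3 = rng.nextFloat() * 3000f;
            if (min1(v1, v2, v3) != min2(v1, v2, v3)) {
                throw new AssertionError("mismatch at " + v1 + ", " + v2 + ", " + v3);
            }
        }
        System.out.println("min1 and min2 agree on all tested triples");
    }
}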
For reference, this is the code for Math.min:
public static float min(float f1, float f2) {
    if (f1 > f2) {
        return f2;
    }
    if (f1 < f2) {
        return f1;
    }
    /* if either arg is NaN, return NaN */
    if (f1 != f2) {
        return Float.NaN;
    }
    /* min(+0.0,-0.0) == -0.0 */
    /* 0x80000000 == Float.floatToRawIntBits(-0.0f) */
    if (Float.floatToRawIntBits(f1) == 0x80000000) {
        return -0.0f;
    }
    return f2;
}
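Those extra branches for NaN and signed zero are exactly what min1 skips, which is why min1 is only safe given my input guarantees. A small illustration of where the two approaches diverge (hypothetical values, not ones my app can produce):

public class MinEdgeCases {
    private static float min1(float v1, float v2, float v3) {
        if (v1 < v2 && v1 < v3) return v1;
        else if (v2 < v3) return v2;
        else return v3;
    }

    public static void main(String[] args) {
        // Signed zero: Math.min treats -0.0f as smaller than +0.0f,
        // while min1's < comparisons treat the two as equal.
        System.out.println(Math.min(Math.min(-0.0f, 0.0f), 1.0f)); // -0.0
        System.out.println(min1(-0.0f, 0.0f, 1.0f));               // 0.0

        // NaN: Math.min propagates NaN, while min1's comparisons are
        // all false for NaN, so it falls through to another argument.
        System.out.println(Math.min(Math.min(Float.NaN, 2.0f), 1.0f)); // NaN
        System.out.println(min1(Float.NaN, 2.0f, 1.0f));               // 1.0
    }
}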
Note: my use case was symmetric, and all of the above held true for max instead of min as well.
EDIT1: It turns out ~5x was an overstatement, but I am still seeing a speed difference inside my application, although I suspect that may be due to not having a proper timing test.
After posting this question I wrote a proper microbenchmark in a separate project. It tested each method 1000 times on random floats, and both took the same amount of time. The exact code isn't worth posting in full since it just confirms what we all already suspected, but a minimal sketch of that kind of standalone test is below.
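This is a reconstruction of the general shape of that test, not the exact code; a real benchmark harness such as JMH would be more rigorous:

import java.util.Random;

public class MinBenchmark {
    private static float min1(float v1, float v2, float v3) {
        if (v1 < v2 && v1 < v3) return v1;
        else if (v2 < v3) return v2;
        else return v3;
    }

    private static float min2(float v1, float v2, float v3) {
        return Math.min(Math.min(v1, v2), v3);
    }

    public static void main(String[] args) {
        int n = 1000;
        Random rng = new Random();
        float[] a = new float[n], b = new float[n], c = new float[n];
        for (int i = 0; i < n; i++) {
            a[i] = rng.nextFloat() * 3000f;
            b[i] = rng.nextFloat() * 3000f;
            c[i] = rng.nextFloat() * 3000f;
        }

        // Warm up both methods so the JIT has compiled them before timing.
        float sink = 0f;
        for (int w = 0; w < 100000; w++) {
            int i = w % n;
            sink += min1(a[i], b[i], c[i]);
            sink += min2(a[i], b[i], c[i]);
        }

        long t0 = System.nanoTime();
        for (int i = 0; i < n; i++) sink += min1(a[i], b[i], c[i]);
        long t1 = System.nanoTime();
        for (int i = 0; i < n; i++) sink += min2(a[i], b[i], c[i]);
        long t2 = System.nanoTime();

        // Print the accumulator so the JIT can't discard the calls as dead code.
        System.out.println("sink=" + sink);
        System.out.println("min1: " + (t1 - t0) + " ns, min2: " + (t2 - t1) + " ns");
    }
}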
There must be something specific to the project I'm working on causing the speed difference.
I'm doing some graphics work in an Android app, and I was finding the min/max of the values from 3 touch events. Again, edge cases like -0.0f and the different infinities are not an issue here; values range between 0.0f and, say, 3000f.
Originally I profiled my code using the Android Device Monitor's Method Profiling tool, which did show a ~5x difference. But, as I have now learned, that isn't a good way to micro-profile code.
I added the code below inside my application to try to get better data:
long min1Times = 0L;
long min2Times = 0L;
...
// loop assigning touch values to v1, v2, v3

long start1 = System.nanoTime();
float min1 = min1(v1, v2, v3);
long end1 = System.nanoTime();
min1Times += end1 - start1;

long start2 = System.nanoTime();
float min2 = min2(v1, v2, v3);
long end2 = System.nanoTime();
min2Times += end2 - start2;

double ratio = (double) min1Times / (double) min2Times;
Log.d("", "ratio: " + ratio);
This prints a running ratio with each new touch event. As I swirl my finger on the screen, the first ratios logged are either 0.0, Infinity, or NaN, which makes me think this test isn't measuring the time very accurately. As more data is collected, the ratio tends to vary between 0.85 and 1.15.
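Presumably those degenerate early ratios show up because a single call completes faster than System.nanoTime()'s resolution, so one or both deltas are 0 and the ratio becomes 0/x, x/0, or 0/0. One way around that (an illustrative sketch, not what's in my app) is to time a batch of calls per measurement instead of a single call:

// Hypothetical helper, using min1/min2 as defined above: time `reps`
// calls so the elapsed time is well above System.nanoTime()'s resolution.
private static long timeBatch(boolean useMin1, float v1, float v2, float v3, int reps) {
    float sink = 0f;                // accumulate results so the JIT can't drop the calls
    long start = System.nanoTime();
    for (int i = 0; i < reps; i++) {
        sink += useMin1 ? min1(v1, v2, v3) : min2(v1, v2, v3);
    }
    long elapsed = System.nanoTime() - start;
    if (sink == 123.456f) {         // practically never true; defeats dead-code elimination
        Log.d("", "sink: " + sink);
    }
    return elapsed;
}

// Inside the touch loop:
min1Times += timeBatch(true, v1, v2, v3, 10000);
min2Times += timeBatch(false, v1, v2, v3, 10000);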