Edit: I've left this question here, though I've subsequently figured out that the value I was sending as minDistance was the problem. The code below works as expected with no weird behaviour.
I'm stumped by (to me) weird behaviour from a JavaScript calculation. What I want to do is get a random point (whose coordinates I've called x2 and y2) which is a minimum distance from a given point (x1 and y1). My code below usually works, but in about 1 in 20 tests it returns a point that is too close.
This isn't simply a floating-point rounding error, since the incorrectly returned points are often under half the minimum distance away.
function random_distance(x1, y1, minDistance) {
    let x2;
    let y2;
    let d;
    do {
        // Pick a random candidate point; Math.random() * (n + 1)
        // gives a value from 0 up to (but not including) n + 1.
        x2 = Math.random() * (canvas.width + 1);
        y2 = Math.random() * (canvas.height + 1);
        // Distance from the candidate to the given point.
        d = Math.hypot(x2 - x1, y2 - y1);
    } while (d < minDistance); // Reject and retry until far enough away.
    return [x2, y2];
}
Checks I've put within the function suggest that the x2, y2 values are valid, so I suspect this is some referencing problem where the returned [x2, y2] array doesn't contain the last calculated values, but my JavaScript knowledge isn't sufficient to figure out what's going on here.
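For anyone wanting to reproduce the check, here's a minimal test sketch. The canvas object and the test point (x1, y1) and minDistance values are assumptions I've made up for illustration, since the originals aren't shown; it calls the function repeatedly and flags any returned point that violates the minimum distance:

    // Stand-in for the real canvas; dimensions are assumed.
    const canvas = { width: 800, height: 600 };

    const x1 = 400, y1 = 300, minDistance = 100;

    for (let i = 0; i < 1000; i++) {
        const [x2, y2] = random_distance(x1, y1, minDistance);
        // Recompute the distance outside the function to verify the result.
        const d = Math.hypot(x2 - x1, y2 - y1);
        if (d < minDistance) {
            console.log(`Too close: (${x2}, ${y2}) is only ${d} away`);
        }
    }

With a correct minDistance value this prints nothing, which is consistent with the edit above: the returned array does contain the last calculated values.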