
I have the following method in Java:

public static Vector2d random(Circle circle) {
    // this returns a random number between 0 and Math.PI * 2
    double angle = MathUtils.random(0, Math.PI * 2);
    // give the point inside the unit circle
    // this returns a normalized vector from a given angle
    Vector2d point = new Vector2d(angle);
    // however, this is only along the edge
    // now add a random magnitude (because this is a normalized vector, we can just multiply it by the desired magnitude)
    double magnitude = Math.random();
    point = point.multiply(magnitude);
    // now expand this to fit the radius
    point = point.multiply(circle.getRadius());
    // now translate by circleCenter
    return point.add(circle.getCenter());
}

This does return a point inside the defined circle; however, when I do this many times and plot the points, most of them clearly cluster toward the center.

Why is this? I don't see how my math can do this.

Comment if you want me to add an image of the points on the plot, if you think that could be helpful.

MCMastery
  • Yeah, this happens for roughly the same reason that lines of longitude are closer together near the poles than near the equator. A really good way to get evenly distributed random points in a circle is to get random points in the circumscribed square, then reject the ones that fall outside the circle. – Dawood ibn Kareem Aug 02 '17 at 03:24
  • Try this: Draw a circle (or imagine you are drawing one). Divide it into 16 equal slices the way you would slice a pizza. Now, for each radius that goes from the center to the outer edge, divide the segment into 10 equal parts and put 10 evenly spaced dots along the radius. I think you'll notice that the points seem more bunched up near the center. That's effectively what your randomization method is doing. – ajb Aug 02 '17 at 03:37
  • @ajb Oh thanks, that helped me visualize it. Now I understand why this happens. – MCMastery Aug 02 '17 at 03:39

2 Answers

2

When r is small, points generated at a given angular spacing lie close together, so picking r uniformly concentrates the samples near the center.

As @DBrowne explains, you can correct the density with the inverse-CDF trick.

Alternatively, you can avoid the trigonometric and square-root evaluations by drawing uniform points in [-R, R] x [-R, R] and rejecting the ones with X² + Y² > R² (about 21% of them, since 1 − π/4 ≈ 0.215). The method generalizes to any shape defined by an implicit equation.
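As a rough sketch of that rejection approach (assuming the question's Circle with getRadius()/getCenter(), the add() method shown in the question, and a hypothetical Vector2d(x, y) constructor, which the question does not show):

public static Vector2d randomByRejection(Circle circle) {
    double r = circle.getRadius();
    double x, y;
    do {
        // pick a uniform point in the circumscribed square [-r, r] x [-r, r]
        x = (Math.random() * 2.0 - 1.0) * r;
        y = (Math.random() * 2.0 - 1.0) * r;
    } while (x * x + y * y > r * r); // reject points outside the circle (~21% of draws)
    // translate to the circle's center
    return new Vector2d(x, y).add(circle.getCenter());
}

On average the loop needs about 4/π ≈ 1.27 iterations, and no trigonometric calls are required.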

1

Your math is flawed. Here's an explanation of why and the correct solution:

The task is to generate uniformly distributed points within a circle of radius R in the (x, y) plane. At first, polar coordinates seem like a great idea, and the naive solution is to pick a radius r uniformly distributed in [0, R] and then an angle theta uniformly distributed in [0, 2pi]. BUT you end up with an excess of points near the origin (0, 0)! This is wrong because, if we look at a given angle interval [theta, theta + dtheta], more points must be generated far out (at large r) than close to zero: the area between r and r + dr grows like 2*pi*r*dr, so the density of r has to grow linearly with r. The radius must therefore not be picked from a uniform distribution, but from one that goes as

pdf_r = (2/R^2)*r

That's easy enough to do by inverting the cumulative distribution: the CDF is F(r) = r^2/R^2, so setting F(r) equal to a uniform random number and solving for r gives

r = R*sqrt( rand() )

where rand() is a uniform random number in [0, 1]

http://www.anderswallin.net/2009/05/uniform-random-points-in-a-circle-using-polar-coordinates/
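
Applied to the question's method, only the magnitude line changes. A minimal sketch, assuming the same Vector2d/Circle API the question uses (Vector2d(angle), multiply(), add(), getRadius(), getCenter(), and MathUtils.random()):

public static Vector2d random(Circle circle) {
    double angle = MathUtils.random(0, Math.PI * 2);   // uniform angle, as in the question
    Vector2d point = new Vector2d(angle);              // unit vector at that angle
    double magnitude = Math.sqrt(Math.random());       // key change: sqrt of the uniform draw
    point = point.multiply(magnitude);
    point = point.multiply(circle.getRadius());
    return point.add(circle.getCenter());
}

Plotting many samples from this version should give an even spread over the disc instead of clustering near the center.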

DBrowne