The problem you are likely to have is making the object pool lightweight enough to be cheaper than just creating the objects. You also want the pool to be large enough to get a fairly high hit rate.
In my experience, you are likely to have problems micro-benchmarking this. When you are creating a single object type repeatedly in a micro-benchmark, you get much better results than when creating a variety of objects in a real/complex application.
The problem with many object pool approaches is that they a) require a key object, which costs as much as or more than creating a simple object, b) involve some synchronization/locking, which again can cost as much as creating an object, and c) require an extra object when adding to the cache (e.g. a Map.Entry), meaning your hit rate has to be much better for the cache to be worthwhile.
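For contrast, this is roughly what such a conventional map-based pool looks like; a minimal sketch, assuming an immutable Point value type with value-based equals and hashCode (shown further below), and a hypothetical PointPool holder class. Every lookup pays for a key allocation and for locking, hit or miss:

    import java.util.Collections;
    import java.util.HashMap;
    import java.util.Map;

    public class PointPool {
        // every call synchronizes on the map, whether it hits or not
        private static final Map<Point, Point> CACHE =
                Collections.synchronizedMap(new HashMap<>());

        public static Point of(int x, int y, int z) {
            Point key = new Point(x, y, z);             // a) key allocation on every call
            Point cached = CACHE.putIfAbsent(key, key); // b) locking, c) a new Map.Entry on each miss
            return cached != null ? cached : key;
        }
    }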
The most lightweight, but dumb, caching strategy I know is to use a plain array indexed by a hash code.
e.g.
private static final int N_POINTS = 10193; // some large prime, to spread the indexes
private static final Point[] POINTS = new Point[N_POINTS];

public static Point of(int x, int y, int z) {
    int h = 31 * (31 * x + y) + z;           // a simple hash of x, y, z
    int index = (h & 0x7fffffff) % N_POINTS; // mask the sign bit, then reduce
    Point p = POINTS[index];
    if (p != null && p.x == x && p.y == y && p.z == z)
        return p;                            // hit: reuse the cached instance
    return POINTS[index] = new Point(x, y, z); // miss: create one and overwrite the slot
}
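For completeness, the sketch above assumes a minimal immutable Point along these lines (the names are purely illustrative; in practice of(...) would live on Point itself, with the constructor kept non-public so callers go through the cache). The final fields matter for the thread-safety note below; equals and hashCode are only needed for the map-based variant:

    public final class Point {
        public final int x, y, z; // final fields: safely published even across a data race

        Point(int x, int y, int z) {
            this.x = x;
            this.y = y;
            this.z = z;
        }

        @Override
        public boolean equals(Object o) {
            if (!(o instanceof Point)) return false;
            Point p = (Point) o;
            return x == p.x && y == p.y && z == p.z;
        }

        @Override
        public int hashCode() {
            return 31 * (31 * x + y) + z; // same mixing as the lookup above
        }
    }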
Note: the array is not thread safe, but since Point is immutable this doesn't matter: its final fields mean the Java Memory Model guarantees any instance another thread sees is fully constructed, so the worst a race can do is create the odd duplicate Point. The cache works on a best-effort basis and is naturally limited in size, with a very simple eviction strategy: a new entry overwrites whatever was in its slot.
For testing purposes, you can add hit/miss counters to determine the cache's effectiveness for your data set.
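For example, a minimal instrumented version of the lookup, assuming AtomicLong counters are acceptable overhead in a test build (strip them from the production path):

    import java.util.concurrent.atomic.AtomicLong;

    private static final AtomicLong HITS = new AtomicLong();
    private static final AtomicLong MISSES = new AtomicLong();

    public static Point of(int x, int y, int z) {
        int h = 31 * (31 * x + y) + z;
        int index = (h & 0x7fffffff) % N_POINTS;
        Point p = POINTS[index];
        if (p != null && p.x == x && p.y == y && p.z == z) {
            HITS.incrementAndGet();   // hit: an equal Point already occupied the slot
            return p;
        }
        MISSES.incrementAndGet();     // miss: empty slot, or a different Point was there
        return POINTS[index] = new Point(x, y, z);
    }

    public static double hitRate() {
        long hits = HITS.get(), total = hits + MISSES.get();
        return total == 0 ? 0.0 : (double) hits / total;
    }

If the hit rate comes out low for your workload, the pool is just adding a hash and a compare to every creation, and you are better off without it.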