To my understanding, it's not really about detecting memory leaks; it's about managing memory altogether. (As for memory leaks: obviously you shouldn't hang on to a Java object in your native code longer than you have to, global ref or not.) As far as I know, using global references for problems that can be solved by other means is something of an antipattern in most languages.
The thing with JNI is that your native code is in slave mode, so to speak. The JVM can't clean up anything you hold on to in your C code, global refs especially (their lifecycle is indefinite), so these limits are a measure to keep developers from abusing JNI and shooting themselves in the foot.
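To illustrate what "managing it yourself" means in practice, here's a minimal sketch of the global-ref lifecycle you're on the hook for -- the class name and the caching functions are made up for the example, not taken from your code:

    #include <jni.h>

    /* Hypothetical cache: one global ref to a frequently used class,
     * so it stays valid across JNI calls and threads. */
    static jclass g_recordClass = NULL;

    void cache_init(JNIEnv *env) {
        jclass local = (*env)->FindClass(env, "com/example/Record"); /* hypothetical class */
        if (local == NULL) return;                /* class not found, exception pending */
        g_recordClass = (jclass) (*env)->NewGlobalRef(env, local);
        (*env)->DeleteLocalRef(env, local);       /* the local ref isn't needed anymore */
    }

    void cache_release(JNIEnv *env) {
        if (g_recordClass != NULL) {
            /* The GC will never do this for us -- forget it and the ref
             * (and the object behind it) lives until the process dies. */
            (*env)->DeleteGlobalRef(env, g_recordClass);
            g_recordClass = NULL;
        }
    }

Every NewGlobalRef has to be paired with a DeleteGlobalRef by hand; that's exactly the kind of bookkeeping the ref-table limits are there to keep honest.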
Also, I'm a bit curious what task requires you to store over 65k global references -- I have a feeling that what you're trying to do allows for a different approach, one that doesn't push JNI over the edge.
E.g. right now I'm wiring a native database library into an Android app; the whole setup has to throw hundreds of thousands of db records back and forth, and even this case has a solution that doesn't overflow the local ref table (512 ref limit), let alone the global ref table -- the sketch below shows the basic idea.
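The trick is simply to drop each local ref as soon as you're done with it, so the table never holds more than a couple of entries no matter how many records pass through. A rough sketch (the row format and function names are invented for illustration, not the actual library):

    #include <jni.h>

    /* Hypothetical: push many rows into a Java ArrayList without ever
     * accumulating local refs. */
    void push_rows(JNIEnv *env, jobject arrayList, const char **rows, int count) {
        jclass listCls = (*env)->GetObjectClass(env, arrayList);
        jmethodID add = (*env)->GetMethodID(env, listCls, "add", "(Ljava/lang/Object;)Z");

        for (int i = 0; i < count; i++) {
            jstring s = (*env)->NewStringUTF(env, rows[i]);
            (*env)->CallBooleanMethod(env, arrayList, add, s);
            (*env)->DeleteLocalRef(env, s);   /* release now, not when the native call returns */
        }
        (*env)->DeleteLocalRef(env, listCls);
    }

PushLocalFrame/PopLocalFrame around the loop body would work just as well; either way no global refs are involved at all.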
I know this reply is probably late, but if you care to share your actual task, we might be able to come up with a proper way of dealing with it.