Okay, let's try this again. Originally I asked if anyone had found an answer, as I was also looking for a resolution on this. I've finally been able to figure out the answer to this question:
The answer lies in the image acquisition format. Android will default to the NV21 format, meaning an interleaved V/U plane (each V byte followed by a U byte) placed after the Y plane. NV12 is an alternative that puts the U byte before the V byte (see: How to render Android's YUV-NV21 camera image on the background in libgdx with OpenGLES 2.0 in real-time?)
The Image object returned by the ImageReader exposes ByteBuffers that directly access this memory; calling isDirect() on a ByteBuffer verifies this. You can also see it by modifying index 1 of vBuffer (getPlanes()[2]). Since index 1 of vBuffer points to the same memory location as index 0 of uBuffer, uBuffer.get(0) will return the same (modified) value.
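To make the aliasing concrete without needing a device, here is a minimal plain-Java sketch (no Android dependency) that simulates the layout: one direct buffer stands in for the camera's NV21 memory, and the V and U "plane" views are slices starting one byte apart, the way getPlanes() exposes them. All names here (frame, ySize, etc.) are illustrative, not Android API; note that real devices may report the chroma buffers' capacities one byte shorter than the slices below.

```java
import java.nio.ByteBuffer;

public class Nv21AliasDemo {
    public static void main(String[] args) {
        int width = 4, height = 2;           // tiny frame for illustration
        int ySize = width * height;          // 8 luma bytes
        int chromaSize = ySize / 2;          // 4 interleaved V/U bytes

        // One direct buffer standing in for the camera's NV21 memory:
        // Y0..Y7 | V0 U0 V1 U1
        ByteBuffer frame = ByteBuffer.allocateDirect(ySize + chromaSize);
        for (int i = 0; i < frame.capacity(); i++) frame.put(i, (byte) i);

        // Slice plane views the way Camera2 exposes them for NV21-backed
        // images: vBuffer starts at the first V byte, uBuffer one byte later.
        frame.position(ySize);
        ByteBuffer vBuffer = frame.slice();
        frame.position(ySize + 1);
        ByteBuffer uBuffer = frame.slice();

        // vBuffer[1] and uBuffer[0] are the same physical byte (U0).
        System.out.println(vBuffer.get(1) == uBuffer.get(0)); // true

        // A write through one view is visible through the other, which is
        // how you can verify the planes alias a single allocation.
        vBuffer.put(1, (byte) 99);
        System.out.println(uBuffer.get(0)); // prints 99
    }
}
```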
Thus the UV plane (a VU plane for NV21) has a length equal to half the Y plane's length (6096384 in the example above). If you start with the first V byte (VU-plane[0]) and count forward, the last V byte sits at VU-plane[length-2], because the final byte in this plane is a U. Likewise, the first U byte is at index 1 of the VU plane, and the last U byte is at VU-plane[length-1].
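This layout is what makes the common fast path for rebuilding a contiguous NV21 array work: copy the Y plane, copy the entire V buffer (which already contains the interleaved V/U bytes up to the last V), then append the final U byte from the U buffer. Below is a hedged plain-Java sketch of that idea; toNv21 is a hypothetical helper, and the buffers are hand-built slices that mimic the plane views described above (on a real device you would also need to account for row stride and pixel stride).

```java
import java.nio.ByteBuffer;
import java.util.Arrays;

public class Nv21Reassemble {
    // Hypothetical helper: rebuild a contiguous NV21 array from plane views
    // shaped like the ones described above (vBuffer begins one byte before
    // uBuffer inside the same interleaved VU region).
    static byte[] toNv21(ByteBuffer yBuffer, ByteBuffer uBuffer, ByteBuffer vBuffer) {
        int ySize = yBuffer.remaining();
        int vSize = vBuffer.remaining();          // ySize/2 - 1 in this layout
        byte[] nv21 = new byte[ySize + vSize + 1];
        yBuffer.get(nv21, 0, ySize);              // Y plane
        vBuffer.get(nv21, ySize, vSize);          // V0 U0 V1 U1 ... V_last
        nv21[ySize + vSize] = uBuffer.get(uBuffer.remaining() - 1); // final U
        return nv21;
    }

    public static void main(String[] args) {
        // Simulated 4x2 NV21 frame: 8 Y bytes, then V0 U0 V1 U1
        byte[] raw = {0, 1, 2, 3, 4, 5, 6, 7, 100, 101, 102, 103};
        ByteBuffer frame = ByteBuffer.wrap(raw);

        frame.position(0).limit(8);
        ByteBuffer y = frame.slice();
        frame.position(8).limit(11);   // V view stops one byte short of the end
        ByteBuffer v = frame.slice();
        frame.position(9).limit(12);   // U view starts one byte later
        ByteBuffer u = frame.slice();

        byte[] nv21 = toNv21(y, u, v);
        System.out.println(Arrays.equals(nv21, raw)); // prints true
    }
}
```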
Here is an image that hopefully explains the above wording visually:

This is all predicated on the image being captured in NV21 format, and not NV12 or YV12. Here is another good visual for reference: https://www.twblogs.net/a/5d7ede57bd9eee541c3480f3