
I am following the guide from https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/streaming_depth_data_from_the_truedepth_camera. It uses AVCaptureDepthDataOutput() to stream depth data from the front camera.
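
The relevant part of that setup looks roughly like this (simplified from the sample; `session` is the already-configured AVCaptureSession, and `self` adopts AVCaptureDepthDataOutputDelegate):

```swift
import AVFoundation

// Add a depth data output to the session and receive frames on a
// dedicated queue, as the TrueDepth sample does.
let depthDataOutput = AVCaptureDepthDataOutput()
if session.canAddOutput(depthDataOutput) {
    session.addOutput(depthDataOutput)
    depthDataOutput.setDelegate(self, callbackQueue: DispatchQueue(label: "depth queue"))
}
```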

But two of the edges have wrong data. In portrait mode it is the upper and right edges; in landscape rotated counter-clockwise, it is the left and upper edges. Here's an example of how it looks. This is from the tutorial, but I modified the code a little. Note that I printed out the actual distance as a number to make sure it wasn't a bug in the video/color rendering. https://drive.google.com/file/d/1AGFHAZypmHz9136T02ufedz-X3Ct94Kq/view?usp=sharing
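
For reference, reading a single distance out of the buffer looks roughly like this (assuming the depth data is converted to 32-bit floats first, since the raw TrueDepth format may be 16-bit; `depthData` is the AVDepthData delivered to the delegate callback):

```swift
import AVFoundation

// Convert to a known 32-bit float format, then sample one pixel.
let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
let map = converted.depthDataMap
CVPixelBufferLockBaseAddress(map, .readOnly)
defer { CVPixelBufferUnlockBaseAddress(map, .readOnly) }

let width = CVPixelBufferGetWidth(map)
let height = CVPixelBufferGetHeight(map)
let rowBytes = CVPixelBufferGetBytesPerRow(map)
let base = CVPixelBufferGetBaseAddress(map)!

// Each row is `rowBytes` bytes; each value is a 4-byte Float32.
let row = base.advanced(by: (height / 2) * rowBytes)
let meters = row.assumingMemoryBound(to: Float32.self)[width / 2]
print("distance at center: \(meters) m")
```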

I'm okay with this if it's just a necessary artifact of translating 3D data to 2D. But what I'm looking for is the "correct" way to crop the image so that the 2D depth buffer doesn't have these artifacts at the edges. How do I know how far in to crop? Do I just experiment? If so, how would I know the same inset works across all devices?

Another issue is that if an object gets too close to the camera, the distance is reported as the maximum distance instead of the minimum. Is this expected?

pete

1 Answer


Turn off "Smooth depth" (isFilteringEnabled on AVCaptureDepthDataOutput) and the bad band at the edges will be much more clearly defined and consistent (e.g. less than 50 pixels wide). It is safe to assume that any value near the edge is not accurate, so inset by some amount on that order; you should experiment to see what works best for you.
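
As a sketch of both steps (assuming `depthDataOutput` is your AVCaptureDepthDataOutput and `depthMap` is a CIImage built from the depth pixel buffer; the 50-pixel inset is only a starting point to tune):

```swift
import AVFoundation
import CoreImage

// Disable smoothed ("filtered") depth on the output.
depthDataOutput.isFilteringEnabled = false

// Inset the depth image before displaying or sampling it, so the
// unreliable band at the edges is discarded.
let inset: CGFloat = 50
let cropped = depthMap.cropped(to: depthMap.extent.insetBy(dx: inset, dy: inset))
```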

As for nearby values being reported as the maximum: you should check for .isNaN and .isInfinite and filter those values out. The sensor has a minimum range; objects closer than that distance cannot be detected.
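
For example, where `value` is a Float read out of the depth buffer:

```swift
if value.isNaN || value.isInfinite {
    // No reliable reading at this pixel, e.g. the object was closer
    // than the sensor's minimum range; skip it or substitute a sentinel.
} else {
    // `value` is a usable distance in meters.
}
```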

Senseful