
On the iPhone 7 Plus, the captured depth data is not the same size as the captured image. I am interested in mapping the depth data to the color data. Any help would be greatly appreciated!

  • I think I could help you out, but your question is not clear enough. What are you trying to do with the AVDepthData? What is your goal? – Eyzuky Apr 01 '18 at 12:23
  • Thank you @Eyzuky. I am trying to generate a point cloud based on the depth map generated by the iPhone. The point cloud will (hopefully) be rendered in VR, e.g. Unity. Thus far, I have been able to pull the depth map from the camera using AVDepthData. However, the depth map is not the same resolution as the captured image. I was wondering if there is any way to scale the captured image such that its pixels map to the captured depth data. Basically I would like to add color to the point cloud. I've tried a number of things but have not been successful. Thank you in advance for your help. – Michael Archer Apr 01 '18 at 14:40
  • Have you tried converting it to a CIImage? – Eyzuky Apr 01 '18 at 14:42

1 Answer


Convert it to a CIImage:

import AVFoundation
import CoreImage

func depthToCIImage(depthData: AVDepthData) -> CIImage {
    // depthDataMap is a non-optional CVPixelBuffer, so no guard is needed
    return CIImage(cvPixelBuffer: depthData.depthDataMap)
}

You can use this guide to resize the image: http://nshipster.com/image-resizing/
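For the depth map specifically, one option is to scale the CIImage itself rather than round-tripping through UIImage. A minimal sketch (the function name and the use of a plain affine scale are my own choices, not from the answer; it assumes the depth and color images cover the same field of view):

```swift
import AVFoundation
import CoreImage

// Upscale the depth CIImage to the color image's pixel dimensions so that
// each depth pixel lines up with a color pixel at the same coordinate.
func resizedDepthImage(depthData: AVDepthData, to targetSize: CGSize) -> CIImage {
    let depthImage = CIImage(cvPixelBuffer: depthData.depthDataMap)
    let scaleX = targetSize.width / depthImage.extent.width
    let scaleY = targetSize.height / depthImage.extent.height
    return depthImage.transformed(by: CGAffineTransform(scaleX: scaleX, y: scaleY))
}
```

A Lanczos scale filter (`CILanczosScaleTransform`) would give smoother interpolation, but for depth values a simple scale (or nearest-neighbor) may be preferable to avoid blending depths across object edges.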

I hope this helps you.

Eyzuky
  • Thank you. Would you recommend resizing the depth map or resizing the image? Is there a reason why I would want to only resize the depth map? – Michael Archer Apr 01 '18 at 15:01
  • What if I tried to downsample the color image to match the resolution of the depth image? I'm trying to keep the size of the files low :/ – Michael Archer Apr 01 '18 at 15:18
  • You can save the original depth data and only change its size when needed. This way you don’t have a bigger file saved to disk. – Eyzuky Apr 01 '18 at 15:19
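The resize-on-demand idea from the comments can also be done without resampling either image at all: when building the point cloud, map each depth-pixel coordinate to the corresponding full-resolution color coordinate. A sketch (the function name is mine; it assumes the depth map and color image share the same field of view and aspect ratio):

```swift
// Map a pixel coordinate in the (smaller) depth map to the corresponding
// pixel coordinate in the full-resolution color image, using a simple
// nearest-neighbor scale.
func depthToColorCoordinate(x: Int, y: Int,
                            depthWidth: Int, depthHeight: Int,
                            colorWidth: Int, colorHeight: Int) -> (Int, Int) {
    let cx = x * colorWidth / depthWidth
    let cy = y * colorHeight / depthHeight
    return (cx, cy)
}
```

For example, with a 320×240 depth map and a 640×480 color image, depth pixel (10, 20) samples the color at (20, 40). This keeps the stored files at their original sizes, which matches the goal of keeping file sizes low.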